Oh well. I guess I'll be attempting to beat the world record for the longest-working GPU then. Anybody know what the current score is?
Hmmm... my C64 is still working, so that's more than 40 years...
It is almost like you didn't read the last sentence in my post. You know, the one that indicated that the margin has to cover all of the costs of the business. Do you think that the equipment on the assembly lines, the factory buildings they are housed in, the workers to man those lines, and the electricity for those factories come at $0 cost?
Do you think shipping a container of these halfway around the world is free as well?
Even when an engineer on their R&D team takes a week's vacation, how are those costs covered? By the profits made on the GPUs said engineer helped design!
The actual costs figured into that product are a lot more far-reaching than just the cost of the components that go in the box!
And oh yeah, that box and the box art on it weren't free either!
Look at the entire cost and you can see these companies have a lot of expenses to cover.
"Why would you attempt to compare core and shader counts even when the architecture is not even similar?"

LOL. Everything Nvidia has said about core and shader counts basically indicates that Ada is refined Ampere, on a new process node, with some enhancements that will specifically require developer intervention.
"From what I'm gathering you're going to need a much larger PSU than 1,000 watts for this card! JayzTwoCents runs it at approx. 1,600 watts."

JayzTwoCents says a lot of factually incorrect things and gives some bad advice. He also gets free hardware like most people testing this stuff, which means he can run whatever is "best" and not worry about cost.
"If it is a true 8nm to 4nm move and the die size is roughly the same, then you could have close to 16x the transistors. Yet only a 2x improvement? Something is not passing the sniff test. Not that I'm doubting the die size, but that is a horrible decrease in efficiency. Something is wrong."

The actual numbers are largely meaningless these days. But we don't scale in three dimensions, just two. So 50% smaller is potentially 4X the density, but you need more spacing (proportionally) with smaller nodes. In actuality, GA102 was 45.04 MTrans/mm^2, and AD102 is 125.41 MTrans/mm^2. So density is 2.78X higher, and that's quite impressive.
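For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch using only the figures quoted above; it treats the node names as literal feature sizes for the "ideal" case, which they are not in practice:

```python
# Back-of-the-envelope check of the density claims above.
# Ideal 2D scaling from an "8nm" to a "4nm" node would be (8/4)^2 = 4x,
# not 16x -- chips shrink in two dimensions, not three.
ideal_scaling = (8 / 4) ** 2

# Published transistor densities quoted above (million transistors per mm^2).
ga102_density = 45.04   # GA102 (Ampere, Samsung 8N)
ad102_density = 125.41  # AD102 (Ada, TSMC 4N)

actual_scaling = ad102_density / ga102_density

print(f"Ideal 2D scaling:    {ideal_scaling:.2f}x")   # 4.00x
print(f"Actual density gain: {actual_scaling:.2f}x")  # ~2.78x
```

So even the ideal case is 4x, not 16x, and the real-world 2.78x gain falls short of that mostly because spacing doesn't shrink as fast as the logic.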
"If they keep increasing prices for video cards, the number of buyers will keep decreasing, because not everyone can afford $1,000 to $1,500 for a video card."

'Increasing prices'? They'll just buy used or lower-tier cards. And if it gets too extreme, I could see consoles becoming the answer to the high price of gaming. (I'm PC master race, but there IS a limit.)
I have to say... that's a long, long time!
AMD has had a better price/performance ratio many times and people still buy Nvidia, so Nvidia doesn't have to worry about people switching to AMD... They still buy Nvidia no matter what. Same thing as Apple vs Android phones. The price means nothing to some people. And it seems that that group is the big majority! About 80% of the customers.
"Nvidia is also talking about better transients with RTX 4090 vs. RTX 3090 Ti, which means if you have a PSU that can handle a 3090 Ti, it should definitely handle a 4090. It will put less of a strain on the PSU because it won't spike or dip as much."

Highlight mine, and an important nitpick that is worth mentioning over and over: they mentioned the FE card has better power delivery, not the overall 4090 design(s) they've shared with the partners. As always, I'd love to be wrong on such matters, but I doubt it.
"Not everyone has your privilege, sir."

You could say that to every Lamborghini owner. Just buy the card you can afford, or buy second-hand. I'll be selling my 3090 for half the price or lower when the time comes.
"If they keep increasing prices for video cards, the number of buyers will keep decreasing, because not everyone can afford $1,000 to $1,500 for a video card. For some people that's a whole month's salary, and in poorer countries they don't even earn that much. It's pretty sad that the company doesn't care about budget gamers anymore. Personally I'm stuck with a GTX 1070, and since then I haven't been able to afford an upgrade, sadly."

Who or what is forcing people to upgrade?
So, whether it's brand loyalty, or people pushing incorrect info, there's still that "you HAVE to get Nvidia" sort of mentality lingering.
Nvidia, I have little doubt, knows this, and, I also have little doubt, shamelessly takes advantage of it.
I miss the days when their behavior was that of the little guy.
Not brand loyalty for me. It's the terrible review at gpubench which made me choose Nvidia. The Fighter 6600 XT and the Asus RTX 3050 were both priced at around 18,500 pesos (approx. $320) brand new with warranty at the local store I went to a few months ago. I purchased the RTX 3050 without a second thought, even though I knew the 6600 XT is the stronger GPU.
Holy-!
You used userbenchmark to justify that purchase...
That site is biased against AMD's cpus and gpus...
Every now and then someone uses links from cpu/gpubench not realizing that it's just UBM.

Yeah, their black propaganda worked. I'm now brainwashed.
I'm using the 3050 right now. It runs all my PlayStation 4 and PlayStation 5 games beautifully with no lag, so it feels like the money went to good use. The 3050 even runs Dead or Alive VI at max graphics easily, something the 1050 Ti in my backup PC cannot do.
If I had purchased the 6600 XT, who knows what evil things would have happened to my rig. Just joking with that, lol.
"And how AMD's RDNA3 chiplets could seriously undercut Nvidia"

AMD likely could undercut Nvidia, but I'm not sure they will, at least not by much. Maybe the AMD of a few years ago would have, but lately they seem to be reducing costs without actually passing those savings on to consumers. If they have a card roughly as fast as the 4090, it wouldn't be surprising to see them price it similarly. And if they have anything notably faster, it might even be priced higher. I would like to see them undercut Nvidia by a decent amount, but I have some doubts that's going to happen, considering the not-so-competitive pricing of their new Ryzen lineup.
"A decade ago, the GTX Titan released at a $1,000 MSRP. Today, the RTX 4080 is $1,200, bringing multiple times the performance for a 20% higher price."

That's not a great example though, seeing as the Titan was generally considered a terrible value at the time and was barely any faster than the GTX 780 that launched around the same time. Most sites didn't even bother reviewing Titan cards, and Nvidia generally didn't send out review samples. Of course, the 3090 was much the same relative to the 3080, or at least would have been if it weren't for crypto shortages pushing the 3080's pricing up to a similar level.
"People need to remember the Radeon HD 4870. I wish AMD went back to that time in how they did things, instead of trying to blow up their cards for a few extra frames. Same with Nvidia, mind you."

That was more ATI though. AMD had already bought the company prior to the launch of that series, but had owned it for less than two years at that point, and the companies were likely still operating mostly separately from one another. The cards were still branded as "ATI" cards, and were no doubt designed by the same crew that had been designing ATI's prior cards up to that point. By now, the Radeon division has undoubtedly been integrated more into AMD, so it's arguably a different company.
"Not brand loyalty for me. It's the terrible review at gpubench which made me choose Nvidia. [...]"

Userbench is so bad. Their writeups for hardware can hardly be considered "reviews," and are more like the factually incorrect rantings of a 12-year-old fanboy in some tech forum. :P Their benchmarks are synthetic: often okay for getting a rough idea of the relative performance of hardware released years apart, and generally decent at comparing cards using identical architectures, but often inaccurate when comparing different architectures. And they clearly have an agenda against AMD, perhaps for stock manipulation, if not getting paid off directly by competitors. When comparing cards currently on the market, it's a much better idea to look at actual reviews that test performance in real games and make at least some attempt to be objective with their recommendations.
"Who or what is forcing people to upgrade?"

No one is forcing people to upgrade, but when your card gets so old that it struggles in most games, it means it's time to upgrade. And when a card costs two or three times more than it used to, you're basically stuck with your current card, because Jensen Huang, or whatever his name is, decided to get richer and doesn't give a rat's ass about middle- or lower-income people. Do you understand it now?
Yes, the only reason for this is pure greed, and I don't understand why people keep giving them their money.