Microcenter has them pre-listed with a few $800 models (that will never be restocked after the initial drop) and models ranging all the way up to $1050. Completely unacceptable pricing for a -70 part.
And just to drive that point home: the 2070 Super is only 3.5 years old. LTT has a nice image showing how insanely these new GPUs are priced compared to previous gens.
Nvidia has basically doubled the price of GPUs since the 1070.
Probably right. My local Microcenter shows 2 models at $799, a PNY and a Zotac. The rest of them go over $800 and up over $1,000. For that price they can keep them. They've got 7900 XTs sitting for $900. Why would I pay over $1,000 for $800 cards??? I'll keep my 3080.
Agree, same at the Mayfield, OH Microcenter. They have 25+ 7900 XTs sitting there doing nothing. Most are the reference cooler design, so people in the know are probably avoiding them like the plague with the current issues AMD is having.
They're only "wrong" when everyone refuses to pay them. Unfortunately, we're being shown time and time again that there are apparently enough people willing to spend $800 for this level of performance that we'll continue to see them. As stated in the conclusion, if Nvidia had tried to sell this as a $600 card, or a $500 card, and then scalpers just snapped them all up and asked for $800 or more, we'd be right back where we started. Except then we'd have scalpers contributing nothing and taking a chunk of the profits.
So yeah, don't buy an $800 card if you don't want to spend that much. Wait for prices to come down, or go with a cheaper and slower alternative. But if others keep paying a lot more than you're willing to pay, nothing is going to change.
Your 3080 should be kicking ass and taking names for a long time. I just finally retired my GTX 1070 this year, handing it down to my kids, and it still plays games pretty well on reasonable settings: over 100 fps in COD: MW2 Warzone at 1080p with a capable CPU.
"It would be interesting to see power consumption with and without DLSS 3."

I'm not going to try to make things pretty, but I do have the data. (1) = 1080p medium, (2) = 1080p ultra, (3) = 1440p ultra, and (4) = 4K ultra (in the file names on the left). Here are the DLSS numbers:
"Ultra doesn't do much for visual quality and slows down the FPS."

Ultra removes the CPU bottleneck more, and while today's games may not always look all that different at ultra vs. high, in the long run it's just a number and things will inevitably trend downward. You can see the 1080p medium results, where everything becomes CPU limited. 1440p at high would generally perform close to 1080p at ultra. Basically, I have to choose a setting and stick with it, and since I'm showing what a GPU is potentially capable of handling, maxing out (mostly) the settings does that better in my book.
If people knock this default setting down, they're going to have much better experiences. It's somewhat amusing when I see 3060 Ti users enjoying the heck out of a game, while a guy with a 4090 posts about his fury that he can't enjoy the same game because it won't run 144 Hz at 4K with ultra RTX on.
This 4070 Ti looks like it's right at the borderline where you can start to push over 60 fps at 4K (with "high", not "ultra", and definitely not with RTX), and the requisite supporting components (a high-end CPU and a 4K 144+ Hz monitor) cost a whole lot more than what you need to accompany a cheap 3060 Ti paired with a 1440p 60-144 Hz monitor.
"MSRP for the RTX 3090 Ti is $2000 USD."

Nvidia "unofficially" dropped the price of the 3090 Ti FE to $1,099 at Best Buy. Other companies followed — this was one of the things that pissed off EVGA's CEO. Then those cards mostly sold out, they're no longer being manufactured (AFAIK), and prices have crept back up. But the number of units for 3090/3090 Ti has dropped substantially. Plus, suggesting anyone with half a brain would pay $2000 for a 3090 Ti that is provably slower and worse in virtually every scenario than the $1,200 4080 is silly. So, like I said in the text, I punted a bit on the prices for FPS/$.
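For anyone putting together their own comparison, the FPS/$ metric being discussed is just average frame rate divided by what you'd actually pay. A minimal sketch of that math; the card names, prices, and frame rates below are purely illustrative placeholders, not the review's measured data or real listings:

```python
# Hypothetical prices and average FPS -- illustrative only,
# not measured results or actual street prices.
cards = {
    "Card A": {"fps": 120.0, "price": 800},
    "Card B": {"fps": 100.0, "price": 900},
}

def fps_per_dollar(fps, price):
    """Value metric: average frames per second per dollar spent."""
    return fps / price

for name, d in cards.items():
    print(f"{name}: {fps_per_dollar(d['fps'], d['price']):.4f} FPS/$")
```

Which is also why the price you plug in matters so much: using the $2,000 MSRP instead of a $1,099 street price roughly halves a 3090 Ti's score on this metric.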
"What is causing the average / mean frames to change? I notice that the 4090 RT average was 103.4 and is now 104.9 at 1440p ultra RT, and the non-RT average was 153.8 before and is now 156.1. Do you rerun the tests for new versions of the drivers? I have them in my personal price-to-performance spreadsheet and they seem to be changing."
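For what it's worth, both of those deltas work out to roughly a 1.5% swing. A quick sketch of the percent-change math using the figures quoted above (the `pct_change` helper is just for illustration):

```python
def pct_change(old, new):
    """Percent change from an old benchmark result to a retested one."""
    return (new - old) / old * 100

# 4090 at 1440p ultra, figures quoted in the thread
print(f"RT:     {pct_change(103.4, 104.9):+.1f}%")
print(f"Non-RT: {pct_change(153.8, 156.1):+.1f}%")
```

Changes of that size are within the range you'd expect from a driver update, a game patch, or a different test platform.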
"I did the math and for me, with a 6700XT, if I were to prioritise RT at 1440p, and assuming a starting price of £800 for the 4070 Ti, ..."

Do you not know what "almost" means? And while some would take "palatable" to mean really tasty, that's not the way I normally use it. I use it more as "acceptable but not awesome." I wouldn't call an excellent dinner "palatable"; I'd say it was delicious or some other word that means I really like it. Taco Bell is palatable, for example. So is Wendy's. But neither is great, just like an $800 replacement that's only moderately faster than the outgoing $800 cards.
You must be the most hardworking person in the building.
I've retested some of the cards on a different PC, with different drivers, and a slightly different gaming suite. I suppose it depends on what you're looking at. The original 4090 and 4080 launch reviews used a 12900K. Then the 13900K and 7950X arrived (technically before the 4080) and I got new test hardware, so I switched to the 13900K on the AMD 7900 reviews. Now I'm still using the 13900K with the 4070 Ti, and I had to also retest every other GPU in these charts on that system, generally using the latest drivers.
I also dropped Fortnite from the benchmarks, because the latest update totally broke my test sequence and RT / Lumen don't work in Creative mode, so I literally cannot test the latest GPUs on the previous Fortnite benchmark. I added Spider-Man: Miles Morales as a replacement, because it's easy enough to benchmark and also supports DLSS 2/3, FSR 2, and XeSS. I added A Plague Tale: Requiem to the list of benchmarks as well, again because it's easy enough to test, has DLSS 2/3 support, and represents a modern game engine.

Besides the above, with the 7900 review, I noticed Forza Horizon 5 performance had dropped quite a bit on Nvidia cards. I pinged my Nvidia contacts and they said they were aware of the issue and that it was being fixed in a future driver. The launch (preview) driver for the 4070 Ti includes the fixes, so I retested all of the Nvidia cards in Forza.
TL;DR: I routinely retest some games on newer drivers or with new patches if I notice a discrepancy, and switching to a new test bed (plus a secondary AMD test bed) also adds to the "fun."