News Nvidia RTX 40-series Super models revealed — 4070 Super coming Jan 17 at $599

Can someone explain to me what happened like I'm 5?
Inflation, pandemics, AI, AAA games, 4K gaming on the same-sized screens, and rAyTrAcInG don't add up to Nvidia's price points.
The NV 9 series brought pricing back to a rational level. The 10 series maintained that pricing while delivering big performance increases.

NV decided to raise prices with the 20 series because of their market position and introduced the first >$1,000 flagship. While this didn't go over well, AMD didn't really have full-stack competition, and NV eventually released the Supers.

The 30 series was mostly pretty good for the pricing (the 3050 was bad, but it is a bad card; the 3060 should have been ~$30 less; and the 3070 only had 8GB of VRAM while not being hugely faster than the 3060 Ti), but then the crypto boom happened. To give you a clue as to how bad this was: the 3080 launched in Sept 2020, and I didn't buy mine until July 2022, because that was the first time pricing got close to MSRP.

By the time we rolled into the 40 series, people had endured at least 18 months of stupid pricing, and things had only just started normalizing. Aside from the volume of cards that moved, especially at the high end, AMD/NV didn't really reap the benefits of that pricing. By this time NV also had ~80% of the market and pulled another 20-series-style price rise. AMD was perfectly happy to price its cards in between NV's relative to performance, and Intel has yet to release another generation.

So here we are today, where we have the folks who accepted it and bought 4090s, those who are tired of it and bought nothing, and lastly the real victims: those who just wanted to buy a video card at a decent value to play games.
 
Rolling in here with my 1070 and memories of what cards used to cost...
Is it me or is everyone just numb to the insane pricing?
Can someone explain to me what happened like I'm 5?
Inflation, pandemics, AI, AAA games, 4K gaming on the same-sized screens, and rAyTrAcInG don't add up to Nvidia's price points.
I'm with you on this. It's the Covid syndrome. After that hit and supply chains broke down worldwide, companies jacked up prices to double and triple on everything from A to Z, and it will be a cold day in hell before pricing goes back to reasonable, since greed has permeated everywhere.

I'm still using my six-year-old Asus 1080 Ti 11GB card, which was a $400 top-of-the-line card when it came out. I can still play everything on High or Ultra on my 27-inch QHD (1440p) monitor, which is noticeably sharper than FHD, and since that's still the gamer's sweet spot, I'm good waiting until prices drop. The eye candy in new games coming out can certainly impact my decision, though.

While I have no need for a $700, $800, or $900 video card, I might be inclined to get the 4070 Super for $600, which is light years ahead of the 1080 Ti and is also needed to drive the 49-inch curved monitor I use for working at home. If that all works well for gaming, it will be icing on the cake to justify the cost, since it will last at least another 6-8 years. At least that's what I'll keep telling myself 😉
 
I am sticking with my straight-from-AMD RX 6800 for a while longer. I got it for MSRP almost 3 years ago. I refuse to pay these greed-inflated prices that both manufacturers are trying to peddle to us. AMD might be cheaper, but they are still overpriced.

Gotta give you props for that. Speaking with your wallet. Seriously. 👍
 
We know, people want price pressure on Ngreedia, so they can buy Ngreedia.

None of you are interested in buying AMD, no matter what they offer.

I honestly wish that AMD would just drop out of PC gaming GPUs, just to see how much all of you are willing to pay for Ngreedia's GPUs.

Then again, we will all be forced to do that anyway, since all of their proprietary tech keeps taking away our options, and instead of fighting them, we are demanding it (Starfield DLSS, anyone?)

Really? I'm happy with my 6800 XT, personally. I've used both AMD and Nvidia cards and am getting to where I somewhat prefer AMD. In the price ranges where they compete, their cards seem to do well, and they seem to be catching up on features with FSR 3, etc. Plus, I don't care for all of Nvidia's business practices. So I'd like to keep my AMD card.
 
I am sticking with my straight-from-AMD RX 6800 for a while longer. I got it for MSRP almost 3 years ago. I refuse to pay these greed-inflated prices that both manufacturers are trying to peddle to us. AMD might be cheaper, but they are still overpriced.
I got a 7900 XTX for around US$850, and it came with 2 free games worth around $170 combined, so you could say I paid $680 for it.

Which is definitely not bad, but I do agree, all these GPUs are simply overpriced. Ngreedia's more so, of course.
 
It looks like an optics move to me. If the Super parts are indeed nearly as powerful as the next tier up the stack, then Nvidia is effectively cutting prices without actually reducing the prices of its old parts. They could simply have lowered the prices of the existing parts, but this way they get to make a big deal and market "new" things.

*EDIT* If I recall correctly, they did this with the overpriced original 20 series as well.
The 4070 Ti 12GB and the non-Super 4080 are being phased out, as in they are no more once the remaining stock sells out.
 
Jarred - an error in your chart, I think? It shows 64MB L2 cache for the 4070 Ti Super, but Nvidia's slide says 48MB.

I'm really looking forward to that card. I had my eye on the regular 4070 or Ti, but have been hesitating, because I want some room to upgrade to a 4K screen this/next year, and am worried those cards could age really quickly at that resolution. Glad I decided to hold off for a couple of months to see what Super brings, because the 4070 Ti Super looks like it would be perfect for me. : )
Fixed now. Also, we were told the 4070 Super will have 48MB of L2; the figure in the earlier slide deck was a typo.

Nvidia so far has done either the full 8MB of L2 per 32-bit memory channel, or else it has cut that down to 6MB per channel. (Or you can call it 16MB/12MB per 64-bits if you prefer.) I have a spreadsheet formula to calculate the L2 and forgot to update it for the initial table used here. :)
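
For anyone who wants to sanity-check the table the same way, here's a minimal sketch of that per-channel formula. The helper name and the per-channel values picked for each example are just illustrations of the rule described above, not official Nvidia specs:

```python
# Ada L2 sizing rule from the post: either the full 8MB of L2 per 32-bit
# memory channel, or a cut-down 6MB per channel on some parts.

def l2_cache_mb(bus_width_bits: int, mb_per_channel: int) -> int:
    """Total L2 in MB, from the memory bus width and MB of L2 per 32-bit channel."""
    channels = bus_width_bits // 32  # one channel per 32 bits of bus width
    return channels * mb_per_channel

# Figures consistent with the thread:
print(l2_cache_mb(256, 6))  # 4070 Ti Super: 8 channels x 6MB = 48MB
print(l2_cache_mb(192, 8))  # 4070 Super:    6 channels x 8MB = 48MB
print(l2_cache_mb(256, 8))  # RTX 4080:      8 channels x 8MB = 64MB
```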
 