News Nvidia's GeForce RTX 5070 at $549 — How does it stack up to the previous generation RTX 4070?

The RTX 2070 launched in October 2018 at $599 MSRP.

The RTX 3070 launched at $499.

The RTX 4070 launched at $599.

Now the RTX 5070 is launching at $549: $50 less than the RTX 2070's launch price in nominal dollars, and considerably less once you factor in six years of inflation (roughly 24% according to the CPI inflation calculator; $599 in October 2018 is equivalent to about $747 today).
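The inflation arithmetic above can be sketched in a few lines. Note the CPI multiplier here is just the poster's figure ($599 in October 2018 ≈ $747 today), not an authoritative number:

```python
# Compare 70-class launch prices, adjusting the 2018 price for inflation.
# The multiplier below is derived from the thread's own claim
# ($599 in Oct 2018 ~= $747 today); treat it as an assumption.
CPI_MULTIPLIER = 747 / 599  # ~1.247, i.e. roughly 24-25% cumulative inflation

launch_prices = {"RTX 2070": 599, "RTX 3070": 499, "RTX 4070": 599, "RTX 5070": 549}

# The RTX 2070's launch price expressed in today's dollars.
rtx2070_today = launch_prices["RTX 2070"] * CPI_MULTIPLIER
print(f"RTX 2070's $599 in today's dollars: ${rtx2070_today:.0f}")

# How much cheaper the RTX 5070 is than the inflation-adjusted RTX 2070.
gap = rtx2070_today - launch_prices["RTX 5070"]
print(f"RTX 5070 is ${gap:.0f} below the inflation-adjusted RTX 2070 price")
```

On the poster's numbers, the 5070 comes in about $198 under what the 2070 would cost in today's dollars.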

Please explain to me exactly how the 5070 is so unreasonably priced.

The fastest 70-series GPU yet, at a price that’s less than the same class of GPU 6 years ago. I swear, people will dunk on Nvidia for being greedy no matter how they price their GPUs.

It is expensive, though not as expensive as the 4070, but that price was overinflated in the first place. 12 GB of VRAM is quite low.

The 2070's pricing was quite bad as well, which they caught flak for, and which is why we got the Super cards in the first place.

The RTX 5070 has had a power bump from the 4070 and a few more units, but I imagine it won't be a huge jump from the 4070 because of the 192-bit bus.

I have a feeling that's been done deliberately to leave room for a 5070 Super or Ti.

I personally never buy first-release cards, as the Super cards are usually the better, more efficient ones.

I personally will skip Blackwell and wait until the next generation is out, which will most likely be more energy efficient.
 
I don’t see how you’re expecting larger per SM gains than Ada when there’s no new node or clock speed bump. The 4070 is the only card on Ada without a change in SM count or core count and it was 20% faster than a 3070…
Agreed, even the 30% seems very optimistic.
 
I agree. Zero issues with my Sapphire 7900 XT besides the ~30W idle power draw early in 2024, but that's long been corrected. If that was my worst issue... yeah, AMD has come a long way in graphics driver stability. I think their biggest issue is that they have shot themselves in the foot for the longest time in regard to marketing and still do to this day. As some would say "AMD never misses an opportunity to miss an opportunity."

At this point, it's just "nVidia is best" -- who doesn't want that, right?? Like, what price discount, performance advantage, or combination of both does it take for some-to-many gamers to "step down" to the #2 brand? In the world of hi-tech, almost everything is pretty much a two-dog race, and indeed the top dog tends to have a decent lead over #2 (look at Intel vs. AMD CPUs for the longest time; heck, it's still taking AMD years to chip away at the incumbent market leader's share).

That's ok though, green always finds a way to get a new talking point. In the last few years, it's their upscaling tech and in general, their AIAIAIAIAIAIAIAI.
The main issue is people think AMD is after GPU business. They’re not. They’re a CPU company that develops graphics IP strictly to have the best APUs available. The graphics architectures are designed with an APU in mind. They basically only sell discrete cards with beefed-up cache hierarchies to use up their excess wafer supply.
 
The main issue is people think AMD is after GPU business. They’re not. They’re a CPU company that develops graphics IP strictly to have the best APUs available. The graphics architectures are designed with an APU in mind. They basically only sell discrete cards with beefed-up cache hierarchies to use up their excess wafer supply.
Umm, no… they don’t… That kind of assertion requires evidence.
 
The main issue is people think AMD is after GPU business. They’re not.
I would agree to the extent that AMD clearly has shown no interest in gaining market share based on their pricing and release strategies.
They’re a CPU company that develops graphics IP strictly to have the best APUs available.
Perhaps you have information I'm unaware of to back it up, in which case please do share. Based on what's publicly available, the first APU-optimized architecture out of AMD is RDNA 3.5, which was developed off of the work they did with Samsung after licensing RDNA to them. The console APUs have been based on existing architectures, but with features added and/or removed. On the enterprise side, their first APU didn't come until the MI300A.
 
I would agree to the extent that AMD clearly has shown no interest in gaining market share based on their pricing and release strategies.

Perhaps you have information I'm unaware of to back it up, in which case please do share. Based on what's publicly available, the first APU-optimized architecture out of AMD is RDNA 3.5, which was developed off of the work they did with Samsung after licensing RDNA to them. The console APUs have been based on existing architectures, but with features added and/or removed. On the enterprise side, their first APU didn't come until the MI300A.
It’s not “publicly available info”. It’s just obvious in some of AMD’s design trends. One obvious example is giving an architecture iGPU-appropriate base-level cache hierarchies, then slapping an L3 on top for the desktop parts.
 
It’s not “publicly available info”. It’s just obvious in some of AMD’s design trends. One obvious example is giving an architecture iGPU-appropriate base-level cache hierarchies, then slapping an L3 on top for the desktop parts.
So now you are asserting that Nvidia designs their architectures for iGPUs as well? Because AMD’s base cache hierarchy was comparable to Nvidia’s before AMD innovated and came out with Infinity Cache in the RDNA 2 6000 series. That caused Nvidia to significantly increase their L2 cache sizes to compete. So it’s actually not very obvious…
 
The main issue is people think AMD is after GPU business. They’re not. They’re a CPU company that develops graphics IP strictly to have the best APUs available. The graphics architectures are designed with an APU in mind. They basically only sell discrete cards with beefed-up cache hierarchies to use up their excess wafer supply.

Intel has claimed they learned a lot from developing Arc that they can apply to their iGPUs, so you are probably not too far off. I don't agree with the last sentence though.