News AMD Brags That Radeon 16GB GPUs Start at $499, Unlike Nvidia

Page 2

Okay. Let's do a VRAM comparison between the two camps, generation by generation.


NVIDIA Generational VRAM Comparison:

GPU Tier       | GeForce 40 Series | GeForce 30 Series | GeForce 20 Series | GeForce 10 Series
90-90 Ti Tier  | 24 GB             | 24 GB             | N/A               | N/A
80-80 Ti Tier  | 16 GB             | 10-12 GB          | 8-11 GB           | 8-11 GB
70-70 Ti Tier  | 12 GB             | 8 GB              | 8 GB              | 8 GB
60-60 Ti Tier  | TBA               | 8-12 GB           | 6-8 GB            | 3-6 GB
50-50 Ti Tier  | TBA               | 8 GB              | 4-6 GB            | 2-4 GB

AMD Generational VRAM Comparison:

GPU Tier  | Radeon 7000 Series | Radeon 6000 Series | Radeon 5000 Series
900 Tier  | 20-24 GB           | 16 GB              | N/A
800 Tier  | TBA                | 16 GB              | N/A
700 Tier  | TBA                | 12 GB              | 8 GB
600 Tier  | TBA                | 8 GB               | 6 GB
500 Tier  | TBA                | 4-8 GB             | 4-8 GB
400 Tier  | TBA                | 4 GB               | N/A
You forgot Radeon VII with 16GB.

But regardless, your point is well made. AMD's marketing department seems to have an awful case of "short-memory-itis" and may need to see a doc about it.

How about not screwing up your launches before you talk smack about the competition, AMD? Jeez...

Regards.
 

peterf28

Distinguished
Nvidia has become for GPUs what Canon is for cameras and Apple is for smartphones... they only thrive because most consumers are non-thinking NPCs.

$200: 6600 XT; Nvidia has the 2060 Super, one tier lower.
$300: 6700 XT; Nvidia has the 3060, one tier lower.
$400: 6800; Nvidia has the 8GB 3070, lol, one tier lower.
$600: 6950 XT; Nvidia has the 4070, one tier lower.
$950: 7900 XTX, which matches the $1,600 4090 in a lot of games... enough said.

People who buy an $800 12GB card from Nvidia today, or even a $1,200 16GB one, are completely clueless. For $900 AMD today offers better raster performance than both, plus 24GB; it's actually mind-blowing that people buy based on marketing rather than real value.

I'm worried about buying AMD hardware: exploding, burning CPUs, and GPUs with driver problems, freezing, crashing, etc. That is what I think of AMD.
 

klavs

Proper

Okay. Let's do a VRAM comparison between the two camps, generation by generation.


NVIDIA Generational VRAM Comparison:

GPU Tier       | GeForce 40 Series | GeForce 30 Series | GeForce 20 Series | GeForce 10 Series
90-90 Ti Tier  | 24 GB             | 24 GB             | N/A               | N/A
80-80 Ti Tier  | 16 GB             | 10-12 GB          | 8-11 GB           | 8-11 GB
70-70 Ti Tier  | 12 GB             | 8 GB              | 8 GB              | 8 GB
60-60 Ti Tier  | TBA               | 8-12 GB           | 6-8 GB            | 3-6 GB
50-50 Ti Tier  | TBA               | 8 GB              | 4-6 GB            | 2-4 GB

AMD Generational VRAM Comparison:

GPU Tier  | Radeon 7000 Series | Radeon 6000 Series | Radeon 5000 Series
900 Tier  | 20-24 GB           | 16 GB              | N/A
800 Tier  | TBA                | 16 GB              | N/A
700 Tier  | TBA                | 12 GB              | 8 GB
600 Tier  | TBA                | 8 GB               | 6 GB
500 Tier  | TBA                | 4-8 GB             | 4-8 GB
400 Tier  | TBA                | 4 GB               | N/A
You need to add a "40-40 Ti Tier" row to the Nvidia table and enter N/A; otherwise the two tables are misleading to compare.
Aligning the columns might also be helpful, which could be done by merging the two tables into one.
 
You forgot Radeon VII with 16GB.

But regardless, your point is well made. AMD's marketing department seems to have an awful case of "short-memory-itis" and may need to see a doc about it.

How about not screwing up your launches before you talk smack about the competition, AMD? Jeez...

Regards.

Yes, I didn't include the Radeon VII because it was mainly a prosumer card. It wasn't explicitly designed for gaming, but it was indeed a very powerful and capable card.

It produced incredible frame rates in most AAA titles, with the added benefit of being incredibly powerful for productivity tasks like graphics and video editing, CAD/CAM, professional visualization and other similar workloads.

I think there are still gamers using this card, since it packs a lot of horsepower with its 2.0 Gbps HBM2 memory, 4096-bit interface and, of course, 16GB of VRAM. The only downside is the high power consumption.
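For anyone wondering how much bandwidth those specs add up to, here's a rough back-of-the-envelope sketch using just the two advertised numbers above (2.0 Gbps per pin, 4096-bit bus); an illustration, not an official figure:

```python
# Back-of-the-envelope peak memory bandwidth for the Radeon VII,
# from its advertised HBM2 specs: 2.0 Gbps per pin on a 4096-bit bus.
data_rate_gbps_per_pin = 2.0   # effective transfer rate per pin
bus_width_bits = 4096          # combined width of the four HBM2 stacks

bandwidth_gb_s = data_rate_gbps_per_pin * bus_width_bits / 8  # bits -> bytes
print(f"Peak memory bandwidth: ~{bandwidth_gb_s:.0f} GB/s")   # ~1024 GB/s
```

That works out to roughly 1 TB/s, which helps explain why the card still holds up in productivity workloads.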
 

Amdlova

Distinguished
Don't the Microsoft Xbox Series S and X have different download versions of their games? Can't PCs do something like that? When people download, let them choose the right texture pack and stop this madness... I'm not a gamer anymore; I just play some old titles and do some work with the machine...
I want a new graphics card only for the AV1 codec, but at these prices I'll just buy a 12100 and use the Intel integrated graphics.
 

sherhi

Distinguished
I don't think the 4GB statement came back to haunt them; it was kind of true, just as the currently trending (after the 4070 fiasco) "8GB is not enough for the future" is. Releasing a 4GB low-end card does not contradict their statement; it's a market, and they released a product for less demanding consumers.

Badly optimized games are one thing, but many studios are starting to use more and more of the potential of current-gen consoles, which have more than 8GB, so I would say AMD knows well what they are talking about.

There are many good tests showing whether you need more than 8GB of VRAM and more than 6 CPU cores for current games, and yes, for maybe 90-95% of them you are fine with a 6-core/8GB combo, but that doesn't tell you anything about future-proofing. Buying a 4070 (which is what, an upper mid-tier card?) to barely play at medium settings in some new games like Hogwarts Legacy or similar? Not okay, IMO. I think Vanguard makes better use of 8 cores as well? Not sure... but the trend is there, and it doesn't take a genius to see it.

AMD is not wrong, but they would be better understood if they phrased it more like I did instead of via some half-assed tweets.
 

abufrejoval

Reputable
I'd love to give AMD GPUs another spin, but I'm afraid I'd be stopped at the first hurdle: their drivers refusing to work on my primary 24x7 system, which just happens to run Windows Server 2022, essentially a very nice Windows 10 with even less of the phoning-home stuff (or a Windows Store) that plagues even the Enterprise desktop variants.

AMD has refused to support Windows server OSs for decades now, and I am getting very, very tired of it, especially since in my case it's running on a Ryzen 7 5800X3D because I want ECC RAM support and Intel couldn't deliver at all, or not at reasonable price points. It's been a bane since Richmond APU times and Server 2003, requiring extra effort to make things work anyway.

Neither Nvidia nor Intel pull this nonsense with their GPU or chipset drivers (NICs excepted), and while I know that Microsoft charges extra for server driver signatures, I doubt it would ruin AMD.
 

Warrior24_7

Distinguished
The ONLY people :tonguewink: enough to buy an AMD GPU are AMD fanboys; everybody else knows better. This isn't just my opinion, the market bears it out as well! Nvidia "commands" over 90% market share. That's not competition, it's a beatdown. AMD is not an Nvidia competitor. You just get a better, more reliable product from Nvidia than you do from AMD! AMD's brand-new CPUs are burning up! Quality control is horrible over there. I stay far, far away.
 

rluker5

Distinguished
I don't think the 4GB statement came back to haunt them; it was kind of true, just as the currently trending (after the 4070 fiasco) "8GB is not enough for the future" is. Releasing a 4GB low-end card does not contradict their statement; it's a market, and they released a product for less demanding consumers.

Badly optimized games are one thing, but many studios are starting to use more and more of the potential of current-gen consoles, which have more than 8GB, so I would say AMD knows well what they are talking about.

There are many good tests showing whether you need more than 8GB of VRAM and more than 6 CPU cores for current games, and yes, for maybe 90-95% of them you are fine with a 6-core/8GB combo, but that doesn't tell you anything about future-proofing. Buying a 4070 (which is what, an upper mid-tier card?) to barely play at medium settings in some new games like Hogwarts Legacy or similar? Not okay, IMO. I think Vanguard makes better use of 8 cores as well? Not sure... but the trend is there, and it doesn't take a genius to see it.

AMD is not wrong, but they would be better understood if they phrased it more like I did instead of via some half-assed tweets.
There is no such thing as 100% future-proof.
There will always be games like Shadow of Mordor with its ultra textures, or Crysis 2 in DX11, or The Witcher 3 with HairWorks, or anything with path tracing, or Jedi: Survivor.
And the consoles have 16GB shared between the CPU and GPU. A game may need more than 8GB of VRAM sometimes, but will it in games that also make heavy use of the CPU? How long have games needed more than 8GB of system RAM? Some poorly optimized ones are starting to need up to 32GB; that would leave a console with −16GB of VRAM. The 16GB consoles are what is protecting the 8GB VRAM cards.
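To make that concrete, here's a rough, purely illustrative budget for a 16GB shared-memory console. Only the 16GB total is an actual spec; the OS reserve and the CPU-side figure below are assumptions picked for the sketch, not published platform numbers:

```python
# Rough, illustrative memory budget for a 16GB shared-memory console.
# Only the 16GB total is a real spec; the OS reserve and CPU-side figure
# are assumptions for this sketch, not published platform numbers.
total_memory_gb = 16.0
os_reserve_gb = 2.5        # assumed OS/system reservation
game_cpu_side_gb = 5.0     # assumed game logic, audio, streaming buffers, etc.

available_to_game_gb = total_memory_gb - os_reserve_gb
gpu_style_budget_gb = available_to_game_gb - game_cpu_side_gb
print(f"Available to the game:    {available_to_game_gb:.1f} GB")  # 13.5 GB
print(f"Left for GPU-style data:  {gpu_style_budget_gb:.1f} GB")   # 8.5 GB
```

Under those assumptions, a console title that stays in budget only has around 8-9GB of "VRAM-like" data to begin with, which is roughly where the PC 8GB debate sits.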

With shared system memory, DX11.2 tiled resources, and now DirectStorage, needing more than 8GB of VRAM is more evidence of poor optimization than of the hardware lacking the resources to put out a good image.

There will never be future-proofing against poorly made games, or against games that are altered to favor the latest product.
 

watzupken

Reputable
While I totally agree that AMD has a VRAM advantage over Nvidia, I still cannot forget when AMD released the awful RX 6500 XT with 4GB of VRAM. At that point, they also seemed very proud that the lack of VRAM would deter miners. Less than two years after its release, the RX 6500 XT is no longer fit for purpose.
 
Apr 1, 2020
TechPowerUp posted this article yesterday, and I'll borrow a couple of charts. While VRAM is important, it's only part of the equation. Look at the 3070 vs. the 6700 XT, for example. Despite having less VRAM than the game will use if unrestricted, the 3070 performs better than a card with more VRAM than required. Reviews show those two cards are effectively equal in power, so VRAM usage isn't the deciding factor here.

[Chart: VRAM usage]

[Chart: Performance at 3840x2160]

TechPowerUp - Star Wars Jedi: Survivor Benchmark

I wonder how much of the VRAM usage, especially in extreme cases like the Resident Evil 4 remake, is simply caused by code keeping assets in VRAM instead of paging them out, because the developers don't want people with SATA SSDs or, unthinkably in 2023, HDDs, to complain about performance hiccups, while other games, like SWJ:S, are able to deal with it.
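A quick, hypothetical streaming-time estimate shows why a developer might make that call; the 2GB asset size and the throughput figures below are round numbers assumed for illustration, not measurements from any particular game:

```python
# Rough streaming-time estimate for a hypothetical 2GB texture set, to show why
# keeping assets resident in VRAM beats paging them in from slower storage.
# Throughput values are typical interface ballparks, not measured game data.
asset_size_mb = 2048
throughput_mb_s = {"HDD": 150, "SATA SSD": 550, "PCIe 4.0 NVMe SSD": 7000}

for device, speed in throughput_mb_s.items():
    print(f"{device:>17}: ~{asset_size_mb / speed:.1f} s to stream in")
# HDD ~13.7 s, SATA SSD ~3.7 s, NVMe ~0.3 s -- on slow storage, on-demand
# paging stalls long enough to cause visible hitches, so assets stay resident.
```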
 

tvargek

Commendable
It's interesting how neither of them is ready to fill the real hole in the market: a €250 card with 3070 Ti-class specs and 12GB of memory. Whoever does that will be king of the average gamer.
 
Oh man, this is just cringe-worthy. Sure, the RX 6000-series is perhaps the most underrated Radeon generation in history but somehow, AMD managed to screw up the following generation.

AMD brags about how their 16GB cards start at $500 and sure, they do, but they only do NOW because they sure as hell didn't 12 short months ago.

Considering just how badly they botched the RX 7000-series launch, they should be keeping their crowing to themselves. If I were Lisa Su, Sasa Marinkovic would've been made unemployed as soon as I saw what he wanted to do with the RX 7000-series of cards. The cynical naming of the RX 7900 XTX and RX 7900 XT, the lack of any new releases over the last six months and that disastrous launch event wouldn't have had me questioning his talents, they would've had me denying them outright.

I buy Radeon cards because they have superior value, they're fantastic products but also because AMD is the lesser of three evils (the other two being Intel and nVidia). Now, with the botched release of RDNA3, not only are they going to release the RX 7600 before the RX 7700 and RX 7800, they're only giving it 8GB. This basically means that a card with similar performance to their 10/12GB RX 6700/XT is only going to have 8GB of VRAM. This alone means that Sasa should be keeping his big mouth shut.

I don't know what happened but AMD's marketing of Ryzen, RDNA1 and RDNA2 was fantastic. Now, they're acting like a bunch of insecure high school jocks.

They should just shut their mouths and let the tech press do the talking for them. It would be better for them that way.
 
Nvidia has become for GPUs what Canon is for cameras and Apple is for smartphones... they only thrive because most consumers are non-thinking NPCs.

$200: 6600 XT; Nvidia has the 2060 Super, one tier lower.
$300: 6700 XT; Nvidia has the 3060, one tier lower.
$400: 6800; Nvidia has the 8GB 3070, lol, one tier lower.
$600: 6950 XT; Nvidia has the 4070, one tier lower.
$950: 7900 XTX, which matches the $1,600 4090 in a lot of games... enough said.

People who buy an $800 12GB card from Nvidia today, or even a $1,200 16GB one, are completely clueless. For $900 AMD today offers better raster performance than both, plus 24GB; it's actually mind-blowing that people buy based on marketing rather than real value.
Oh there's no shortage of clueless people. Just look at how many paid more than the cost of an RX 6800 XT for the RTX 3060 Ti, RTX 3070 and RTX 3070 Ti. It's like paying more for a Chevy Camaro than a Nissan GT-R. It's just plain stupid.
 
Oh man, this is just cringe-worthy. Sure, the RX 6000-series is perhaps the most underrated Radeon generation in history but somehow, AMD managed to screw up the following generation.

AMD brags about how their 16GB cards start at $500 and sure, they do, but they only do NOW because they sure as hell didn't 12 short months ago.

Considering just how badly they botched the RX 7000-series launch, they should be keeping their crowing to themselves. If I were Lisa Su, Sasa Marinkovic would've been made unemployed as soon as I saw what he wanted to do with the RX 7000-series of cards. The cynical naming of the RX 7900 XTX and RX 7900 XT, the lack of any new releases over the last six months and that disastrous launch event wouldn't have had me questioning his talents, they would've had me denying them outright.

I buy Radeon cards because they have superior value, they're fantastic products but also because AMD is the lesser of three evils (the other two being Intel and nVidia). Now, with the botched release of RDNA3, not only are they going to release the RX 7600 before the RX 7700 and RX 7800, they're only giving it 8GB. This basically means that a card with similar performance to their 10/12GB RX 6700/XT is only going to have 8GB of VRAM. This alone means that Sasa should be keeping his big mouth shut.

I don't know what happened but AMD's marketing of Ryzen, RDNA1 and RDNA2 was fantastic. Now, they're acting like a bunch of insecure high school jocks.

They should just shut their mouths and let the tech press do the talking for them. It would be better for them that way.

It's because recent console-optimized games like "The Last of Us" and "Hogwarts Legacy" are highlighting the need for more VRAM, since the PS5 and Xbox Series X have 16GB of shared RAM. Developers heavily optimized these games to run on those consoles, which are able to step over an 8GB VRAM allocation. They don't have to strictly tune ground clutter density, texture resolution and the number of textures per scene to stay under 8GB on consoles; if it works at 9 or 10GB, that's fine, as long as the system doesn't need that memory for the CPU side.

In the past, PCs have always had more RAM and VRAM than their console cousins. That's not really the case with these latest consoles, which pack around 10 teraflops of performance with 16GB of GDDR6 shared between the GPU and system memory.

For PC gamers, 32GB of system RAM and 16GB of VRAM should be the minimum today to play anything at 1440p with high graphics settings. Otherwise, you should relegate yourself to being a console peasant, lol.
 
It's because recent console-optimized games like "The Last of Us" and "Hogwarts Legacy" are highlighting the need for more VRAM, since the PS5 and Xbox Series X have 16GB of shared RAM. Developers heavily optimized these games to run on those consoles, which are able to step over an 8GB VRAM allocation. They don't have to strictly tune ground clutter density, texture resolution and the number of textures per scene to stay under 8GB on consoles; if it works at 9 or 10GB, that's fine, as long as the system doesn't need that memory for the CPU side.

In the past, PCs have always had more RAM and VRAM than their console cousins. That's not really the case with these latest consoles, which pack around 10 teraflops of performance with 16GB of GDDR6 shared between the GPU and system memory.

For PC gamers, 32GB of system RAM and 16GB of VRAM should be the minimum today to play anything at 1440p with high graphics settings. Otherwise, you should relegate yourself to being a console peasant, lol.
Yeah, but they're still releasing the RX 7600 with 8GB of VRAM less than a month after saying this. That card is DOA if it costs even $1 more than $250 because it'll have RX 6700/XT levels of performance with the demonstrably problematic 8GB frame buffer instead of the RX 6700's 10GB or the RX 6700 XT's 12GB, cards that sell for $280 and $320 respectively.

Whoever this Sasa Marinkovic fool is, he has only demonstrated that he is unfit to be the senior director of marketing because all he has successfully achieved is making AMD look terrible. Lisa should seriously fire his a$$ and immediately!
 
Yeah, but they're still releasing the RX 7600 with 8GB of VRAM less than a month after saying this. That card is DOA if it costs even $1 more than $250 because it'll have RX 6700/XT levels of performance with the demonstrably problematic 8GB frame buffer instead of the RX 6700's 10GB or the RX 6700 XT's 12GB, cards that sell for $280 and $320 respectively.

Whoever this Sasa Marinkovic fool is, he has only demonstrated that he is unfit to be the senior director of marketing because all he has successfully achieved is making AMD look terrible. Lisa should seriously fire his a$$ and immediately!

The 8GB VRAM problem only started to surface in the last three months, with Hogwarts Legacy released in February and The Last of Us in April. Guaranteed, the 7600's design specs were locked in well before that. Marketing is just reacting to the current PC gaming controversy. Moving forward, I'm sure they'll add more VRAM, but so will Nvidia.
 
The 8GB VRAM problem only started to surface in the last three months, with Hogwarts Legacy released in February and The Last of Us in April. Guaranteed, the 7600's design specs were locked in well before that. Marketing is just reacting to the current PC gaming controversy. Moving forward, I'm sure they'll add more VRAM, but so will Nvidia.
I'd like to correct you there. The 8GB debacle has been going on for a good while. It just became a "trending topic", or at least a relevant one, because of how badly the 3070 dropped frames in Hogwarts Legacy and how ironic and ludicrous it was for ray tracing to perform better on AMD once the VRAM was saturated. That just spread like wildfire, for some reason.

To put it differently, and not even talking about modding: when the 3080 was released there were many people, very vocal ones I may add, who flat out said Far Cry 6 just looked like crap on the 10GB model with high-res textures, because the engine was doing what happens in Hogwarts Legacy now: dropping quality to keep performance. Despite that, no one really went back to the graphs and looked closer at whether the quality was on par across both GPU families. If we include modding, then Skyrim has made it abundantly clear that 8GB was not going to cut it for long. Less mainstream still: VR gaming.

And as a hard tangent: that is one of the biggest reasons why automating the testing of games is a terrible idea.

Regards.
 