News: 'The majority of gamers are still playing at 1080p and have no use for more than 8GB of memory': AMD justifies RX 9060 XT's 8GB of VRAM

If 8GB were sufficient "for the majority of gamers who play at 1080p", then a 16GB variant wouldn't exist now, would it?
Eh? Of course it would.
If an 8GB 9060 XT is for the majority, then a 16GB 9060 XT is obviously for the minority, isn't it?
However, anybody who wants to play at 2K or 4K would probably invest in a stronger 16GB card.
Me personally? If I wanted a GPU, this card let me play at 1080p@60, and the price was good, then yeah, sure, I'd get it and game at 1080p. The game is more important anyway.
 
FPS is only part of the picture though.

60 FPS is great, until it is achieved by having textures pop in and out of existence, or with very high frame times while it fetches things from system memory.
The real issue is the 1% lows, which I demonstrated are pretty much the same. Ports of console games to PC that natively use around 14GB of VRAM are a good example of the issue.

TLOUP1 is the best known of them.
 
The real issue is the 1% lows, which I demonstrated are pretty much the same. Ports of console games to PC that natively use around 14GB of VRAM are a good example of the issue.

TLOUP1 is the best known of them.
Granted, but I have seen a few comparisons, and while I don't mind games looking like that (a bit of nostalgia, really), it is certainly a big difference when the driver just removes things to fit the game into VRAM.

I'm still happy at 12GB for now. Trying to hold out for an Intel miracle; though if nothing else, Nvidia again when they get around to doing a node shrink.
 
I don't think people actually care that an 8GB variant exists.

I think people care that an 8GB card costs $299 USD when the Radeon RX 580 8GB launched at $229.

In short, it's the naming and the pricing... They should have called it a 9060 8GB or a 9050 and priced it at $199.

That said, if you are a 1080p eSports gamer, the 9060 XT 8GB is going to be a great "budget" card.
 
I don't think people actually care that an 8GB variant exists.

I think people care that an 8GB card costs $299 USD when the Radeon RX 580 8GB launched at $229.

In short, it's the naming and the pricing... They should have called it a 9060 8GB or a 9050 and priced it at $199.

That said, if you are a 1080p eSports gamer, the 9060 XT 8GB is going to be a great "budget" card.
Sure, it would be nice. I could be wrong, but I think they'd be losing money on every card at those prices.
 
If people will buy them, then why would AMD or Nvidia stop making the 8GB versions? You will not buy a low-tier card for high-fidelity play at 1440p anyway; it just doesn't make sense. Also, if people do not think that future-proofing is a priority, then let them buy the 8GB model. The price jump is so small ($50) that if people do not want the 8GB, they can get the 16GB without breaking the bank. I don't know why everyone huffs and puffs about this: if you don't want 8GB, then don't buy it. It is that simple!
 
I don't think people actually care that an 8GB variant exists.

I think people care that an 8GB card costs $299 USD when the Radeon RX 580 8GB launched at $229.

In short, it's the naming and the pricing... They should have called it a 9060 8GB or a 9050 and priced it at $199.

That said, if you are a 1080p eSports gamer, the 9060 XT 8GB is going to be a great "budget" card.
You have a minor point. But look at the technology in the newer cards: AI, ray tracing, and node shrinkage all play a factor in the price. Also, do not forget that the dollar is more inflated now than when the RX 580 came out. $299 is not that bad considering all I listed above. In my opinion there is nothing to complain about. Let the people who want 8GB buy it; the people who buy the 16GB models are just crying because they like to cry. If there were no such thing as TechTubers doing all these reviews, then there wouldn't be anyone complaining, would there?
 
Disregarding any other arguments by AMD or Tom's commentariat, this is AMD's time to step up or step out. Nvidia's brand is tarnishing fast, and tone-deaf statements such as the one they released (regardless of its accuracy) just go to show how disconnected ALL these companies are. Honestly, if their marketing team does not have their eyeballs glued to enthusiast YouTube channels and enthusiast sites such as Tom's, they are, in fact, doing their job wrong. A pure 16GB model would have been very warmly received regardless of its utility. On that note, so many said the 12GB RTX 3060 was silly; that opinion aged like warm milk, unlike the card.
 
I always love all the people who rush to defend billion- and trillion-dollar companies fleecing people with a poor-value product because not every use case needs the better version. There are large margins built into every GPU made (especially with Nvidia), and the BOM for boards, power delivery, memory, cooling, and display output isn't particularly high. GPU margins passed on to AIBs are the only reason video cards are as expensive as they are now (referring to MSRP, not market pricing). Let's not forget this was part of the reason EVGA got out of the video card business.

The 9060 XT, like the 5060 and 5060 Ti 8GB, undoubtedly has enough rendering power to drive playable frame rates in games that use more than 8GB of VRAM. What AMD and Nvidia are doing is selling a compromised product at $299/$379 MSRP and telling people it's a good deal because it's $50 cheaper than the alternative, all while hoping nobody realizes that the extra 8GB of VRAM and clamshell manufacturing don't cost them $50 in the first place, and ignoring the high margins on their GPUs.

There's absolutely a place for 8GB cards in the market, and they should be there, but not at these price points. Nobody should accept an 8GB card with a higher MSRP than Intel's 12GB one.
 
That's assuming the extra 8 GB can be addressed by the physical GPU. AMD could artificially limit the addressable memory space to 8 GB.

So, a quick class on how these are wired. Each GPU memory channel is 32 bits wide and has a single GDDRAM chip soldered to it. That 32-bit connection is split into two 16-bit lanes so that the GPU can access two separate WORDs (16-bit values) per clock cycle. Currently, GDDR6 memory has a max capacity of 16 Gb (2 GB) per chip; thus a 128-bit bus equals four 32-bit channels equals 8GB of GDDRAM. Now GDDR has a special mode called "clamshell" in which the GPU will instead address four BYTEs (8-bit values) per clock cycle, allowing double the number of GDDRAM chips at half the bandwidth per chip (total bandwidth stays the same).

So not only would you be paying twice as much for GDDRAM on the product, but the product would consume more power and have the same memory performance. This configuration mode exists as a way to double capacity for datacenter and professional-class cards; using it on a lowly entry-level card with a measly 128-bit interface is dubious at best. It's not impossible, just something the market has to demonstrate demand for.
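To put numbers on the arithmetic above, here's a minimal sketch (Python; the 128-bit bus and 2GB-per-chip figures are the ones from this post, treated as assumptions rather than anything off an AMD spec sheet):

```python
# Capacity math for a 128-bit GDDR6 card, per the description above.
BUS_WIDTH_BITS = 128        # assumed: 9060 XT-class memory interface
CHANNEL_BITS = 32           # one GDDR6 chip per 32-bit channel
CHIP_CAPACITY_GB = 2        # 16 Gb (gigabit) = 2 GB per chip, current GDDR6 max

chips = BUS_WIDTH_BITS // CHANNEL_BITS            # 4 chips
normal_vram = chips * CHIP_CAPACITY_GB            # 4 * 2 GB = 8 GB
clamshell_vram = 2 * chips * CHIP_CAPACITY_GB     # two chips share each channel: 16 GB

print(f"{chips} chips: {normal_vram} GB normal, {clamshell_vram} GB clamshell")
```

The same arithmetic gives the 96-bit 3050 6GB mentioned further down: three 32-bit channels times 2GB per chip.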

People keep trying to treat these entry-level cards like they aren't entry-level... and it's kinda weird. Guys, if you see a 128-bit memory bus on a product made in the past decade, immediately put it in the entry-level category.
 
In reality, many markets can't afford a 70 class GPU... especially now!

I think Nvidia was successful in breaking people's brains with the 40-series release and how they reordered the product stack. AMD just resequenced their own cards to better reflect the Nvidia product tier they are competing against. The "new" 60 models of the 40 and 50 series are really entry-level 50 models. The "new" 70 models are more like the older mainstream 60 tier, and it's like that all the way up the stack until you hit 80. This is why you have this massive leap between the "enthusiast" 80 model and the crown 90 / Titan models. People are irrationally getting bent out of shape over manufacturers treating entry-level 50-class cards like entry-level 50-class cards.

All the numbers point to 8GB being appropriate for entry-level cards that are restricted more by memory bandwidth and GPU clocks than by memory capacity.
 
I don't think people actually care that an 8GB variant exists.

I think people care that an 8GB card costs $299 USD when the Radeon RX 580 8GB launched at $229.

In short, it's the naming and the pricing... They should have called it a 9060 8GB or a 9050 and priced it at $199.

That said, if you are a 1080p eSports gamer, the 9060 XT 8GB is going to be a great "budget" card.

So, as above, all the xx"60" models are really "xx50" models; thank Nvidia for that bit of marketing BS.

As for the prices, that's a different problem. The cost of making GPUs has really gone up over the last few generations. There are lots of reasons, and it deserves its own article (hint hint, Tom's), but suffice it to say that the days of "cheap" dGPUs are over. The only "cheap" dGPUs I can find are the older 3050s with 6GB of memory on a 96-bit (three-chip) bus.

https://www.amazon.com/ZOTAC-GeForce-Low-Profile-Graphics-ZT-A30510L-10L/dp/B0D4TRRSDV

https://www.amazon.com/Yeston-Graphics-Express-Desktop-Computer/dp/B0DGT7KKNK

Everything 40 series and newer is just ridiculously more expensive.
 
When playing Starfield @ 3440x1440 at ultra settings, I've never seen more than 9.5GB of VRAM ever get used, and that is the maximum for any given playing session; usually it averages a fraction over 8GB. This is measured with HWiNFO.
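Side note: HWiNFO is GUI-only. If anyone wants to log peak VRAM use from a script instead, here's a minimal sketch using the NVML bindings; it assumes an Nvidia card and `pip install pynvml`, so treat it as illustrative only (AMD cards would need a different sensor source):

```python
# Poll VRAM usage once a second for a minute and track the peak.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

peak = 0
for _ in range(60):
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # .used/.free/.total, in bytes
    peak = max(peak, info.used)
    print(f"VRAM in use: {info.used / 2**30:.2f} GB (peak {peak / 2**30:.2f} GB)")
    time.sleep(1)

pynvml.nvmlShutdown()
```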
 
If you look at it from another angle, it is a good move from Nvidia and AMD to keep game developers' inflated memory use in check. We the consumers may get better-optimized games because of this.
 
If you look at it from another angle, it is a good move from Nvidia and AMD to keep game developers' inflated memory use in check. We the consumers may get better-optimized games because of this.
No, we won't. Games will continue to be held back by it and be as much of a mess as they already are. Development will continue to take longer than it needs to as developers figure out workarounds, and/or games will release semi-broken like TLOU1 and Hogwarts Legacy, only to get band-aids for lower VRAM added after launch.

I will say that at this point a lot of these issues would probably be solved by just releasing games on PC with lower texture quality and putting up a pack for those who have proper amounts of VRAM. Space Marine II basically did this, whether that was the intent or not, and the visual fidelity increased a fair amount with the 4K texture pack.
 
This model uses PCI Express Gen 5 with 16 lanes, so it can better serve the memory needs of the mainstream 1080p gaming segment when data spills over to system RAM. A 16-lane Gen 5 PCIe interface has an aggregate bandwidth of roughly 128 GB/s for the CPU-GPU connection.
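For anyone who wants to check that figure, a quick back-of-the-envelope sketch (assuming PCIe 5.0's 32 GT/s per lane and 128b/130b encoding; the round 64/128 GB/s numbers you usually see simply ignore the encoding overhead):

```python
# Rough PCIe 5.0 x16 bandwidth estimate.
GT_PER_S = 32            # PCIe 5.0 raw rate per lane, gigatransfers/s
ENCODING = 128 / 130     # 128b/130b line-code efficiency
LANES = 16

per_lane = GT_PER_S * ENCODING / 8     # ~3.94 GB/s per lane, each direction
one_way = per_lane * LANES             # ~63 GB/s each direction
aggregate = 2 * one_way                # ~126 GB/s both directions combined

print(f"~{one_way:.0f} GB/s each way, ~{aggregate:.0f} GB/s aggregate")
```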
 
Honestly, if their marketing team does not have their eyeballs glued to enthusiast YouTube channels and enthusiast sites such as Tom's, they are, in fact, doing their job wrong. A pure 16GB model would have been very warmly received regardless of its utility. On that note, so many said the 12GB RTX 3060 was silly; that opinion aged like warm milk, unlike the card.
1) Making the people who do not play but whine on the Internet happy would be bad economics.
2) They are aiming for sales, not for a warm reception from people not buying this product.
 
Having a 16GB variant doesn't inherently imply that 8GB isn't sufficient for the majority. It just means they see another market that they want to address, despite it being a minority. The existence of more expensive/capable/premium products doesn't automatically render all 'lesser' products inadequate.

There may be other, valid arguments as to why 8GB is insufficient, but this ain't it.

Yeah, my car has an engine variant with a turbocharger. Just because it exists doesn't mean that most of the people buying the car actually need it and are being deprived by not having one. The existence of it isn't the problem.
 
So, as above, all the xx"60" models are really "xx50" models; thank Nvidia for that bit of marketing BS.

As for the prices, that's a different problem. The cost of making GPUs has really gone up over the last few generations. There are lots of reasons, and it deserves its own article (hint hint, Tom's), but suffice it to say that the days of "cheap" dGPUs are over. The only "cheap" dGPUs I can find are the older 3050s with 6GB of memory on a 96-bit (three-chip) bus.

https://www.amazon.com/ZOTAC-GeForce-Low-Profile-Graphics-ZT-A30510L-10L/dp/B0D4TRRSDV

https://www.amazon.com/Yeston-Graphics-Express-Desktop-Computer/dp/B0DGT7KKNK

Everything 40 series and newer is just ridiculously more expensive.

Did you only look at Nvidia?

RX 6600

For $240 instead of $200 you get a faster video card with 8GB rather than 6GB. The 3050 gets its ass handed to it by a 6600.
 
Yeah, no.
AMD called Nvidia out ages ago for having 6GB of VRAM... they can't try and act like 8GB is enough at the moment.
Games use a ton of VRAM now, even at 1080p (thank you, oh-so-caring devs and your unoptimized ports), and that's ignoring whether you run multiple other things that use the GPU's VRAM... and heaven forbid you want to use "AI".

The price of 8GB of VRAM is so small compared to the benefit of having it when you need it.

The 60-class community is the biggest globally, like it or not. 60-class cards sell the biggest volume, by far.

I'd personally never touch an 8GB card, even if they gifted me the damn thing, but the world is not about what you or I think. There are millions of other gamers out there.
 
FPS is only part of the picture though.

60 FPS is great, until it is achieved by having textures pop in and out of existence, or with very high frame times while it fetches things from system memory.
The 5-10 fps 1% lows in many games say it all for these 8GB cards. And let's not forget they are being marketed (and priced) as being RT-ready when in fact they are not fit for that purpose at all, and this applies to the 5060 too.

This is where AMD and Nvidia are disingenuous: the naming of these cards. Call them a 9050 and a 5050, cut the price, and no one complains at all. Being called a 9060, it should be 12GB, IMO.
 
Did you only look at Nvidia?

RX 6600

For $240 instead of $200 you get a faster video card with 8GB rather than 6GB. The 3050 gets its ass handed to it by a 6600.
The 3050 6GB is basically the only game in town if you want a good GPU limited to 75W for certain SFF builds, though the Intel Arc Pro B50 could provide an interesting alternative with that magic 16GB of VRAM at 70W, for only $300.

3050 6GB prices are certainly bad, and the RX 6600 is not great either now that it has gone out of production. I think they used to bottom out at $180 (new).