News: 'The majority of gamers are still playing at 1080p and have no use for more than 8GB of memory': AMD justifies RX 9060 XT's 8GB of VRAM

The 3050 6GB is basically the only game in town if you want a good GPU limited to 75W for certain SFF builds, although the Intel Arc Pro B50 could provide an interesting alternative with that magic 16 GB of VRAM at 70 W for only $300.

3050 6GB prices are certainly bad, and RX 6600 prices are not great either now that it has gone out of production. I think they used to bottom out at $180 (new).

If that's the goal and one doesn't mind a significant performance hit, then it makes sense. I don't think there is a huge market with those requirements, though.
 
The 60-class community is the biggest globally, like it or not. 60-class cards sell the biggest volume, by far.

I'd personally never touch an 8 GB card, even if they gifted me the damn thing, but the world is not about what you or I think. There are millions of other gamers out there.
I won't speak for anyone else, but even though I'm not buying 60-class cards, I think those who do should be getting more for their money. As Palladin pointed out above, Nvidia shifted the stack one tier over. Now people are paying 60-class money for 50-class cards, and that's just awful. I absolutely understand why we don't have a thriving $200-and-below market, but that's no excuse for what's being done.
 
If one company makes a certain class of product, and it sells, then I don't blame a second company for making a competing product. In this case, it would be a poor business choice for AMD to NOT make a competitor for Nvidia's 5060. Sure, it's not a GPU that I would ever be interested in, but no company in their right mind would pass up an opportunity for market share just because some people consider it a sub-standard product.

After all, Chevy Sonics and Lamborghinis exist in the same world... why should anyone complain or insist that the GPU market be different? If you want more power, it costs more - plain and simple. Doesn't mean the slow version doesn't have its place in the market.
 
I would venture that the Venn diagram of "people who understand what an x50 or x60 GPU typically represents in terms of relative performance in a generation" and "people who understand that in certain scenarios, especially with AAA games, there are limitations for the 8 GB version" is basically a circle.

If Nvidia simply subtracted 10 from the names of every GPU in the current product line, to make the relative positioning more consistent with past lines, nobody would be any happier. Yes, historically the 5070 would be a 5060, but then nobody's going to be happy to "pay $600 for a 60-class card." But if Nvidia did the opposite and kept the names of the GPUs the same while lopping off a third of the price, nobody would be disgruntled that their still-misnamed 5060 has an MSRP of around $250 instead of $379.

The fundamental thing that we're all unhappy about is pricing, not naming. I prefer paying less for things, too. Give me RTX 5080 performance at $700, and you can name it the GeForce Fartunicorn HD 5523 Skidoo, and I wouldn't care about the naming.

On a basic level, a highly sought-after product with a limited supply that's actively being scalped is going to see the regular prices trending upwards aggressively, because the source of the products wants the money to go into their pockets rather than the middleman's.
 
I will have to say it again: while this might still be mostly true, it's not gonna last for very long. New releases are more and more VRAM hungry and some are literally unplayable with 8 GB at 1080p. Here is an example:

https://www.pcgamesn.com/nvidia/geforce-rtx-5060-first-benchmarks
I think that is more of a driver issue or the 128-bit memory bus.
RTX 2080 8GB card. That is the card in my PC right now and I have no problems; it plays everything I play at 1440p.

View: https://www.youtube.com/watch?v=lgKhTxgQs3o
 
The 5-10 fps 1% lows in many games say it all for these 8GB cards. And let's not forget they are being marketed (and priced) as RT ready when in fact they are not fit for that purpose at all, and this applies to the 5060 as well.

Where they and Nvidia are disingenuous is in the naming of these cards. Call them a 9050 and 5050, cut the price, and no one complains at all. Being called a 9060, it should be 12GB IMO.
So cut the price to the point where you can't make any money, and people will still complain on the internet. It's what people do.
If you don't like the card, don't buy it. Fairly simple.

Like it or not, it's about making money, not just giving stuff away; the naming has nothing to do with anything.
If you're buying a card, do research and buy what you need.
 
AMD has two problems: the market (nobody bought their GPUs when they were superior bang for buck, particularly for VRAM amounts) and that they’re a business (charge what the market will bear).

There is no evidence to show that AMD behaves better than nVidia when they’re doing well. And one person’s “doing right by others” is another’s “wasted money”.

I’m inclined to see AMD as striving for nVidia’s income and thus mimicking nVidia’s behaviour. Would Intel be any better? See all Intel behaviour of any kind over the past decades when they’re doing well…

Even so! I try to support the underdog, unless second place is playing in a different (lesser) league. That’s why I bought the 4070 (RT is a big deal for me) and why, were I in the market this generation, I would probably get a 9070XT. And if a good Celestial card rears its head, it will be in contention for my video card needs when I do upgrade next year, +/-.
 
To be fair, most of the comments here are from home builders, and that demographic is not typically going to be buying 8 GB GPUs if they can avoid it, unless they are building something predominantly for other uses (a secondary PC, for the wife or kids, that kind of thing) and light gaming.
The pre-built/OEM vendors will be using all of these they can get their mitts on, because that is a somewhat different market.
I am not going to knock AMD for making a GPU configuration that will sell like hotcakes, whether I'd like one or not. I'm not a bean counter, but they have a whole floor of cubicle guys who are, and they will tell them what is worth doing.
 
If all you're doing is eSports, an 8G card like this is fine, so there is a niche for it. My only complaint is they list it as the same model as the 16G version. That creates a very real likelihood that someone who isn't Ninja gets it and ends up very disappointed trying to play Hogwarts.

Make this an RX 9060, and the 16G an RX 9060 XT.

To be clear, I have the exact same complaint with team green.
 
Can't we just admit yet that everyone is actively trying to screw the consumer as a means of subsidizing AI hardware for high paying customers?

No provider anywhere in the chain is innocent; you just have bizarre fanboy cults in denial that try to bully other people into accepting the "new normal"..... and the more money you are willing to spend, the easier it is to stay in denial.

It's the same old story, head in the sand until it finally breaches your individual bubble and SUDDENLY IT'S A CRISIS.... as if no one ever mentioned it before.
 
I think that is more of a driver issue or the 128-bit memory bus.
RTX 2080 8GB card. That is the card in my PC right now and I have no problems; it plays everything I play at 1440p.

View: https://www.youtube.com/watch?v=lgKhTxgQs3o
Both the 5060 and the 5060 Ti have a 128-bit memory bus, so it's not that. And I would be really surprised if the driver was causing such a huge difference between those two cards. And all the 8 GB cards the PCGamesN guy tested behave similarly, the AMD one included. And he tested them at ultra. The guy in your video uses medium settings, as pointed out by thestryker. And even then, the VRAM usage in your video is constantly close to 8 GB.

This shows that new games are more and more VRAM hungry, especially if you want high-quality textures.
 
1) Making people who do not play but whine on the Internet happy would be bad for the economy
2) They are aiming for sales, not for a warm reception from people who aren't buying this product
You missed the entire point. Those outlets are marketing tools. They need to use them (properly).

(Edit: supporting data)

Gamers Nexus has 2.45M subscribers, Hardware Unboxed has 1.11M subscribers, JayzTwoCents has 4.2M fricking subscribers. These channels are the mainstream, so even without counting the smaller outlets, and after accounting for those who sub to all three, it's highly likely that these viewers account for a majority portion of the consumer GPU market. This is a powerful marketing force to have on their side. Release a product that these guys recommend and see the sales climb. How do you think the X3D CPU line became so popular? It wasn't prime-time advert spots on daytime television, that's for sure. It was mass coverage on YouTube and media like Tom's.
 
I think that is more of a driver issue or the 128-bit memory bus.
RTX 2080 8GB card. That is the card in my PC right now and I have no problems; it plays everything I play at 1440p.

View: https://www.youtube.com/watch?v=lgKhTxgQs3o

It's a settings issue, specifically texture sizes. I've explained it in the past and the usual suspects just kinda ignore it in favor of their own flawed idea of how modern graphics frameworks work.

It's impossible to run out of graphics memory on any OS newer than Windows 10, and virtually impossible to do so since Windows 7. Graphics VRAM isn't treated as a separate resource; instead it is used more like a resource cache local to the GPU. Graphics frameworks load resources into and out of VRAM ahead of time, so we really need to understand what is going on at any one moment in time. A game consistently trying to display more on the screen than it has VRAM for will produce an incredibly noticeable stuttering effect that is absolutely measurable. The easiest way to fix this is to reduce texture sizes, because "Ultra" is stupidly oversized most of the time, especially since most textures are just upscaled from a lower resolution before being downscaled to form mipmaps.

This is basically how a texture is stored:

[Image: example_mipmap_chain-1.png]


The base texture is what you set with "texture quality", and the lower-level mipmaps are then generated from it. 2K textures take up about 25% of the space of 4K textures, and some games are now claiming they have "8K" textures, which are 400% the size of 4K textures and 1600% the size of 2K textures.
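To put rough numbers on that scaling, here is a minimal sketch (my own illustration, not from the post: it assumes uncompressed RGBA8 at 4 bytes per texel and ignores block compression, which shrinks everything by a constant factor but keeps the same 4x-per-step growth) of what a base texture plus its full mip chain costs:

Code:
# Rough texture-memory estimate: base level plus the full mip chain.
# Assumes uncompressed RGBA8 (4 bytes per texel); real games use block
# compression (BCn/ASTC), which scales everything down by a constant factor.

def mip_chain_bytes(size: int, bytes_per_texel: int = 4) -> int:
    """Total bytes for a square size x size texture plus all of its mips."""
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total

for label, size in [("2K", 2048), ("4K", 4096), ("8K", 8192)]:
    mib = mip_chain_bytes(size) / (1024 ** 2)
    print(f"{label} ({size}x{size}): ~{mib:.0f} MiB including mips")

# Prints roughly 21 MiB for 2K, 85 MiB for 4K, and 341 MiB for 8K: each step
# up quadruples the cost, while the mip chain itself only adds about a third.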

Since most of you are not playing with the camera zoomed into a wall at point blank, raw textures are rarely displayed on the screen; one of their lower-level mipmaps is used instead. But that determination cannot be made until render time, meaning most of the texture data in memory is completely useless.

This nVidia article walks through it:

https://developer.nvidia.com/gpugem...rendering/chapter-28-mipmap-level-measurement


[Image: 28_mipmap_01a.jpg]


In that scene, the yellow represents the texture rendered at "level 0", which is raw, while all the other colors are the various lower-level mipmaps of that texture. This is the one place where screen space, aka display resolution, really matters. Someone at 1080p simply does not have the screen space for a 4K or 8K texture to ever be rendered at level 0; it will always be rendered as a lower-resolution variant. Similarly, someone at 1440p will never see an 8K texture rendered at level 0 and will rarely see a 4K texture rendered at level 0. It's not until we get to 2160p that 8K even becomes possible and 4K becomes reasonable.
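As a back-of-the-envelope illustration (a simplified sketch of my own, not what the hardware literally does: real GPUs derive the mip level per pixel from screen-space derivatives, and anisotropic filtering complicates it further), the mip level that ends up being sampled roughly follows from how many texels of the base texture map onto one screen pixel:

Code:
import math

# Hypothetical helper: approximate mip level from texels-per-pixel along one
# axis. Level 0 means the full-resolution base texture is actually sampled.
def approx_mip_level(texture_size: int, screen_pixels_covered: int) -> int:
    texels_per_pixel = texture_size / max(screen_pixels_covered, 1)
    return max(0, round(math.log2(texels_per_pixel)))

# A wall texture spanning half the width of a 1080p display (960 pixels):
for label, size in [("2K", 2048), ("4K", 4096), ("8K", 8192)]:
    print(f"{label}: mip level {approx_mip_level(size, 960)}")

# 2K lands at mip 1, 4K at mip 2, 8K at mip 3: at 1080p the 4K and 8K base
# levels never reach the screen, only their downscaled mips do.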

People wanting to play at comfortable frame rates at 2160p are not going to be buying an entry-level dGPU with 8GB of VRAM. People playing at 1080p / 1440p are not going to get anything out of "Ultra" texture sizes, and turning them down one or two notches is perfectly acceptable.
 
1080p is a dead resolution. Any attempt to sell a new GPU for $300+ for 1080p should be ignored. I remember AMD talking about 1440p gaming on the AMD 7950 over a decade ago. It's stagnation. The VRAM, bus width, and small die sizes are all to blame.
 
1080p is a dead resolution. Any attempt to sell a new GPU for $300+ for 1080p should be ignored. I remember AMD talking about 1440p gaming on the AMD 7950 over a decade ago. It's stagnation. The VRAM, bus width, and small die sizes are all to blame.
And yet the overwhelming majority of real people still use it..... because higher-res monitors are very very very expensive, and the CPU+mobo and GPU+PSU combos required for higher resolutions are also very very very expensive.... and every year it's getting even more expensive, not less.

Yes, higher-res monitor prices have fallen.... slightly. They are slightly more affordable.... this does not make them affordable to most people globally.

It's all fine and dandy to WANT a resolution to be "dead".... it's another thing for it to actually be. It's the cost to the consumer, as well as stagnating wage patterns, that is artificially stagnating growth. Don't be blinded by whatever bubble you live in, because it's going to pop eventually.
 
People wanting to play at comfortable frame rates at 2160p are not going to be buying an entry-level dGPU with 8GB of VRAM. People playing at 1080p / 1440p are not going to get anything out of "Ultra" texture sizes, and turning them down one or two notches is perfectly acceptable.
Unless you can properly categorize the settings for the masses, all of this is moot. All you have is everyone confused as to why something looks good or bad, because at this point you need to take an extended course to understand anything, and it's constantly changing anyway.

All the common man can do is push everything to the highest and hope for the best because he has other things to do than to become an expert in a field of moving goal posts.

At best this is a case of experts not understanding that everyone does not share their expertise and genuinely not understanding why everyone is upset at them.... at worst it's simply taking the masses for as long a ride as possible to get as much from them as possible before they notice. It did not use to be this way.... it does not help that 4K is an impossible target to hit without resorting to magic tricks and hoping no one cares, and 1440p being an obviously transitionary middle ground target to distract from this the same way 720p was.

Ironically, 1440p could become a dead resolution once 4K gets figured out, and 1080p will remain the same way 1024p remained (and is still useful sometimes) for as long as it did.
 
1440p being an obviously transitionary middle ground target
It's not, particularly as 1600p/1440p have been around in volume-level displays for around two decades (there were lots of random-resolution single-SKU types during this time, but I wouldn't really count those since they're more one-offs). 1440p mostly supplanted 1600p with the shift away from 16:10 to 16:9, rather than being a middle ground to bridge the gap to 4K. That goes back to the era when monitors were still driving panel technology, since they switched to LCD in volume before TVs did.
 
And yet the overwhelming majority of real people still use it..... because higher-res monitors are very very very expensive, and the CPU+mobo and GPU+PSU combos required for higher resolutions are also very very very expensive.... and every year it's getting even more expensive, not less.

Yes, higher-res monitor prices have fallen.... slightly. They are slightly more affordable.... this does not make them affordable to most people globally.

It's all fine and dandy to WANT a resolution to be "dead".... it's another thing for it to actually be. It's the cost to the consumer, as well as stagnating wage patterns, that is artificially stagnating growth. Don't be blinded by whatever bubble you live in, because it's going to pop eventually.

Don't know what you consider expensive, but 1440p and even 4K monitors aren't that bad. GPUs to make the best use of them can be. It sounds like you live in a place where higher-end hardware is very expensive, and I am sorry if that is true.
 
1080p is a dead resolution. Any attempt to sell a new GPU for $300+ for 1080p should be ignored. I remember AMD talking about 1440p gaming on the AMD 7950 over a decade ago. It's stagnation. The VRAM, bus width, and small die sizes are all to blame.
It's the most common gaming resolution by a very wide margin. 55.25% is 1080p, with 1440p being the next most common at 19.90%, less than half that. The most common VRAM amount is 8GB, and the most common system RAM is 16GB with six-core CPUs. 2160p is less than 5%, same with higher-end cards.

This is where echo chambers have people convinced that everyone drives a Mercedes to work and their Lamborghini on the weekends.
 
All the common man can do is push everything to the highest and hope for the best because he has other things to do than to become an expert in a field of moving goal posts.

This is fundamentally dumb. It's like saying all people can do is mash the accelerator to the floor because they can't be bothered to learn how to drive.

You do not need to learn to be an expert, just use a little common sense. Social media has distorted people's views into thinking they are inferior if they aren't running everything at top settings, even if those settings don't do much for them. If someone is like the majority of PC gamers, they'll be running on high or less.
 
This is fundamentally dumb. It's like saying all people can do is mash the accelerator to the floor because they can't be bothered to learn how to drive.

You do not need to learn to be an expert, just use a little common sense. Social media has distorted people's views into thinking they are inferior if they aren't running everything at top settings, even if those settings don't do much for them. If someone is like the majority of PC gamers, they'll be running on high or less.
The point is the settings are so complicated people don't know what they do.... so they assume higher is better and that's that.... it's also partly why HDR is just not a thing yet.... people don't know you need special monitors for it, and that info is only now starting to penetrate.... because almost no one has one, so they don't know that they don't know the difference.

On the other hand you have the problem of a lack of consistency between games... on some you notice the difference between settings, and on some everything above medium looks exactly the same to you. People just don't have a frame of reference to grab onto unless they actually have the high-end hardware to even notice. This is especially true, of course, for high refresh rates.... which, as above, almost no one has except in high-wealth bubbles, though that seems to be changing slowly as they are now increasingly common.