News GPUs with 8GB of VRAM in 2025 are 'like bringing a butter knife to a gunfight' reckons Grok AI

Honestly I think 8GB GPUs above the entry level (--60 laptop and --50 desktop) shouldn't exist. Given the power of even entry-level gaming desktop cards like the --60, 8GB cards only exist to upsell a $100+ more expensive 16GB variant.
 
What if you're completely wrong?

https://www.tomshardware.com/pc-com...-due-to-lack-of-vram-and-not-just-at-4k-ultra

And this is on top of the 10%+ performance penalty you suffer if you dare to use PCIe 4.0.

https://www.tomshardware.com/pc-com...-when-using-pcie-4-0#xenforo-comments-3878849
Easily over 60 FPS is "struggling" now? No. It was over 60 FPS in everything at 1080p and was more than serviceable at 1440p; it struggled and wasn't usable in most things at 4K. There's not much the 16GB model can comfortably play at 1080p or 1440p that the 8GB can't, outside of unoptimised edge cases at maxed-out detail settings. For a 1080p card like the 60 series, that's fine. For a 70-class card aimed at 1440p it would be an issue, but that's not what we're talking about.
 
Reactions: King_V and ivan_vy
I think the game devs are getting paid by AMD and Nvidia to obsolete 8GB cards.
It is naturally in game devs' interest to appeal to the largest market reasonably possible, and in the GPU manufacturers' interest to cut off some models so they sell more. The two are at odds there, and some convincing seems plausible.
The games that need more than 8GB don't look better than the ones that don't, so exceeding 8GB doesn't seem technically necessary, just a limitation imposed by substandard optimization or by intent.
 
I think the game devs are getting paid by AMD and Nvidia to obsolete 8GB cards.
Yeah, they've only been adding more than 8GB to cards since... the GTX 1080 Ti from 2017.
16GB of VRAM for games solidified in 2020 when the PS5 and Xbox Series X came out with that configuration.
It's 2025 right now. There really isn't any excuse for being stuck in 2017.
 
Reactions: Amdlova
I just gave my niece's son a
i5-6600
16GB 3000MHz RAM
GTX 1070 SC
500GB SSD
1000GB HDD
24" 1080p Samsung monitor
It plays all of his games at max/near-max settings.
Nvidia uses a more efficient compression technology for textures etc., so it can do more with less RAM.
These are mid-level budget cards.
8GB is still fine for this level of card.
PERSPECTIVE is in our own mind.
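On the compression point above: both vendors lean heavily on block-compressed texture formats (BC1 through BC7), and a quick back-of-the-envelope shows that format choice dominates texture footprint far more than any vendor-specific trick. A minimal sketch, using standard BCn figures (roughly 1 byte per texel for BC7) rather than anything from this thread:

```python
# Rough VRAM footprint of a single 2D texture, with and without
# block compression. BCn formats store fixed-size 4x4 texel blocks,
# working out to about 1 byte per texel for BC7.

def texture_bytes(width, height, bytes_per_texel, mip_chain=True):
    """Approximate texture size; a full mip chain adds roughly 1/3."""
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mip_chain else base

# A 4096x4096 RGBA8 texture: 4 bytes per texel, uncompressed.
uncompressed = texture_bytes(4096, 4096, 4)
# The same texture in a BC7-style format: ~1 byte per texel.
compressed = texture_bytes(4096, 4096, 1)

print(uncompressed // 2**20, "MiB uncompressed")      # 85 MiB
print(compressed // 2**20, "MiB block-compressed")    # 21 MiB
```

A few dozen uncompressed 4K textures alone would swamp an 8GB card, which is why every engine ships block-compressed assets regardless of GPU vendor.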
 
Non-story. It's just regurgitating what outlets like this one have published.

1080p is going to be achievable with 8 GB in many games. You just have to forget about texture quality going forward, and some new games won't run. And you shouldn't be paying over $300 for it. It should be in the entry-level, as in anything touching the $100-200 range.
 
Yeah, they've only been adding more than 8GB to cards since... the GTX 1080 Ti from 2017.
16GB of VRAM for games solidified in 2020 when the PS5 and Xbox Series X came out with that configuration.
It's 2025 right now. There really isn't any excuse for being stuck in 2017.
The Xbox and PS5 don't have 16GB of VRAM. They have 16GB of unified memory, 3.5GB of which isn't available to devs, so even if the CPU side of a game used only 4GB, the PS5 GPU has 8.5GB available. Realistically I'd expect the 12.5GB to be split pretty much evenly, around 6/6, on the base PS5. The Pro gives devs just shy of 14GB to play with, and there I'd expect a 6/8 split in favour of the GPU.
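The budget arithmetic in the post above can be sketched numerically; all figures here are the poster's assumptions about the PS5's memory reservation and CPU/GPU split, not official Sony numbers:

```python
# Sketch of the console memory-budget arithmetic (poster's assumptions).

TOTAL_UNIFIED_GB = 16.0   # PS5 unified GDDR6 pool
OS_RESERVED_GB = 3.5      # slice assumed unavailable to developers

# Total left for the game, shared between CPU-side and GPU-side data.
available = TOTAL_UNIFIED_GB - OS_RESERVED_GB        # 12.5 GB

# If the CPU side of a game takes 4 GB, the GPU-usable remainder:
gpu_budget_if_cpu_heavy = available - 4.0            # 8.5 GB

# An even 6/6-ish split of the 12.5 GB, as guessed above:
even_split_gpu = available / 2                       # 6.25 GB

print(f"{available} GB to devs, ~{even_split_gpu} GB of that for the GPU")
```

Either way, the effective "VRAM" of a console lands well under the headline 16GB figure, which is the poster's point.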
 
Reactions: rluker5
Non-story. It's just regurgitating what outlets like this one have published.

1080p is going to be achievable with 8 GB in many games. You just have to forget about texture quality going forward, and some new games won't run. And you shouldn't be paying over $300 for it. It should be in the entry-level, as in anything touching the $100-200 range.
Yeah, nothing is going under $500 once these tariffs come in to the US. It just depends on whether the rest of the world gets a hike too.
 
Yeah, they've only been adding more than 8GB to cards since... the GTX 1080 Ti from 2017.
16GB of VRAM for games solidified in 2020 when the PS5 and Xbox Series X came out with that configuration.
It's 2025 right now. There really isn't any excuse for being stuck in 2017.
This is what the market consists of in 2025: https://store.steampowered.com/hwsurvey/videocard/
In what world are most of the cards at the top of that list above 8GB?

What motivation do game devs have to abandon good performance in the majority of their customer base? Why would they not have reasonable reduced settings available to accommodate what they know to be a significant number of potential customers? Without there being some additional influence it doesn't make sense.

I know you can buy a card that has more than 8GB, but the masses have not. You have to deal with the world you live in, not some fictional utopia.
 
Reactions: King_V
The rhetoric is that you should buy for older games or esports, so why are people buying newer cards when their old ones could already play those without issue?

Notice how many posters here make no logical sense?

All of them own 8GB cards and can't see the elephant in the room.

GG.
 
In this case, Grok seems to be only caring about those who play the latest AAA titles at maximum settings.

Most people don't necessarily feel the need to do that.

I think I can live quite comfortably with "merely" high settings, and not necessarily playing the most demanding games. I have a small collection of games on Steam compared to most, and I've still got a backlog of titles, some of which are lighter games, and some of which were demanding by the standards of a few years ago.

Grok's statement, for whatever weight can be given to an AI's "opinion," is catering to that impractical fraction of gamers who insist on the latest and greatest at all times.
 
Reactions: ivan_vy and rluker5
If AMD is wise they will rebrand the 8GB model to the 9050, or something other than a 9060
I agree -- they should have made 192-bit memory-bus models with 12 GB of GDDR6 the 9060 series (and not exclusively the 9070 GRE), and 128-bit models the 9050 series, to really throw some punches at Nvidia (besides the 9070 series).

Moreover, 8 GB of VRAM for 1080p isn't the worst thing in the world -- it's just a pricing problem, as others astutely mentioned. $150-$250 is acceptable for 8 GB, but anything over $249 USD should have 12 GB. Again -- and surprisingly so -- Intel already changed the status quo on this with Battlemage.

So funny to me to see some still defending high-cost 8 GB gaming GPUs in mid-2025 as acceptable. I'm not sold on "it's an edge case problem" and "devs are too lazy to optimize." Indies and small game studios shouldn't have this burden. AAAs, yes, they can and should optimize the dickens out of their games, as they generally do. And again, if it's a problem today, it's only going to get worse in the coming months and years.

Let's quit blaming devs and expect more from some of the largest and most profitable companies in the world.