> So his video was pretty faulty, largely because he was looking at the 30 to 40 series jump and assuming that would happen again.

I can't tell if your post is in bad faith, if you couldn't be bothered to watch the video, or if you just don't understand. Frankly, it doesn't matter which, because the end result is the same: you're spreading misinformation.
> Plain and simple: In The Old Days, hardware wasn't powerful, so programmers had to be more careful with how games were coded.

Back in the days when "home computer" meant a machine with a 68000 CPU, you had to count how many CPU cycles you had left over from any given set of instructions. Game code in those days was almost exclusively written in assembly and, by necessity, tighter than a duck's ass. In this respect, modern PCs are much more forgiving (and I don't think you can cycle-count an out-of-order CPU with anything like the accuracy you can a trusty old Motorola).
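For the curious, here's a toy sketch of that kind of cycle budgeting in Python. The per-instruction timings are approximate 68000 figures that vary by addressing mode, so treat every number here as an illustrative assumption:

```python
# Toy 68000 cycle-budget check: does this copy loop fit in one PAL frame?
# Timings are approximate and vary by addressing mode -- assumptions only.
TIMINGS = {
    "move.w (a0)+,d0": 8,    # load word, post-increment
    "add.w d1,d0": 4,        # register-to-register add
    "move.w d0,(a1)+": 8,    # store word, post-increment
    "dbra d7,loop": 10,      # decrement-and-branch, taken
}

ITERATIONS = 320                      # words processed per frame
loop_cycles = sum(TIMINGS.values()) * ITERATIONS

FRAME_BUDGET = 7_093_790 // 50        # ~7.09 MHz PAL clock / 50 fps
print(f"loop: {loop_cycles} cycles, budget: {FRAME_BUDGET} cycles")
print("fits" if loop_cycles < FRAME_BUDGET else "too slow")
```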
> "may arrive" is a pretty accurate statement. Maybe before that they can actually produce and distribute cards. 🤔

You would think AMD's and Nvidia's GPUs were handmade when you look at their general (lack of) availability. I suspect we are getting the data-center rejects.
> I will never understand what all the fuss is about 8GB cards. 8GB is still plenty to run 99.99999999999% of games at max settings at 1080p. There are very, very few games that need more. The 60 and 60 Ti cards are supposed to be budget 1080p cards, not 4K240 cards. 8GB is still lots, even for 2025, at the 60 tier for 1080p gaming.

You do realize most new console ports will allocate well over 8GB at 1080p max, right?
> You do realize most new console ports will allocate well over 8GB at 1080p max, right?

Allocating (V)RAM =/= needing VRAM. Games will often allocate more than they need to run. I think the point is that, in 2025, GPUs marketed at gamers should include more VRAM than GPUs released in 2016.
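For what it's worth, the number most monitoring tools report is that allocated figure, not what a game actually touches each frame. A minimal sketch with the NVML Python bindings (assuming the pynvml package and an NVIDIA card) reads exactly that kind of number:

```python
# Read the *allocated* VRAM figure via NVML.
# Assumes pynvml (pip install nvidia-ml-py) and an NVIDIA GPU.
# "used" is memory reserved by processes, not the working set a
# game actually touches every frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total VRAM : {info.total / 2**30:.1f} GiB")
print(f"allocated  : {info.used / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```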
> With the absence of XX50 cards and 90-class cards north of $2000, I think we can call $350-$400 cards "budget" now. Lower cards, like the $249 Intel GPUs, I would call entry level, a step below budget. It's sad that $400 is considered "budget", but that's kind of how it is. I can remember in 2003, when I built my first PC, the 5090 of that era was $450. Times have changed; in the face of $2,000-$3,000 32GB cards, 8GB at $400 is what I would expect, and still plenty for 99.9999999% of games at 1080p.

You drank ALL the Nvidia Kool-Aid, didn't you? Retailers still sell GT 1030 cards as "Fartnite capable". The BOM on a 5060 is WAY under $100. The BOM on a 5090 is probably under $500.
> The BOM on a 5090 is probably under $500.

Absolutely not. The GPU alone, by the most optimistic calculations, is around $280, and 32GB of GDDR7 is a bit north of $300.
> The BOM on a 5060 is WAY under $100. The BOM on a 5090 is probably under $500.

Source?
> I will never understand what all the fuss is about 8GB cards. 8GB is still plenty to run 99.99999999999% of games at max settings at 1080p. There are very, very few games that need more. The 60 and 60 Ti cards are supposed to be budget 1080p cards, not 4K240 cards. 8GB is still lots, even for 2025, at the 60 tier for 1080p gaming.

Well, if you want to use ray tracing, or you are playing a bad port, you need more than 8GB, guaranteed.
> Absolutely not. The GPU alone, by the most optimistic calculations, is around $280, and 32GB of GDDR7 is a bit north of $300.

If you think the PCBs add another $500 to the GPU and memory, you've clearly never ordered PCBs in bulk. I'm referring to the GPU + memory kits Nvidia sells AIBs.
And that is even before we start talking about the quite complex PCB and cooling. Your average 5090 BOM is probably pushing $1k, before R&D and other production expenses.
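Napkin math, using the figures from this thread plus placeholder assumptions for the parts nobody here has priced; everything marked as an assumption is exactly that:

```python
# Back-of-the-envelope 5090 BOM in USD. The die and memory figures
# come from this thread; the rest are rough placeholder assumptions
# for illustration, not sourced numbers.
bom_usd = {
    "GPU die (optimistic)": 280,      # from this thread
    "32GB GDDR7": 300,                # from this thread, "a bit north of $300"
    "PCB (bulk)": 40,                 # assumption
    "VRM + passives": 80,             # assumption
    "cooler + fans": 100,             # assumption
    "connectors, bracket, misc": 30,  # assumption
}

total = sum(bom_usd.values())
for part, cost in bom_usd.items():
    print(f"{part:26s} ${cost}")
print(f"{'TOTAL':26s} ${total}")   # well north of $500 either way
```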
> I will never understand what all the fuss is about 8GB cards. 8GB is still plenty to run 99.99999999999% of games at max settings at 1080p. There are very, very few games that need more. The 60 and 60 Ti cards are supposed to be budget 1080p cards, not 4K240 cards. 8GB is still lots, even for 2025, at the 60 tier for 1080p gaming.

If you actually talk to game developers, many will say VRAM limitations are holding back graphics capabilities: they have to spend more time optimizing a game to make sure it stays within that 8GB envelope at 1080p max settings rather than spending that time on the game itself. Moreover, the tech community is rightly upset about a $400 GPU having only 8GB of VRAM.
Even at 4K, with a benchmark suite updated for the 5090 review, the difference between the 16GB and 8GB 4060 Ti was only about 6%. The difference at 1440p was 0.6 fps.

I can think of 3 reasons:
1) Price. nVidia is charging more than ever for entry-level gaming cards, yet they're using VRAM size for product segmentation. There have been articles on TH in the past couple of years about modders upping the VRAM on these cards and showing a not-insignificant gain, so the performance of these cards is being artificially gimped by nVidia.
2) More and more games are using near or above 8GB of VRAM at 1920x1080, especially when ray tracing is enabled. Techspot did a decent article about it last year; I suggest you give it a read.
3) DLSS frame generation. Those extra frames need memory to live in, and as the chart borrowed from that Techspot article demonstrates, they can easily push VRAM usage over 8GB at 1920x1080. Imagine what memory usage will be once DLSS 4's additional generated frames add to that (rough buffer math below the chart).
[Chart from the Techspot article: VRAM usage at 1920x1080 with frame generation enabled]
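To put rough numbers on reason 3 (my own arithmetic, not figures from the Techspot article): each raw buffer at 1080p is individually small; the measured overhead is far larger because frame generation also keeps optical-flow, history, and model buffers resident.

```python
# Per-buffer VRAM arithmetic at 1920x1080 (illustrative assumptions,
# not figures from the Techspot article). Frame generation keeps many
# such buffers resident at once, plus model weights, which is why the
# measured overhead is far larger than one output frame.
W, H = 1920, 1080

def mib(bytes_per_pixel: float) -> float:
    """Size of one W x H buffer in MiB."""
    return W * H * bytes_per_pixel / 2**20

print(f"RGBA8 output frame    : {mib(4):5.1f} MiB")
print(f"FP16 HDR color buffer : {mib(8):5.1f} MiB")
print(f"RG16F motion vectors  : {mib(4):5.1f} MiB")
# Multiply by history length and the generated-frame queue and it
# adds up quickly -- before counting the game's own assets.
```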
> It's sad that $400 is considered "budget" now, but that's kind of how it is. I can remember in 2003, when I built my first PC, the 5090 of that era was $450 at that time.

In 2003, the top-end 5950 Ultra had an MSRP of $500. That's about $850 today: not $2000, but not cheap by most people's standards either. That flagship GPU was only 207 mm²; by comparison, the upcoming $550 5070 is a 263 mm² die, and $550 today is the equivalent of about $320 in 2003. The 5090, at 750 mm², is nearly four times the size of the 5950 Ultra. Comparing prices with cards that old is practically meaningless; the products have evolved so much that there is very little basis for comparison.
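The inflation conversions above check out with a simple CPI ratio; the cumulative factor of roughly 1.7 between 2003 and today is my approximation, so treat the outputs as round numbers:

```python
# CPI sanity check for the 2003 <-> today conversions above.
# The ~1.7 cumulative factor is approximate (CPI ~184 in 2003 vs
# ~314 in 2024), so results are round numbers at best.
CPI_FACTOR = 1.7

def to_today(usd_2003: float) -> float:
    return usd_2003 * CPI_FACTOR

def to_2003(usd_today: float) -> float:
    return usd_today / CPI_FACTOR

print(f"$500 in 2003 is about ${to_today(500):.0f} today")   # ~$850
print(f"$550 today was about ${to_2003(550):.0f} in 2003")   # ~$320
```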
> If you think the PCBs add another $500 to the GPU and memory, you've clearly never ordered PCBs in bulk. I'm referring to the GPU + memory kits Nvidia sells AIBs.

If you were referring to just that, don't use the word BOM; I can't read your mind. The chipset/SoC/GPU are also part of a BOM.
Yep, and normally at least 10GB is always allocated to the GPU in consoles, while the CPU just makes do with whatever's left.

Short answer: no.
Long answer:
Games no longer "allocate" anything; that entire task has been moved over to the OS and its graphics pipeline. Modern OSes pool both system RAM and graphics card RAM into a single pool of memory called display memory. Any resource or buffer can be located in either memory space at any point in time. The display manager attempts to predict which resources are needed immediately and moves those into the graphics card's RAM; if that RAM is full, the oldest resource is evicted out to system memory. In effect, this turns graphics card RAM into a massive L3 cache.
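A toy simulation of that eviction behavior (my sketch; real display managers use priorities and usage prediction rather than the strict oldest-first policy assumed here):

```python
# Toy model of VRAM-as-cache residency. Illustrative only: real
# display managers use priorities and prediction, not strict
# oldest-first eviction.
from collections import OrderedDict

class VramCache:
    def __init__(self, capacity_mb: int):
        self.capacity = capacity_mb
        self.resident = OrderedDict()   # name -> size in MB, oldest first
        self.in_system_ram = {}

    def touch(self, name: str, size_mb: int) -> None:
        """Make a resource resident in VRAM, evicting the oldest as needed."""
        if name in self.resident:
            self.resident.move_to_end(name)        # mark recently used
            return
        self.in_system_ram.pop(name, None)
        while sum(self.resident.values()) + size_mb > self.capacity:
            old, old_size = self.resident.popitem(last=False)
            self.in_system_ram[old] = old_size     # evict to system RAM
            print(f"evicted {old} ({old_size} MB) to system RAM")
        self.resident[name] = size_mb

vram = VramCache(capacity_mb=8192)       # an 8GB card
vram.touch("texture_pool_A", 5000)
vram.touch("shadow_maps", 2000)
vram.touch("texture_pool_B", 4000)       # forces texture_pool_A out
```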
Anyone still trusting the "Memory used" readout in GPU-Z is living in the Windows XP / DX9 era.
The PS5, for example, has 16GB of GDDR6 on a 256-bit bus that games use for both code/data and graphics.
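Quick math on that bus, assuming the commonly cited 14 Gbps GDDR6 modules:

```python
# PS5 memory bandwidth = bus width x per-pin data rate.
# Assumes the commonly cited 14 Gbps GDDR6; figures are approximate.
bus_width_bits = 256
data_rate_gbps = 14                     # per pin

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"~{bandwidth_gb_s:.0f} GB/s, shared by CPU and GPU")   # ~448 GB/s
```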
> If you were referring to just that, don't use the word BOM; I can't read your mind. The chipset/SoC/GPU are also part of a BOM.

You're right, I should've been much clearer. But a company like Gigabyte pays hardly anything for bulk-ordered printed PCBs. Even 14-layer is cheap in bulk.
> Yep, and normally at least 10GB is always allocated to the GPU in consoles, while the CPU just makes do with whatever's left.

There is zero "allocated" to the GPU. That's the great part about linear memory addressing with a unified memory architecture: there is no need for it.
> I understand what you're saying. I know that Hellblade, Hogwarts Legacy, IJATGC, Avatar, and a few others will push north of 8GB when maxed out. But what I'm saying is, for the vast majority of games, 8GB will work. There are 90,000 games on Steam; I bet no more than 10 of them need over 8GB to run at 60 FPS on at least HIGH settings (excluding ray tracing), which is still very playable. That's a tiny fraction, and the masses are who the XX60 and XX60 Ti cards appeal to. I think Nvidia is just saying: for the people who want to play those 10 games, go get an XX70 or XX70 Ti; for everyone else, the 60 and 60 Ti are enough. A 60 or 60 Ti with 12GB or more is pointless, because if a game is demanding enough to use 12GB, the core won't be able to produce the frames even though the texture memory pool is sufficient. The 60 tier is geared toward what suffices for most people: 1080p high for $350 instead of $2000.

Something nobody here will want to hear, I'm sure.
I apologize if I sound like I'm defending NGreedia's price gouging; that is NOT what I'm trying to do. I'm trying to give an objective, neutral, impartial, honest, realistic, real-world TECHNICAL take on the matter. In the vast majority of common, real-world use cases, with a few small exceptions, 8GB is still enough.