News RTX 5060 Ti and RTX 5060 may arrive in March to steal AMD's spotlight — Chaintech hints at higher Average Selling Prices

So his video was pretty faulty, largely because he was looking at the 30 to 40 series jump and assuming that would happen again.
I can't tell if your post is in bad faith, if you couldn't be bothered to watch the video, or if you just don't understand. Quite frankly it doesn't matter which, because the end result is the same: you're spreading misinformation.

First of all, aside from the 3090 to 4090, the 40 series was a bad value and generally a poor upgrade over the 30 series.

Secondly, he compares seven generations' worth of cards and isn't just comparing flagships.

Thirdly, the 40 series is the first time Nvidia has scaled price/perf off of the flagship part. They got away with it, so it's absolutely expected that they would do the same thing with the 50 series.

Lastly, consumers might be getting the same price/perf as the 40 series, which, as I already pointed out, was the first shift to this scaling-off-the-flagship model. That means consumers are not getting what used to be the case, but rather getting screwed two generations in a row.
 
Plain and simple:
In The Old Days, hardware wasn't powerful, so programmers had to be more careful with how games were coded.
Back in the days when "home computer" meant a machine with a 68000 CPU, you had to count how many CPU cycles you had left over from any given set of instructions. Game code in those days was almost exclusively written in assembly and, by necessity, tighter than a duck's ass. In this respect, modern PCs are much more forgiving (and I don't think you can cycle-count an out-of-order CPU with anything like the accuracy you could a trusty old Motorola).

Anyway, 8 GiB is acceptable in an RX 480, not so in the latest and greatest Nvidia whizz bang wunderwaffe.
 
I will never understand what all the fuss is about with 8GB cards. 8GB is still plenty to run 99.99999999999% of games at max settings at 1080p. There are very, very few games that need more. The 60 and 60 Ti cards are supposed to be budget 1080p cards, not 4K240 cards. 8GB is still lots, even for 2025, at the 60 tier for 1080p gaming.
You do realize most new console ports will allocate well over 8GB at 1080p max, right?
 
With the absence of xx50 cards, and 90-class cards north of $2,000, I think we can call $350-$400 cards budget cards. That's the point we're at now. Lower cards, like the $249 Intel GPUs, I would call entry level, a step below budget. It's sad that $400 is considered "budget" now, but that's kind of how it is. I can remember in 2003, when I built my first PC, the 5090 of that era was $450. Unfortunately, times have changed, and when you look at GPU prices now, $350-$400 is "budget", as sad as it is to say. It shouldn't be that way, but if we're honest, it really is. In the face of $2,000-$3,000 32GB cards, 8GB at $400 is what I would expect, and still plenty for 99.9999999% of games at 1080p.
You drank ALL the Nvidia Kool-Aid, didn't you? Retailers still sell GT 1030 cards as "Fartnite capable". The BOM on a 5060 is WAY under $100. The BOM on a 5090 is probably under $500.
 
You do realize most new console ports will allocate well over 8GB at 1080p max, right?

Short answer: no.


Long answer:

Games no longer "allocate" anything; that entire task has been moved over to the OS and its graphics pipeline. Modern OSes pool both system RAM and graphics card RAM into a single pool of memory called display memory. Any resource or buffer can be located in either memory space at any point in time. The display manager then attempts to predict which resources are needed immediately and moves those into the graphics card's RAM; if that RAM is full, the least recently used resource is evicted out to system memory. What this does is turn the graphics card's RAM into a massive L3 cache.
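
To make the eviction idea concrete, here is a toy sketch, not the actual WDDM code, and every name in it (VramCache, Resource, touch, evict_oldest) is made up for illustration. It just treats VRAM as a fixed-capacity, least-recently-used cache sitting in front of system memory, which is roughly the behaviour described above:

```cpp
// Toy model of OS-managed display memory: VRAM as an LRU cache over system RAM.
// All names and sizes are invented for illustration; this is not the real driver logic.
#include <cstdint>
#include <iostream>
#include <list>
#include <string>
#include <unordered_map>

struct Resource {
    std::string name;
    uint64_t    bytes;
};

class VramCache {
public:
    explicit VramCache(uint64_t capacity) : capacity_(capacity) {}

    // The "display manager" touches a resource just before the GPU needs it.
    // If it isn't resident, evict least-recently-used resources to system memory
    // until there is room, then bring it into VRAM.
    void touch(const Resource& r) {
        if (auto it = index_.find(r.name); it != index_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second);   // already resident: mark as recently used
            return;
        }
        while (used_ + r.bytes > capacity_ && !lru_.empty())
            evict_oldest();
        lru_.push_front(r);
        index_[r.name] = lru_.begin();
        used_ += r.bytes;
        std::cout << r.name << " promoted to VRAM (" << used_ << "/" << capacity_ << " bytes used)\n";
    }

private:
    void evict_oldest() {
        const Resource victim = lru_.back();               // back of the list = least recently used
        lru_.pop_back();
        index_.erase(victim.name);
        used_ -= victim.bytes;
        std::cout << victim.name << " evicted to system memory\n";
    }

    uint64_t capacity_;
    uint64_t used_ = 0;
    std::list<Resource> lru_;                              // front = most recently used
    std::unordered_map<std::string, std::list<Resource>::iterator> index_;
};

int main() {
    VramCache vram(8ull << 30);                            // pretend the card has 8 GiB
    vram.touch({"level_textures", 6ull << 30});
    vram.touch({"shadow_maps",    1ull << 30});
    vram.touch({"cinematic_pack", 3ull << 30});            // forces the oldest resource out
}
```

The point of the model: a game can reference more resources than fit in VRAM, and the ones not needed right now simply migrate out to system RAM until they are touched again.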

Anyone who trusts the "Memory used" reading in GPU-Z is still living in the Windows XP / DX9 era.
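
For what it's worth, the modern way for an application to see what it is actually getting is the WDDM 2.0 budget API rather than a single global "memory used" counter. Here is a minimal Windows-only sketch using IDXGIAdapter3::QueryVideoMemoryInfo; as I understand it, this reports the calling process's own budget and current usage for the local (VRAM) and non-local (system RAM) segments, so the numbers are per process, not for the whole machine:

```cpp
// Windows-only sketch: query this process's video memory budget and usage per segment.
// Requires Windows 10 / WDDM 2.0. Link against dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;        // adapter predates WDDM 2.0

        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);

        // "Local" = the card's own VRAM, "non-local" = resources spilled to system RAM.
        DXGI_QUERY_VIDEO_MEMORY_INFO local{}, nonLocal{};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

        wprintf(L"%s\n  local (VRAM):       budget %llu MiB, this process uses %llu MiB\n"
                L"  non-local (sys RAM): budget %llu MiB, this process uses %llu MiB\n",
                desc.Description,
                local.Budget >> 20, local.CurrentUsage >> 20,
                nonLocal.Budget >> 20, nonLocal.CurrentUsage >> 20);
    }
    return 0;
}
```

Note that "Budget" is how much the OS is currently willing to let this process keep resident, which can shrink when other applications need VRAM; it is not the size of the chip on the card.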

The PS5, for example, has 16GB of GDDR6 on a 256-bit bus that games use for both code/data and graphics.