RTX 5060 Ti and RTX 5060 may arrive in March to steal AMD's spotlight — Chaintech hints at higher Average Selling Prices

I will never understand what all the fuss is about 8GB cards. 8GB is still plenty to run 99.99999999999% of games at max settings at 1080p. There are very, very few games that need more. The 60 and 60 Ti cards are supposed to be budget 1080p cards, not 4K240 cards. 8GB is still lots, even for 2025, at the 60 tier for 1080p gaming.
 
This is true about well-optimized games. Unfortunately, a lot of new ports are straight dog water when it comes to texture streaming/compression, so they tear through VRAM.

On the other hand, in no way, shape, or form are the 60 and especially the 60 Ti cards supposed to be "budget" options. Historically, they have always been the mainstream gaming GPU option (I would go back as far as the GeForce 6600 GT).
The fact that Nvidia can't be bothered to release 50 and 50 Ti cards anymore doesn't change the fact that a $350-400 card cannot be considered budget. The 60 series are generally the mainstream gamer cards, the 70 series is for enthusiasts, and the 80 series is for high-end gaming. The 90 series or Titan were always "halo" products, only for the crazies or professionals.
 
I think with the absence of XX50 cards, and with 90-class cards north of $2,000, we can call $350-$400 cards budget. That's the point we're at now. Lower cards, like the $249 Intel GPUs, I would call entry level, a step below budget. It's sad that $400 is considered "budget" now, but that's kind of how it is. I can remember in 2003, when I built my first PC, the 5090 of that era was $450. Times have changed, and when you look at GPU prices now, $350-400 is "budget," as sad as that is to say. It shouldn't be that way, but if we're honest, it really is. In the face of $2,000-3,000 32GB cards, 8GB at $400 is what I would expect, and still plenty for 99.9999999% of games at 1080p.
 
Jensen is all too pleased to have you call $400 budget. He would also have you believe that his 8GB of VRAM cuts it because you are only doing 1080p. Too bad the $250 12GB cards he'd call entry level are now doing 1440p (3440x1440 in my case).
 
8GB is not enough with how games are currently being designed. It's one of those corners being cut, which means it should only appear in the budget space. I don't think anyone rational would complain about getting 8GB of VRAM on a $200 card. That isn't how pricing currently works, though, since the 4060 is a $300 card (the 4060 Ti $400), and there's no reason to believe a 5060 will cost less even with 8GB of VRAM.
I think with the absence of XX50 cards, and 90 cards north of $2000, we can call $350-$400 budget cards.
You really shouldn't approach it this way, because what you're saying is: hey, giant company, fleece money from me because you've raised the prices. Most of the tech space has just gone along with this, and it doesn't make sense.

The $300 price point has never really been a budget offering, but rather the lower end of midrange. We saw this shift with the 30 series because the big die was where all the money was, and AMD has been happy to play Nvidia's pricing games since they launched the first RDNA-based card.

What's actually going on is that Nvidia and AMD have no interest in providing value for the consumer, because they've proven they don't have to. This is simply to have high margins on every GPU product being sold. $300-400 isn't a budget price range, but it certainly is low end performance-wise.

Tim from HUB did a 40 series video regarding what the market is actually getting and just did one for the 50 series products that we have details on. It should come as no surprise that every performance tier is lower than it really should be and consumers are getting less for their money:
https://youtu.be/J72Gfh5mfTk?si=KrST-B6tTsv5t73W
 
I will never understand what all the fuss is about 8GB cards.

I can think of 3 reasons:

1) Price. Nvidia is charging more than ever for entry-level gaming cards, yet it's using VRAM size for product segmentation. There have been articles on TH in the past couple of years about modders who upped the VRAM on these cards and showed a not-insignificant gain, so the performance of these cards is being artificially gimped by Nvidia.

2) More and more games are using near or above 8GB of VRAM at 1920x1080, especially when ray tracing is enabled. Techspot did a decent article about it last year; I suggest you give it a read.

3) DLSS frame generation. Those extra frames have to have memory to fit in, and I'll borrow a chart from that Techspot article to demonstrate: those extra frames can easily push VRAM usage over 8GB at 1920x1080, so imagine what memory usage will be once DLSS 4's additional frames add to that.

(Chart from the Techspot article showing VRAM usage with frame generation enabled.)
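To put a rough number on point 3, here is a back-of-the-envelope sketch of what a handful of extra full-resolution buffers cost. The buffer count and pixel format below are assumptions for illustration, not measurements from the Techspot chart, and the real overhead of frame generation (model state, history frames, etc.) is game-dependent and can be considerably larger.

```python
# Rough sketch: memory cost of extra full-resolution buffers.
# Buffer count and pixel format are illustrative assumptions, not measurements.

def buffer_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    """One uncompressed full-resolution buffer, in MiB."""
    return width * height * bytes_per_pixel / 2**20

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
extra_buffers = 6   # assumed: previous/generated colour frames, motion vectors, depth, ...
bytes_per_pixel = 8  # assumed: 16-bit float RGBA

for name, (w, h) in resolutions.items():
    total = extra_buffers * buffer_mib(w, h, bytes_per_pixel)
    print(f"{name}: ~{total:.0f} MiB just for the raw extra buffers")
```

The raw buffers are modest on their own; the point is that they stack on top of textures and everything else already competing for the same 8GB.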
 
99.999% of games means OLD games, and a GTX 1060, an OLD card, will run them fairly well, yeah. 8GB is not enough for gaming in 2025, that's what it is. The 90 series became the 4K tier, the 80 series the 2K tier, and the 70 series the 1080p tier, that's what it is. We all know, by the way, that the 4060 was really the 4050, the 4070 was really the 4060, and so on. Now we have an 80 series card costing two grand, a 70 series costing one grand, and a 60 series costing half a grand; that's reality. What happened is that starting with the 4000 series, Nvidia changed its naming scheme and didn't upgrade cards but rebranded them. The 3080 12GB became the 4070 12GB, and now it will be called the 5070; it's really a 3080 12GB. In all fairness, though, the 3080's MSRP at launch was $700, it came down to $600 when rebranded as the 4070, and now it has almost hit $500 as the 5070. If the 5060 Ti had that power with 16GB, it wouldn't be a bad deal either, because it would be way more useful at 4K... but then how would Nvidia sell the 4070 Ti? So it's nerfed once again. Yeah, games didn't evolve for the last 4 years and won't until the PS6 comes; that's when GPUs will finally move up a tier above the last gen (3000 series).
 
8GB is still plenty to run 99.99999999999% of games at max settings at 1080p.
Old games? Yes, but are you spending $450+ for older games and not for the upcoming games over the next few years that WILL use more?
Games like Monster Hunter Wilds want over 6GB at 720p with everything on low.
It goes up from there, more so if you turn on frame gen (which uses more VRAM, and the game wants you to use it to hit 60fps at 1080p upscaled).

And that's JUST the games... your VRAM is used by other stuff in the background.

GDDR6 is around $2 per GB... so going from 8GB to 16GB is less than $20 more to do... for a product that is literally $400+.

Nvidia has just gotten so far ahead that they treat the low end like crap, because it makes people buy a more expensive card with more VRAM.

And that's ignoring how badly they gimp the memory bus of the 4060 (which you will likely also see on the 5060).

AMD's raster and VRAM are better than Nvidia's, and the only reason they aren't the best is that they are much worse the moment anything needs ray tracing... if AMD's RT can get to around the 4070's level, they have nothing to fear from Nvidia at the low end.
 
I will never understand what all the fus is about 8GB cards? 8GB is still plenty to run 99.99999999999% of games at max settings at 1080p. There are very, very few games that need more. The 60 and 60Ti cards are supposed to be budget 1080p cards, not 4K240 cards. 8GB is still lots, even for 2025, at the 60 tier for 1080p gaming.
Plain and simple:
In The Old Days, hardware wasn't powerful, so programmers had to be more careful with how games were coded. In the modern era, even a sub-$500 machine is more powerful than a $2,000 machine from a decade and a half ago, with a lot more memory and faster (and likely more) storage, so programmers don't have to worry about system constraints as much. As a result, it's easy for programs to balloon in size, and that goes for VRAM requirements as well, because you don't have to find the best way to make a character appear on screen the way you want, you just need something that works. Apply that logic to game coding in general and you have Call of Duty taking 200+ GB to install and Indiana Jones requiring ray tracing... Also, even cheap monitors are getting harder to drive to their full potential, and add in the number of people who run more than one display at a time (usually the second one showing something like Discord), and 8GB just isn't that much anymore.
So if it's an older game (I love me some Titan Quest!) being run on a single display, 8GB is plenty. But you can get 4K displays under $300, some basic ones under $200 even, and 8GB is starting to just not be enough anymore. And we've been told for too long now to make do unless we want to overspend. There's the entire 3050/3060 debacle too.

That's my two Sarpadian coppers
Mindstab Thrull
 
More and more games are using near or above 8GB of VRAM at 1920x1080, especially when ray tracing is enabled. Techspot did a decent article about it last year; I suggest you give it a read.

I understand what you're saying. I know that Hellblade, Hogwarts Legacy, IJATGC, Avatar, and a few others will push north of 8GB when maxed out. But what I'm saying is, for the vast majority of games, 8GB will work. What I mean is this: there are 90,000 games on Steam. I bet there are no more than 10 that require >8GB for gaming at HIGH settings or above (excluding ray tracing) at 60FPS, which is still very playable. That's such a tiny fraction. These are the masses that XX60 and XX60 Ti cards appeal to. And I think Nvidia is just saying: for the people who want to play those 10 games, go get an XX70 or XX70 Ti. For everyone else, the 60 and 60 Ti are enough. I think a 60 or 60 Ti with >=12GB is pointless, because if a game is demanding enough to use 12GB, the core won't be able to produce the frames even though the texture memory pool is sufficient. I think the 60 is geared towards what would suffice for most people: 1080p high for only $350 instead of $2,000, lol. I think that's what they are getting at.

I apologize if I sound like I am defending NGreedia's price gouging. That is NOT what I am trying to do. I'm trying to give an objective, neutral, impartial, honest, realistic, real-world TECHNICAL take on the matter. In the vast majority of common, real-world use cases, with a few small exceptions, 8GB is still enough.
 
Lower cards, like $249 Intel GPU cards.... I would call those entry level, a step below budget.

That $249 Intel card (mine was $269) is a 1440p High card. I can live with that. It outperforms my A770, which outperformed the 4060.
 
Let's not forget as well the mindset/loyalists. I've had "debates" with people who've literally said (and I am quoting here) "I would rather pay $200 more for a slower Nvidia card over a faster AMD card, because (insert typical BS excuse here, i.e. drivers are unstable, hardware fails frequently, AMD is screwing their customers, AMD puts pineapple on pizza, etc.)"

And, people in this mindset push this view on others. Hard.
 
I think the problem is that something like the 5060 Ti is easily capable of handling 1440p too, especially if you slap on DLSS, but 8GB of VRAM is starting to be a real problem there.

IMO, putting 8GB of VRAM on a GPU that can otherwise give you north of 100 FPS in many titles at 1440p is pretty cringe; that's like asking to be memory bottlenecked.
 
I think we can call $350-$400 budget cards.
Maybe where you are, but IF that is US dollars, that works out to $500 to $600 here, and that is NOT budget in any way, shape, or form.

You also seem to be forgetting that Nvidia has moved its cards up a tier in name: you still pay the higher-tier price, but get the lower-tier performance...
 
60 series, 1080p: 8GB no, 16GB yes.
70 series, 1080p/2K: 12GB no, 16GB yes.
80/90 series, 2K/4K: 16GB NO, 24+GB YES.
7900 XTX 24GB GDDR6: 1080p/2K/4K, 24GB YES.
5080, 16GB GDDR7: 960 GB/s, 256-bit bus.
7900 XTX, GDDR6, 2 years old: 960 GB/s, 384-bit bus.
7900 XTX 24GB = $900.
5080 16GB = $1,200.
Build a 5080 with 24GB of GDDR6 and you get the same bandwidth as the 16GB of GDDR7... for $300 less... with 8GB more memory... same performance... oops!

With 24GB you get bigger maps in games and can load bigger AI models.
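For what it's worth, those bandwidth figures do line up. A quick sanity check, using the commonly cited per-pin data rates (treat them as assumptions and verify before quoting):

```python
# Peak bandwidth = bus width (bytes) * per-pin data rate.
# Data rates below are the commonly cited specs, assumed rather than verified here.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(f"RTX 5080, 256-bit GDDR7 @ 30 Gbps: {bandwidth_gbs(256, 30):.0f} GB/s")
print(f"7900 XTX, 384-bit GDDR6 @ 20 Gbps: {bandwidth_gbs(384, 20):.0f} GB/s")
# Both land at 960 GB/s: a wider bus of slower memory matches a narrower bus of faster memory.
```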
 
You don't even have to dig that far to see that the VRAM on the 60 series has regressed.
RTX 5060: 8GB
RTX 4060: 8GB
RTX 3060: 12GB
RTX 2060: 6/12GB
GTX 1660: 6GB
GTX 1060: 3/5/6GB
GTX 960: 2/4GB
GTX 760: 1.5/2/3/4GB

At least the GTX series were relatively affordable, despite the weird ram configurations.
 
8GB is still plenty to run 99.99999999999% of games at max settings at 1080p.

It is enough; I have a low-profile 3050 with 6GB of VRAM running casual games at 1080p upscaled to 2160p on my living-room HTPC. 8GB starts to become problematic with a handful of games released last year that honestly are just poorly coded ports. And even then, having 8GB doesn't make them unplayable, you just get lower FPS, and of course you're on a 60-class model, so you're not exactly pushing high FPS to begin with.

If someone wants to run all the latest stuff, including dumpster-fire ports, they're gonna need a much beefier card than the new "budget" 60 models. And yes, from Nvidia's point of view $400 is budget; you guys have no idea how much money they make on their datacenter GPUs, which are all made from the exact same TSMC silicon. Every consumer GPU they make is practically charity work given how much more they could make by just printing more H200s and soon B200s.
 
IMO, putting 8GB VRAM on a GPU that can otherwise give you north of 100 FPS in many titles at 1440p is pretty cringe, that's like asking to be memory bottlenecked there.

This was largely disproven: they made a 16GB 4060 Ti and it didn't do any better than the 8GB 4060 Ti, because they still have the same memory bandwidth and GPU compute. Nvidia very carefully paired up each card's compute with the amount of bandwidth it would utilize, and then just used whatever capacity was highest per chip (16Gb). If the 50 series had a significant jump in compute power, then they would have to add more bandwidth and capacity, but it's looking like it doesn't, so not much is going to change. Looking towards Q3 this year, expect to see a 5060 Ti Super with 12GB of VRAM as a result of them using Samsung's new 24Gb (3GB) chips. It will be benchmarked, and I bet there won't be much difference in speed (without a clock boost).
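The capacity options follow directly from bus width and per-chip density: one chip per 32-bit channel, or two per channel in clamshell mode. A small sketch, assuming a 128-bit bus like the 4060 Ti's:

```python
# VRAM capacity from bus width and memory chip density.
# Assumes one chip per 32-bit channel, or two per channel in clamshell mode.

def capacity_gb(bus_bits: int, chip_gbit: int, clamshell: bool = False) -> int:
    """Total VRAM (GB) from bus width and per-chip density (Gbit)."""
    chips = (bus_bits // 32) * (2 if clamshell else 1)
    return chips * chip_gbit // 8

bus = 128  # assumed 60-class bus width
print(capacity_gb(bus, 16))                  # 16Gb (2GB) chips         -> 8
print(capacity_gb(bus, 16, clamshell=True))  # 16Gb chips in clamshell  -> 16
print(capacity_gb(bus, 24))                  # 24Gb (3GB) chips         -> 12
```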
 
Tim from HUB did a 40 series video regarding what the market is actually getting and just did one for the 50 series products that we have details on. It should come as no surprise that every performance tier is lower than it really should be and consumers are getting less for their money:

So his video was pretty faulty, largely because he was looking at the 30-to-40-series jump and assuming that would happen again.

Consumers are getting the exact same price/performance as before, and I mean that quite literally, because Nvidia has gotten extremely good at selling performance in units of dollars. They really do sell performance this way.

One of their partner vendors:

https://www.runpod.io/console/deploy

A 4090 is 59~69 cents per hour.

What we see is that the 50 series is giving us about the same dollars per FPS as the 40 series did. Some games will naturally vary, but the average gets you the same terrible price/performance curve. The same curve that was optimized to funnel people into buying the 4090 is also funneling people into buying a 5090. There is no "value" option like we had in the past, where lower tiers provided better dollars per FPS; instead it's all the same.
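The metric being described is just price divided by average FPS. A tiny sketch with made-up placeholder numbers (not benchmark results) to show why a flat curve leaves no value tier:

```python
# "Selling performance in units of dollars": MSRP divided by average FPS.
# All numbers below are placeholders, NOT benchmark results.

def dollars_per_fps(msrp_usd: float, avg_fps: float) -> float:
    """Cost of each unit of average performance."""
    return msrp_usd / avg_fps

hypothetical_lineup = {
    "60-class card": (400, 60),
    "80-class card": (1000, 150),
    "90-class card": (2000, 300),
}
for name, (price, fps) in hypothetical_lineup.items():
    print(f"{name}: ${dollars_per_fps(price, fps):.2f} per FPS")
# When every tier lands at roughly the same $/FPS, there is no "value" tier:
# more performance always costs proportionally more money.
```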
 
This whole categorization system for GPU tiers is getting too relative and subjective.
I used to have SLI 780 Tis; I got a 4K TV in 2014 and could play The Witcher 3 at 4K60 when it came out, because my TV was one of two lines with a DP input, but I had to play at medium-high settings even with a custom BIOS and 1250MHz clocks. It only needed 3GB, even with the Halk textures.
A couple of years later I got SLI 1080 Tis and played at 4K60 with reduced settings in the tougher games, and SLI worked less and less.
Next up was a 3080. Still 4K60 at reduced settings, but now the upscaling worked much better, and I often run it at 1800MHz/800mV because the cooler is obnoxious.
My B580 also plays most games at 4K60 with mostly high settings, overclocking, and upscaling, and in CP2077 it runs the same settings for 4K60 as my 3080. Sure, it falls back relatively in all the other games I've tried, but it looks good on a 4K screen.

Sure, it isn't as fast at 1080p as a lot of other cards. All cards have gotten faster. Budget should be a price-point category, and entry level should mean barely functional: something equivalent to a 780M, GTX 1650, or A380 level of performance, where you often struggle to get 60fps at 1080p at settings that are visually a significant step down from medium. If a card can do more than an entry-level monitor can display, then it should be better than entry level. With the new high-end cards you can enable settings you don't even want, other than just because you can, except in games that are intentionally made to look like a more powerful GPU, the latest features, or copious amounts of VRAM are needed. But if they didn't have this stuff, then people wouldn't have a reason to upgrade from their RDNA2 card or whatever.

Maybe categorize cards like games' hardware requirements do: looks good at 60fps at X resolution (i.e., a good 1440p card), or will run frivolous settings or high refresh rates at Y resolution (i.e., an OP 4K card).

It seems more quantifiable and would draw more attention both to overpriced cards and to outlier games that need more VRAM for no good reason, the way others had HairWorks, ray tracing that doesn't improve image quality, or a CPU bottleneck when your view is a static table or 5 NPCs or something.