News "Very Few Are Interested" in RTX 4060 Ti 16GB GPUs, Nvidia AIB Sources Reportedly Say

InvalidError

Titan
Moderator
"No one is interested in a 16GB RTX4060Ti."

I'm sure plenty of people would be interested if it was $350 instead of $500-and-up.

Though 16GB on a 128-bit bus might pose somewhat of a bandwidth challenge. High-res textures and buffers are the main things that consume VRAM, and the bandwidth requirement will rise accordingly. Looking forward to benchmarks showing how badly the 128-bit bus might fall flat on its face because of it, with 16GB models taking VRAM exhaustion out of the equation.
 
Another big advantage of the RTX 4070 stems from its memory subsystem: using faster GDDR6, and a wider memory bus, for almost double the memory bandwidth.
That bandwidth is needed to feed the extra cores. It does not mean it gives it any actual advantage over the 4060 Ti if you normalize things (though I'm not sure how you'd normalize it). Or to put it another way, it's like saying a Threadripper 3970X has a memory subsystem advantage over Ryzen due to being on a quad-channel platform... even though there are at least twice as many cores to feed at this point. Or to reverse it, we feed 16-core Ryzen CPUs with dual-channel memory and nobody seems to bat an eye.

In addition, NVIDIA's method to combat this is to load up the GPU with a lot of L2 cache. The GeForce 40 series has, give or take, anywhere from 8-12 times as much L2 cache as the GeForce 30 series. It's not that dissimilar from AMD putting a lot of LLC on RDNA3.

Curious whether memory bandwidth really had any significant impact, I dug around to see if I could find two video cards where the only thing that changed (more or less) was the bandwidth. The only card I was able to find within the last few generations was the RTX 3060 Ti, which got a GDDR6X upgrade while the rest of the specs stayed identical. It also saw little (as in <5%) improvement.
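For reference, here is a minimal Python sketch of the bandwidth math being discussed, using the commonly cited bus widths, memory speeds and shader counts for these cards (treat those figures as assumptions rather than official spec-sheet values):

```python
# Rough memory-bandwidth comparison. Bus widths, data rates and core counts
# are the commonly cited figures for these cards (assumed, not taken from
# official spec sheets).
CARDS = {
    # name:                (bus width in bits, data rate in Gbps, shader cores)
    "RTX 4060 Ti (GDDR6)":  (128, 18, 4352),
    "RTX 4070 (GDDR6X)":    (192, 21, 5888),
    "RTX 3060 Ti (GDDR6)":  (256, 14, 4864),
    "RTX 3060 Ti (GDDR6X)": (256, 19, 4864),
}

for name, (bus_bits, gbps, cores) in CARDS.items():
    bandwidth_gbs = bus_bits / 8 * gbps           # GB/s = (bus width in bytes) * (Gbps per pin)
    per_core_mbs = bandwidth_gbs * 1000 / cores   # one crude way to "normalize"
    print(f"{name:<24} {bandwidth_gbs:4.0f} GB/s   {per_core_mbs:5.1f} MB/s per shader")
```

That puts the 4070 at roughly 1.75x the raw bandwidth of the 4060 Ti but only about 1.3x per shader; per-shader bandwidth is only one crude normalization, since cache hit rates (that big Ada L2) change how much of it is actually needed.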
 

salgado18

Distinguished
Feb 12, 2007
No one is interested in paying $100 for an extra 8 GB of VRAM. $50 would be a nice price difference, but $100 (which is a 25% price increase over the 8 GB model) is way too much for the benefit.

I mean, it sure is good to have lots of VRAM. My nephew has an RX 460 2 GB that can't play many games, not because it is a 460, but because it is not the 4 GB model. I know that is a terribly slow card for today, but the future will come, and someone will be using the card.
 

evdjj3j

Distinguished
Aug 4, 2017
No one is interested in paying $100 for an extra 8 GB of VRAM. $50 would be a nice price difference, but $100 (which is a 25% price increase over the 8 GB model) is way too much for the benefit.

I mean, it sure is good to have lots of VRAM. My nephew has an RX 460 2 GB that can't play many games, not because it is a 460, but because it is not the 4 GB model. I know that is a terribly slow card for today, but the future will come, and someone will be using the card.
I had to use a 2 GB 770 during the GPU shortage, and 2 more GB of VRAM would have made a huge difference.
 

Kamen Rider Blade

Distinguished
Dec 2, 2013

GDDR6 VRAM Prices Plummet: 8GB of Memory Now Costs $27

If 8 GB of GDDR6 VRAM costs $27, why would the end user want to pay more than $27 or $30 extra for the exact same card?

You already have the original profit margin on the 4060 Ti; nobody wants to give nVIDIA more profit margin for what is effectively a little more time on the Pick & Place machine.

Hypothetically, $3 should cover the extra time and cost for the machines to do their jobs and place the extra GDDR6 VRAM packages on the PCB.
 

bigdragon

Distinguished
Oct 19, 2011
$100 more for 8GB? Does Nvidia not realize how inexpensive memory is now? We can all go to our favorite storefronts and see how much memory prices have come down. Nvidia is abusing its market position to intentionally kneecap what should have come standard on the base model 4060 Ti, if not the 4060. It's startling just how out of touch Nvidia is with its customers now. The lack of 4060 Ti 16GB models could be a positive, however. We don't need 5+ variants of the same tier card!

8GB of VRAM no longer meets the minimum requirements of many AAA games. 12GB won't be enough much longer. It's incumbent on Nvidia to either provide more VRAM on their products or work directly with game engine developers to improve the way cross-platform ports utilize memory.

In an ideal world, VRAM would no longer be a thing and we'd have some sort of unified memory separate from the GPU...and user upgradable!
 

InvalidError

Titan
Moderator
That bandwidth is needed to feed the extra cores. It does not mean it gives it any actual advantage over the 4060 Ti if you normalize things (though I'm not sure how you'd normalize it). Or to put it another way, it's like saying a Threadripper 3970X has a memory subsystem advantage over Ryzen due to being on a quad-channel platform...
If you increase bandwidth while keeping the workload exactly the same, you aren't going to see much change unless you already had a non-trivial bandwidth bottleneck.

If you double VRAM, then change the workload to fill that VRAM, such as by loading 4k/6k/8k texture packs, chances are that reads are now scattered that much wider than they were before and bandwidth may play a much larger role, especially when most of it is high-res textures, which have minimal impact on GPU-power requirements, only on VRAM size and bandwidth.
If 8 GB of GDDR6 VRAM costs $27, why would the end user want to pay more than $27 or $30 extra for the exact same card?
Because nobody wants to make a bigger effort with higher expenses and higher liabilities for zero profit. AMD and Nvidia want 40+% and 60+% profit margins, so you need to add 40+% to whatever they put in the GPU kits they sell to AIBs. Then AIBs have to slap their own profit margin on top, then distributors and retailers.
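A purely illustrative Python sketch of that stacking, with the GPU-vendor figure taken from the 40-60% claim above and the AIB and distribution/retail markups as hypothetical placeholders:

```python
# Illustrative only: how a $27 BOM delta grows when each stage in the chain
# applies its own markup. The AIB and distribution/retail percentages are
# hypothetical placeholders, not reported figures.
bom_delta = 27.0  # extra cost of 8 GB of GDDR6, per the article quoted above

stages = [
    ("GPU vendor kit markup",   0.60),  # the "60+%" figure claimed above
    ("AIB markup",              0.10),  # hypothetical
    ("Distribution and retail", 0.15),  # hypothetical
]

price_delta = bom_delta
for stage, markup in stages:
    price_delta *= 1 + markup
    print(f"after {stage:<24} ${price_delta:6.2f}")
```

How close that chain gets to the actual $100 gap depends entirely on the margins you assume, which is exactly what the next few posts argue about.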
8GB of VRAM no longer meets the minimum requirements of many AAA games. 12GB won't be enough much longer.
People who say that are being misleading by omission. Practically all AAA games will play perfectly fine on 8GB GPUs, you just need to lower details a bit to make it fit comfortably.

The statement you should be making is: "8GB of VRAM no longer meets the minimum requirements of many AAA games at 1080p high-details and beyond."

Most people looking for GPUs under $300 are perfectly fine dialing things down to save $100+ vs the next sensible step up.
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
Because nobody wants to make a bigger effort with higher expenses and higher liabilities for zero profit. AMD and Nvidia want 40+% and 60+% profit margins, so you need to add 40+% to whatever they put in the GPU kits they sell to AIBs. Then AIBs have to slap their own profit margin on top, then distributors and retailers.
I doubt 40-60% profit margins equal a 3.703…× price multiplier on VRAM.
GDDR6 literally costs $3.364/GiB.
8 GiB of GDDR6 costs $26.912 ~= $27.
Why would you think that a 3.703…× price multiplier on 8 GiB more of VRAM is justified?

Anybody with basic common sense can see that it's just money-grubbing greed and an unfair mark-up.
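The arithmetic behind that 3.703…× figure, as a quick Python sketch (the $3.364/GiB price comes from the article quoted above; everything else follows from it):

```python
# The multiplier being argued about: the retail upcharge for the extra VRAM
# versus what the memory itself reportedly costs.
price_per_gib = 3.364              # $/GiB of GDDR6, per the quoted article
vram_cost = 8 * price_per_gib      # = $26.912, ~= $27
upcharge = 100.0                   # gap between the 8 GB and 16 GB models

print(f"extra 8 GiB at cost: ${vram_cost:.3f} (~$27)")
print(f"retail upcharge:     ${upcharge:.0f}")
print(f"multiplier:          {upcharge / round(vram_cost):.3f}x")  # 100 / 27 = 3.704x
```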
 

bigdragon

Distinguished
Oct 19, 2011
People who say that are being misleading by omission. Practically all AAA games will play perfectly fine on 8GB GPUs, you just need to lower details a bit to make it fit comfortably.

The statement you should be making is: "8GB of VRAM no longer meets the minimum requirements of many AAA games at 1080p high-details and beyond."
Jedi Fallen Order, Hogwarts Legacy, The Last of Us, Forza Horizon 5...all of these recent AAA games launched with excessive VRAM use that severely compromised game performance. Each game was very difficult to play with cards only equipped with 8GB (or 10GB) of VRAM. These games have since been patched to lower VRAM demand, but patches took weeks or months after release to address the problem. I don't think it's misleading to say that modern AAA games require more than 8GB of VRAM.
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
People who say that are being misleading by omission. Practically all AAA games will play perfectly fine on 8GB GPUs, you just need to lower details a bit to make it fit comfortably.
Nobody who is currently playing AAA games on 8 GB wants to lower ANY details.

They want to crank it to max and get good frame rates.

Is that realistic? No! But it's the experience they've been used to for so many years.

They're not used to a world with larger textures that demand more VRAM.

They don't care about the technical details, they just want it to look pretty and fit in VRAM while being at "Maximum Visual Fidelity".
 
Jan 15, 2023
People who say that are being misleading by omission. Practically all AAA games will play perfectly fine on 8GB GPUs, you just need to lower details a bit to make it fit comfortably.

The statement you should be making is: "8GB of VRAM no longer meets the minimum requirements of many AAA games at 1080p high-details and beyond."

Most people looking for GPUs under $300 are perfectly fine dialing things down to save $100+ vs the next sensible step up.
Funny, from my point of view it's the other way around: people who oppose or nitpick the "8GB is not enough" argument are the ones being misleading, whether intentionally or not. The argument is perfectly clear, whether or not it's flawlessly expressed: 8GB might be enough NOW, but most likely won't be enough in the VERY NEAR future, whether that is 0.5-1 year or 2 years from now.

And no, I'm not even remotely fine dialing things down below high settings on a new GPU. I'm fine dialing things down further down the road, not immediately after I sold my kidney to get one. I'm looking for GPUs under $300 because I can't afford the more powerful ones, and exactly because I can't afford them, I'm buying with the intent to use that GPU for years. Not to mention that the cheapest 4060 (not even Ti) in my country costs ~$450 as of today, while $620 is the most common price of the rest of the 4060 models. Mind you, the average monthly pay here is $550, but most of the country lives on a minimum wage of $350/month. And I believe this is the case in most of the world, if not worse.

So yeah, from that POV, "8GB is not enough" is perfectly well put and doesn't need any additional disclaimers, let alone constant pushback.
 

oofdragon

Distinguished
Oct 14, 2017
Very few are interested in the 4060, 4060 Ti 8GB, 4060 Ti 16GB, 4070, or 4070 Ti. Everybody is interested in the used market because this "gen" is a sad money-grabbing scam.

I play mostly at 1080p; the RTX 3080 ($400 on eBay right now) paired with a top-notch processor runs 99% of games at 144 Hz maxed out, so why spend more? It plays any game at 60+ fps maxed out at 4K without any upscaler. What everybody was waiting for was the true RTX 4060 at $400 with that exact performance, but it came out as a "4070" costing $600... and here we are hoping no one buys it so Nvidia learns the lesson. At $600 there's the 6950 XT, which just eats the 4070 for breakfast. Now, if only AMD came out with a $400 card at 6800 XT performance... that would be hitting the spot right on.
 

InvalidError

Titan
Moderator
I don't think it's misleading to say that modern AAA games require more than 8GB of VRAM.
It is misleading to say so without specifying the detail level. If you leave the statement at only that, it implies that people with 8GB or less of VRAM cannot play those games in any playable way whatsoever, no matter how much they may be willing to compromise. You can play most modern AAA games on a 4-6GB GPU at lowest details.

Pretty sure every AAA title for the next four years will play fine on 8GB GPUs at 1080p medium-low details - you can't have the jarring texture resolution pop-in of Hogwarts Legacy if you lower details enough for the game to not bother using the higher-resolution textures at all. If you don't care about maxing everything out or going beyond 1080p, 8GB will be enough for the foreseeable future.

By leaving the detail level and resolution out of your statement, you are omitting half of the discussion as VRAM requirements are heavily affected by detail levels and to some significant extent by resolution.

Nobody who is currently playing AAA games on 8 GB wants to lower ANY details.

They want to crank it to max and get good frame rates.

Is that realistic? No! But it's the experience they've been used to for so many years.
Most 8GB GPUs aren't powerful enough to do that, which is part of the reason why AMD and Nvidia bet on 8GB being enough for them.

Funny, from my point of view it's the other way around: people who oppose or nitpick the "8GB is not enough" argument are the ones being misleading, whether intentionally or not. The argument is perfectly clear, whether or not it's flawlessly expressed: 8GB might be enough NOW, but most likely won't be enough in the VERY NEAR future, whether that is 0.5-1 year or 2 years from now.
If you fear that 8GB won't be enough in the near future, then vote with your wallet and don't buy an overpriced GPU with only 8GB, save $100+ by getting an RX6600 instead or put gaming on hold while you wait the GPU market out.

I was still using a 2GB GTX1050 until last month, when I got the cheapest RX6600 I could find. I tried Intel's A750 three months ago but returned it due to frequent crashes even while idle. 2GB was good enough for me for six years; 8GB will likely be easily good enough for me for at least four years, especially if AMD and Nvidia continue their trend of neglecting the $200 price point.
 

Zerk2012

Titan
Ambassador
Simple to me: if you don't like what they offer, then don't buy it.

If you have the must-run-everything-at-ultra thing, then you should be looking to buy the best card as well. If you can't afford to run ultra, then you're trying to eat steak on a hamburger budget.

Edit: For me and a bunch of other people, Ultra is just an e-peen thing.
 

Math Geek

Titan
Ambassador
I love the "nobody is playing a new game on less than ultra" nonsense.

Total BS. I have a 1650 Super with 4 GB of VRAM and it plays everything I throw at it perfectly fine. I don't know nor care what the settings are, and I and the others who use it enjoy every moment of play time.

But as noted above, we are also very secure in our world and don't need to get into an e-peen competition with others on the web. :)

But hey, if you need the validation, they make something just for you, and this article states this version is only $500 to start. And if you really are super ultra mega insecure, they've got $1500+ cards just for you!!!
 

InvalidError

Titan
Moderator
Edit: For me and a bunch of other people, Ultra is just an e-peen thing.
After playing a few games on my RX6600 at medium-high details instead of lowest-everything on my GTX1050, my suspicions were confirmed: I don't give a fluff about details. Sure, graphics look incrementally better, but it makes no material difference to how much I do (or don't) like a game. All I need is a smooth, playable frame rate and a high enough resolution that edge aliasing doesn't bother me.
 
If you increase bandwidth while keeping the workload exactly the same, you aren't going to see much change unless you already had a non-trivial bandwidth bottleneck.
Which is why I don't see any merit in the idea that a more powerful video card has a memory subsystem advantage when its GPU simply needs that bandwidth because, well, it'll get more work done in the same amount of time.

If you double VRAM, then change the workload to fill that VRAM, such as by loading 4k/6k/8k texture packs, chances are that reads are now scattered that much wider than they were before and bandwidth may play a much larger role, especially when most of it is high-res textures, which have minimal impact on GPU-power requirements, only on VRAM size and bandwidth.
Data being scattered sounds more like a latency problem, not a bandwidth problem. And even then, textures are effectively a LUT and are likely read in fixed block sizes, with only the needed blocks fetched. It's also likely that once the texture data is read, it's going to live in cache for a while, because unless the scene is completely changing every frame, those texture blocks are likely going to be needed again.
 

oofdragon

Distinguished
Oct 14, 2017
People who say that are being misleading by omission. Practically all AAA games will play perfectly fine on 8GB GPUs, you just need to lower details a bit to make it fit comfortably.

The statement you should be making is: "8GB of VRAM no longer meets the minimum requirements of many AAA games at 1080p high-details and beyond."

Most people looking for GPUs under $300 are perfectly fine dialing things down to save $100+ vs the next sensible step up.

Speak for yourself; I'm definitely not fine paying north of $300 to play a game at "medium" settings, and I bet most here aren't either. Can you tell me when was the last time you paid $300 for a new-gen GPU that could not play recently launched titles? Yep... you can't, it never happened, until now. Do you know what a $250 card actually looks like today by market value? A 6700 XT 12GB that is faster than a 3060 Ti. That's used, yeah, so the new card should cost $300 max, at 3060 Ti level with 12GB of course. What do we have at that level from Nvidia right now? The 4060 Ti 16GB at $500. That's the closest, really.

4060TI 8GB/16GB = 13400
6700XT 12GB = 12800
3060 Ti 8GB = 11800
4060 8GB = 10600

This "new" $300 card from Nvidia is nothing but a 3050 Super with the same VRAM. Right now I can buy a 3060TI 8GB for $250, and that's actually a better buy than a 4060 even if this new card also costed $250. The reason you are scaling things down to "medium" is because this card is a 3050 replacement, even the 3060 had 12GB (now $220 at ebay) and doesnt need to dial down settings from ANY game. No matter how you look at it the 4060 is a downgrade from the 3060 and just blind fanboys won't see it. If you are buying a 4060 8GB for $300 instead of a 3060 12GB for $220 you are ... not a intelligent person

https://www.youtube.com/watch?v=_ja2GTq99us


When the "$500" 16GB 4060Ti comes this is exactly the performance it will have against the $250 6700XT 12GB. They actually TIE. The 4060Ti is but a $300 card new, as much as the 4070 is a $400 (exactly the price the 3080 is used), the 4070Ti a $600 (exactly the price of a 6950XT new which the Nvidia ties), and the 4080 a $800 one, the price of a 6900XT. People are paying a $200 premium for lagged fake frames (DLSS3) and stutter (RT) that make their experience worse. That's what is really happening. If dlss3 actually made the game play faster or RT actually improved the eye candy, maybe someone could say "yes it's worth to pay $200 more for this", but nope, just marketing and delusion.

So... those 300 bucks? Sorry, but instead of dialing down settings I'll play any game maxed out at 1080p with the 6700 XT 12GB; heck, I bet it plays maxed out at 1440p just fine, and the extra $50 I may use to upgrade the CPU or memory, thanks.
 

InvalidError

Titan
Moderator
Can you tell me when was the last time you paid $300 for a new-gen GPU that could not play recently launched titles? Yep... you can't, it never happened, until now.
I don't buy $300 GPUs; I don't care anywhere near enough about incrementally better game graphics for that. My $180 RX6600 can play any modern game fine, I just need to lower settings enough to make it smooth.

Data being scattered sounds more like a latency problem, not a bandwidth problem. And even then, textures are effectively a LUT and are likely read in fixed block sizes, with only the needed blocks fetched. It's also likely that once the texture data is read, it's going to live in cache for a while, because unless the scene is completely changing every frame, those texture blocks are likely going to be needed again.
Data being scattered around because there is about twice as much of it to fetch to render higher resolution textures and effects becomes a bandwidth problem pretty quickly. You can engineer latency mitigation into a GPU, which is one reason why GPUs are so heavily multi-threaded. Once you hit a bandwidth bottleneck though, no amount of latency mitigation and additional threading can hide that.

As for textures remaining in cache between frames, that isn't going to happen unless your entire scene including z-buffers, rendering buffers, frame buffer, etc. all fit within the cache. However, a typical game will chew through at least 1GB of assets to render one scene, so a GPU with only 32MB of L2 cache is likely to have most of it flushed dozens of times per frame. The likelihood of a texture staying in L2 long enough to get reused between consecutive frames is extremely low.
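Putting rough numbers on that, as a back-of-the-envelope Python sketch using the post's own figures (~1 GB of assets touched per frame, 32 MB of L2):

```python
# Back-of-the-envelope version of the argument above: if each frame touches far
# more data than fits in L2, the cache contents turn over many times per frame.
assets_per_frame_mb = 1024   # ~1 GB of assets touched per frame (the post's estimate)
l2_cache_mb = 32             # the L2 size used in the post
fps = 60

turnovers_per_frame = assets_per_frame_mb / l2_cache_mb
print(f"L2 turned over ~{turnovers_per_frame:.0f}x per frame, "
      f"~{turnovers_per_frame * fps:.0f}x per second at {fps} fps")
```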
 
Jan 15, 2023
Saying "8GB is not enough" or "modern AAA games require more than 8GB vram" is not misleading because it is said within a certain context. The context is: 8GB is not enough to run modern AAA games at a level you would expect from a newly released GPU in 2023 at prices like these. Even Hardware Unboxed, who really started this "movement" (god bless them), have pointed out numerous times that their beef is not with 8GB cards, but 8GB cards at these prices. How is this not clear by now? Why are we being so pedantic about supposed lack of nuance when the argument was made clear thousands of times before? Yet at the same time we have these corporations deliberately butchering their products, withholding value from consumers, desperately trying to cling to their cryptoboom profit margins.

Speaking of "misleading", this whole "vote with your wallet" mantra is exactly that. My "vote" of not buying a budget gpu means nothing against someone else's purchase of a gpu nearly three times as expensive as that. For example, the 4060 vs 3070 Ti (even if it's a regional case). The only way my vote counts is through community pushback. And now that we have it, i find it silly that we're insisting on accurately presenting the claim every time, or not making it at all.
I have a 1650 Super with 4 GB of VRAM and it plays everything I throw at it perfectly fine. I don't know nor care what the settings are, and I and the others who use it enjoy every moment of play time.
Oh really? Did you buy it in 2023 for $400+? I bet not.

As for GPUs somehow being connected to self-validation, e-peens and personal insecurities... only a child would make that connection. Implying that's the issue with the 8GB VRAM discussion is completely missing the point. Also, it doesn't sound very "secure" to me, whatever that means and whoever "you" and "your world" are.

But since having an old GPU is so cool, apparently, I should mention that I'm sitting on a 10-year-old 2GB GTX 750 Ti as my main and only option. Ikr, I'm such a badass. Now where is my Medal of Honor?

My point is, I and others here and elsewhere are not arguing the 8GB VRAM issue from the standpoint of spoiled brats, as some here would like to present it. I'm not even speaking for myself, but for the sake of people like me, especially the young ones. You know, the people someone crazy enough would call "gamers". Not Nvidia though (or AMD for that matter, although they are slightly more "generous" imo); they refer to a different type when they use that word. You know, the type to act as a gatekeeper of having fun. The type to enjoy fighting pixelated dragons just like when he was a child, but to claim everyone else is entitled for wanting value for their often hard-earned money. The type to suggest that if you dislike the malicious practices these companies employ, you're only allowed to not buy their products and then shut up about it, unless you make sure you've written an entire essay explaining the full context of something so obvious. The type to prefer developers optimizing games for cut-down products instead of fixing essential bugs or working on new games and content. The type to parrot that most people play at 1080p without ever asking himself: why is that? ...