News Nvidia Reveals RTX 4060 Ti, 4060 with Prices Starting at $299


Thunder64

Distinguished
AMD has absolutely given "effective bandwidth" numbers on RDNA 2/3 chips. And again, it's not just marketing, it's engineering. Because when people look at specs and see a drop in bandwidth, they get worried. Looking only at bus width or bandwidth is as misguided as looking only at theoretical teraflops.

The RX 6600 XT has fewer cores at higher clocks to get 10.6 TFLOPS, and the RX 5700 XT has 9.8 TFLOPS. The point isn't that they have similar compute; it's that the 6600 XT has 256 GB/s of bandwidth while the 5700 XT has 448 GB/s. How can it deliver similar performance with 43% less bandwidth? Infinity Cache. How does Nvidia deliver a big generational boost in performance with the RTX 4090 over the RTX 3090 Ti, even though they have the same GDDR6X configuration and bandwidth? With a much bigger L2 cache.

You can call BS on Nvidia's pricing. You can question how good DLSS 3 Frame Generation really is. You can complain about the lack of VRAM capacity. But the "effective memory bandwidth" figures are probably the least problematic aspect of the GPUs. The only real issue is that getting more effective bandwidth from a narrower bus means it's possible to end up with less VRAM because there aren't as many memory channels to go around.

Proof:

AMD:
[Five attached screenshots of AMD spec pages listing "effective bandwidth"]

Honestly, I appreciate having the "effective bandwidth" data. It's AMD and Nvidia saying, in effect, this is the average hit rate of our L3/L2 caches. AMD didn't publish effective bandwidth data on the earlier RDNA 2 GPUs. Actually, it's a bit hit and miss right now. RX 6500 XT, RX 6700 10GB, and RX 6700 XT list effective bandwidth (along with the above five cards). RX 6900 XT, RX 6800 XT, RX 6800, RX 6600 XT, RX 6600, and RX 6400 do not.
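
To make that concrete, here's a rough sketch of how an average cache hit rate turns into an "effective bandwidth" figure, assuming the simple model that cache hits generate no DRAM traffic. The formula and the example numbers are my own illustration, not AMD's or Nvidia's published methodology.

```python
# Simple model: if a fraction hit_rate of memory requests are served from the big
# L2/L3 cache, DRAM only sees (1 - hit_rate) of the traffic, so the card behaves
# as if it had dram_bw / (1 - hit_rate) of bandwidth. Illustrative numbers only.
def effective_bandwidth(dram_bw_gbs, hit_rate):
    return dram_bw_gbs / (1.0 - hit_rate)

def implied_hit_rate(dram_bw_gbs, effective_bw_gbs):
    # Invert the model to back out the average hit rate from a published figure.
    return 1.0 - dram_bw_gbs / effective_bw_gbs

print(effective_bandwidth(288, 0.48))   # ~554 GB/s "effective" from 288 GB/s raw
print(implied_hit_rate(288, 554))       # ~0.48 average hit rate implied
```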

Fair enough, I have never seen that. Perhaps because I did not peruse AMD's website; I never saw it on any of their marketing slides, though. Also, how can you neglect the fact that the 4090 has a much higher clock speed than the 3090 Ti? It is not just because of the increased L2 cache.
 

TJ Hooker

Titan
Ambassador
Microsoft is actually a good example of why 8GB VRAM is insufficient.

The Series S is a console in big trouble because it only has 8GB of VRAM.

And this idea that developers are to blame, or that it's due to poor optimization, is baloney.

A good example is Baldur's Gate 3, which has been delayed on Xbox because Larian Studios cannot allocate enough graphics memory on the Series S. Larian is a studio with 30 years of experience; they know how to make games. Developers just can't develop a game for one audience that has 16GB of VRAM and another audience that has 8GB of VRAM.

So developers develop for the biggest market, and that's the PS5, a console that happens to have blazing-fast custom I/O chips, custom decompression chips, and 16GB of GDDR6. The PS5 can pull in assets and decompress textures like no other machine can. PCs are struggling to keep up; the bare minimum has been set at 16GB of VRAM, and anything below that will struggle for a whole generation.

[Attached screenshot: a Borderlands 3 error message]
The issue in that screenshot is likely just a bug, not an issue of genuinely running out of VRAM. Borderlands 3 can run on cards with as little as 2GB of VRAM, and you can find plenty of reports of PC gamers getting the same error, even with cards like a 2080 Ti or a 3090.

Source that the difficulties with Baldur's Gate 3 on Series S are related to lack of VRAM specifically?
 
I think gamers should base their buying decisions on these 40-series SKUs only on pure rasterization performance. DLSS 3 makes the comparison less appealing, and it can also be slightly misleading.

On top of that, previous-gen RTX cards don't support DLSS 3, only DLSS 2. So for a fair upgrade comparison, just look at raw rasterization performance in games.
Yeah, I agree. I would have liked an apples-to-apples comparison, with DLSS 2 or stock settings used to see the real difference between these cards and older ones! DLSS 3/Frame Generation is an odd mix. I don't think I'd buy a 4xxx card purely based on that. A 15% increase doesn't seem that much given the uplift the higher-end cards have had.
 

sherhi

Distinguished
When a console is new, it is current, but as it ages, it quickly becomes outdated. PCs are able to move with the latest technology; consoles can't, because they are a closed architecture. The hardware it comes with when you buy it is all it's ever going to have. Nothing is upgradeable. Five or six years from now, you're going to be looking at the quality of the games on your console, comparing them to the latest PC versions, and thinking your console is crap.
In 5 years these consoles will be 8 years old, and new consoles will most likely have arrived. And as I wrote, developers are just starting to fully utilize their RAM and all their CPU cores. Games already look decent on current consoles, and a huge chunk of the market is bound to them, so games will be optimized to look decent enough, unless developers go "F*ck you" like they did with Redfall, which doesn't even look that graphically pleasing, to be honest.
 
Fair enough, I have never seen that. Perhaps because I did not peruse AMD's website; I never saw it on any of their marketing slides, though. Also, how can you neglect the fact that the 4090 has a much higher clock speed than the 3090 Ti? It is not just because of the increased L2 cache.
Again, you're missing the point of what I was saying. 4090 has a lot more compute, but if it didn't have more (effective) bandwidth, the extra compute would be wasted. You need to balance more compute with more bandwidth. If you had an RTX 4090 with no L2 cache and a 128-bit memory interface, the cores would all end up waiting for data and you'd probably have something around RTX 3060 performance.
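
A back-of-the-envelope way to see the imbalance, using approximate public spec figures and a purely hypothetical 128-bit, cache-less 4090 for comparison:

```python
# Crude compute-vs-bandwidth balance check. Spec numbers are approximate public
# figures; the "hypothetical" entry is an invented configuration for illustration.
cards = {
    "RTX 3060 (approx.)":            {"tflops_fp32": 12.7, "bandwidth_gbs": 360},
    "RTX 4090 (approx.)":            {"tflops_fp32": 82.6, "bandwidth_gbs": 1008},
    "Hypothetical 128-bit RTX 4090": {"tflops_fp32": 82.6, "bandwidth_gbs": 336},  # 128-bit @ 21 Gbps
}

for name, spec in cards.items():
    # DRAM bandwidth available per TFLOP of compute: a rough "balance" metric.
    print(f"{name}: {spec['bandwidth_gbs'] / spec['tflops_fp32']:.1f} GB/s per TFLOP")
# The hypothetical card gets roughly a third of the real 4090's bandwidth per TFLOP,
# which is why that much compute would mostly sit idle without a large L2 cache.
```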
 

hannibal

Distinguished
Someone drank the Kool-Aid. They just came up with BS "effective bandwidth" numbers to try to hide the fact that its real memory bandwidth is far less than its predecessor's.

I don't say that the 4060 or 4060 Ti are good GPUs, but they are not as bad as some claim.
In reality, it's a tiny update at the same price.
But calling these xx50-level GPUs is pure horse...

I am not happy with these; they are a waste of sand in some respects, but they are not as bad as some people claim, IMHO. Hopefully the 7600 offers better price/performance than these... but let's see.
Waiting for reviews and real market prices (not fake MSRPs) from both companies.
 
But calling these xx50-level GPUs is pure horse...
The 4060 non-Ti might actually be a 50-tier card, as it's actually going to perform worse in some applications than a 3060 due to the fact that it simply has less stuff.


Sure, faster clocks, but just like CPU clocks, they don't mean everything; when something can take advantage of more [insert part of GPU, i.e. tensor cores], it will have a benefit even if the clocks are lower.


AFAIR they haven't ever released a new-gen GPU of the same tier that had physically lower specs (at least within the past decade).
 

j_m_h_jm

Honorable
Something missing in this conversation is that people are still rocking tech from the GTX 900 and 1000 series in 2023. Steam's GPU survey still shows four 1000-series and four 1600-series cards in its rankings; more people are still using the 970, 960, and 750 Ti than a 3090 or 2080.

So given this information, clearly the argument shouldn't be about 2023 usage, but about 2029-2030 usage. Obviously that is hypothetical, but what will trends look like then? The quasi-fruitless push to cloud gaming will... die? Replace consoles altogether? That could impact game development significantly. Assuming we still have local gaming, will developers find a benefit to increased VRAM at a faster rate or a slower rate? VRAM capacity gains have slowed compared with past generations; it wasn't unusual to see capacity double gen-on-gen back in the late '90s/early 2000s. If capacity gains are slowing, how will developers keep pushing the envelope?

I still have a 1060 6GB. It replaced a Radeon 7750, and before that was a GT 8- or 9-series card. My 1060 is still good enough, but I happen to have the budget to upgrade now. The future is in a very weird place, to say the least. While we can complain about numbers or pricing, there are a large number of PC gamers using old tech that will be due for upgrades this generation or next (based on the minimum specs of SOME current games, though what people play impacts that, too). My hope is we can start figuring out what will cause the next NEED for upgrades sooner rather than later. While we may WANT more VRAM, devs have to sell copies, and that means GPU makers have to provide it for the masses, otherwise that's a loss of sales. Now push that out 6-7 years or more. Obviously a lack of horsepower means VRAM doesn't matter. 🤷
 
My 2 cents..

I'm glad to see $299 on the RTX 4060. However, the issue of VRAM still bugs me. In particular, the fact that Nvidia can put more VRAM on slower cards shows they don't know what they're doing when it comes to planning an entire generation of video cards. The 3000 series and 4000 series (for me in particular) are both ruined, because I would like to spend ~$600-$700 on a kickass GPU that I feel comfortable buying and using for a long time. Nvidia has not offered that in the 3000 or 4000 series. The 4070 was almost that, but it still got 4GB taken from it that I think should have been there. I like the power efficiency of the 4070, and would almost consider buying one because my 6950 XT is so power hungry and puts off a lot of heat. However, now the 4060 Ti will offer a version with more VRAM than the 4070...??? Nope. Looks to me like Nvidia doesn't know what they are doing.

Having said that, one might say the 4060 Ti VRAM situation was a lose-lose. Yes, that is exactly what it is when you skimp on your high-end video cards. [Insert bad words here.]
 

InvalidError

Titan
Moderator
Assuming we still have local gaming, will developers find a benefit to increased VRAM at a faster rate or a slower rate? VRAM capacity gains have slowed compared with past generations; it wasn't unusual to see capacity double gen-on-gen back in the late '90s/early 2000s. If capacity gains are slowing, how will developers keep pushing the envelope?
A better question may be: does the envelope need to be pushed any further? IMO, at this point, graphics are already beyond what I could care for in games and I'd be 10X more interested in seeing novel concepts than pushing graphics any further.

My 2 cents..

I'm glad to see $299 on the RTX 4060. However, the issue of VRAM still bugs me. In particular, the fact that Nvidia can put more VRAM on slower cards shows they don't know what they're doing when it comes to planning an entire generation of video cards.
When your choice of GDDRx chips is between 2GB and 2GB, your only options for a given bus width are 1X or 2X the amount of memory. 6GB was already known to be too little from the 2000 series, which leaves 12GB as the only other option on a 192-bit bus, short of mixing 1GB and 2GB chips. Mixing VRAM chips makes little sense when the cost difference between 1GB and 2GB is less than $2 and would introduce a bunch of unnecessary complications.
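
To spell the capacity math out, here's a quick sketch. It assumes the standard GDDR6/6X layout of one 32-bit channel per chip, 1GB or 2GB densities, and optional clamshell mode (two chips per channel); those assumptions are mine, not a vendor tool.

```python
# Possible VRAM capacities for a given bus width, assuming 32 bits per GDDR chip,
# 1GB/2GB chip densities, and optional clamshell (two chips sharing each channel).
def vram_options(bus_width_bits, densities_gb=(1, 2), clamshell=True):
    channels = bus_width_bits // 32          # one chip (or chip pair) per 32-bit channel
    options = {channels * d for d in densities_gb}
    if clamshell:
        options |= {2 * channels * d for d in densities_gb}
    return sorted(options)

for bus in (128, 192, 256, 320, 384):
    print(f"{bus}-bit bus -> {vram_options(bus)} GB")
# 128-bit -> [4, 8, 16] GB, 192-bit -> [6, 12, 24] GB, 256-bit -> [8, 16, 32] GB, ...
```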
 
A better question may be: does the envelope need to be pushed any further? IMO, at this point, graphics are already beyond what I could care for in games and I'd be 10X more interested in seeing novel concepts than pushing graphics any further.


When your choice of GDDRx chips is between 2GB and 2GB, your only options for a given bus width are 1X or 2X the amount of memory. 6GB was already known to be too little from the 2000 series, which leaves 12GB as the only other option on a 192-bit bus, short of mixing 1GB and 2GB chips. Mixing VRAM chips makes little sense when the cost difference between 1GB and 2GB is less than $2 and would introduce a bunch of unnecessary complications.
So, you're saying a 192-bit bus simply isn't cutting it. Good point.

I would have accepted a little higher power consumption for a 256-bit memory bus if it came with 16GB of VRAM. That would have made sense. If the RTX 4070 were faster with a 256-bit memory bus and 16GB of VRAM, and cost $699, that would have been the perfect card for me. Instead, we have Nvidia cutting corners and lowering production costs to provide a 4070 and 4070 Ti that have less than desirable VRAM, and the 4070 Ti costs an absurd amount of money for not having the memory that modern games demand. What in the actual F is going on at Nvidia?

FYI Nvidia... not every GPU needs a Ti version. You ruined both 4070-series cards with only a 192-bit memory bus. Give us one good GPU instead of two GPUs that simply do not meet our needs or expectations.

Nvidia offered the RTX 3080 10GB at $699 with a 320-bit memory bus and 760 GB/s of bandwidth. Now, they're offering the 4070 Ti 12GB at $799 with a 192-bit memory bus and 504 GB/s of bandwidth.
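
For reference, those raw bandwidth numbers fall straight out of bus width times per-pin data rate; the Gbps figures in the sketch below are the commonly quoted GDDR6X speeds, taken as assumptions.

```python
# Bandwidth (GB/s) = bus width in bits * per-pin data rate in Gbps / 8 bits per byte.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(320, 19))   # RTX 3080 10GB: 320-bit @ 19 Gbps -> 760.0 GB/s
print(bandwidth_gbs(192, 21))   # RTX 4070 Ti:   192-bit @ 21 Gbps -> ~504 GB/s
```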

In a generation that already suffered from a lack of VRAM, you made processing power faster and VRAM slower, and you decided to charge the consumer more than you did for the previous generation, which already sucked when it came to value per dollar. 11GB in 2017 for $699, 10GB in 2020 for $699, 12GB in 2023 for $799. Meanwhile, your revenue has tripled since 2016. Gamers don't need Nvidia. Big business and governments need Nvidia.

This just keeps going.

The RTX 4070 should have been the 4060 Ti. There should not be a 4070 Ti. That should be a 256-bit RTX 4070 16GB for $699 or less.

RTX 4050 128-bit 8GB.
RTX 4060 192-bit 12GB.
RTX 4070 256-bit 16GB.

How difficult is that? We don't need Ti versions of cards that didn't have enough specs to begin with, and you didn't add anything to the Ti versions to make up for the lack of specs either. Suddenly deciding to make up for it in the mid-range while leaving higher-end cards to suffer is DUMB! What is wrong with you!?
 

InvalidError

Titan
Moderator
So, you're saying a 192-bit bus simply isn't cutting it. Good point.
Not really. More like 8GB isn't enough for the amount of detail a 19 TFLOPS (FP32) GPU can push, and 12GB isn't an option on 128 bits with currently available GDDRx chips. Nvidia needed to make the 4060s 192-bit to hit the 12GB sweet spot they should have been at, regardless of whether the GPU needed the bandwidth.
 
Not really. More like 8GB isn't enough for the amount of detail a 19 TFLOPS (FP32) GPU can push, and 12GB isn't an option on 128 bits with currently available GDDRx chips. Nvidia needed to make the 4060s 192-bit to hit the 12GB sweet spot they should have been at, regardless of whether the GPU needed the bandwidth.
Absolutely! I'm saying 256-bit is where the 4070 should be!
 
Well, as far as I am concerned, the 60-series cards were always 1080p cards, and 6GB is enough for 1080p.

But $299 for a 60-series card is what I am not okay with.

Knock $100 off the 4060, 4060 Ti, and 4070 and I am happy.

The 4080 should be $1,000. Only the 4070 Ti pricing makes sense.
 
Well, as far as I am concerned, the 60-series cards were always 1080p cards, and 6GB is enough for 1080p.

But $299 for a 60-series card is what I am not okay with.

Knock $100 off the 4060, 4060 Ti, and 4070 and I am happy.

The 4080 should be $1,000. Only the 4070 Ti pricing makes sense.
6GB is enough for some games, but not all of them. 12GB is not too much to ask for on a $299 GPU in 2023.
 

InvalidError

Titan
Moderator
Well, as far as I am concerned, the 60-series cards were always 1080p cards, and 6GB is enough for 1080p.
When you have a 19 TFLOPS (FP32) GPU, 6GB isn't enough to crank details as high as the GPU should be comfortably capable of at 1080p. It already wasn't enough for the RTX 2060 to do everything it may have been capable of. Heck, even the 1660/Super/Ti got stiffed a bit with only 6GB.
 
When you have a 19 TFLOPS (FP32) GPU, 6GB isn't enough to crank details as high as the GPU should be comfortably capable of at 1080p. It already wasn't enough for the RTX 2060 to do everything it may have been capable of. Heck, even the 1660/Super/Ti got stiffed a bit with only 6GB.
Agreed. The 20 series is where Nvidia started doing things that didn't make sense. Hell, even the naming scheme is dumb. Come up with something new. Meanwhile, Intel calls their parts 13900KS-F-whatever-the-f. Something is terribly wrong, and it's not just computer parts.

Sorry, I've had too much coffee this morning.
 
Yeah, it's fundamentally a problem/choice with the memory bus width. With a 128-bit bus, you can do 8GB or 16GB (the latter via clamshell). Would have been nice if Nvidia had done 128-bit on AD107, 192-bit on AD106, 256-bit on AD104, 320-bit on AD103, and 384-bit on AD102. But it didn't, opting instead to save on costs and reduce the VRAM capacities at basically every level except the RTX 4090.
Exactly this.

The funny part is that Nvidia is making billions in the supercomputer/AI world, and they want to cut the average consumer off at the knees and give us the short end of the stick. Where is all that extra money going? I guess it doesn't matter... and neither do your GPUs.
 
I can't agree with you enough here. I think buyers would have been far more forgiving of prices in the 70/80 class cards too if Nvidia had done this.
The easier thing for Nvidia would have been to price the GPUs more appropriately. The RTX 4080 costs 71.5% more than the RTX 3080 and offers about the same 71.5% increase in performance. That means we didn't get anything new from the 40 series except a more expensive option for more performance. I would have bought a 4080 if it were $699, maybe even $799. Instead, they gave me a 4070 Ti for $799 with less than desirable memory specs. No thanks.
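
Running the numbers with the launch MSRPs (and taking that ~71.5% performance figure as-is):

```python
# Value-per-dollar check: launch MSRPs, with the ~71.5% performance uplift assumed
# from the comparison above rather than measured here.
rtx_3080_msrp = 699
rtx_4080_msrp = 1199
price_increase = rtx_4080_msrp / rtx_3080_msrp - 1   # ~0.715, i.e. 71.5% more money
perf_increase = 0.715                                # ~71.5% more performance (assumed)

# Relative performance per dollar vs. the 3080: ~1.0 means no generational value gain.
perf_per_dollar = (1 + perf_increase) / (1 + price_increase)
print(f"Price increase: {price_increase:.1%}, perf per dollar vs 3080: {perf_per_dollar:.2f}x")
```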

I'm more of the viewpoint that no consumer GPU should cost $1,999. That's prosumer Titan territory pricing, and the flagship fastest gaming GPU on the planet should stick to around $999 or less. Unfortunately, I suppose the value of the USD is going to crap. What's the goal of the Fed, btw? Please remind us. Since the beginning of the Fed, we've gone through the Great Depression and two World Wars and lost our very soul (mind, will, and emotion) to some group of satanic occultists who literally utilize technology to control the human mind. That's just my schizophrenic viewpoint of the matter.
 

atomicWAR

Glorious
Ambassador
The easier thing for Nvidia would have been to price the GPUs more appropriately. The RTX 4080 costs 71.5% more than the RTX 3080 and offers about the same 71.5% increase in performance. That means we didn't get anything new from the 40 series except a more expensive option for more performance. I would have bought a 4080 if it were $699, maybe even $799. Instead, they gave me a 4070 Ti for $799 with less than desirable memory specs. No thanks.

I'm more of the viewpoint that no consumer GPU should cost $1,999. That's prosumer Titan territory pricing, and the flagship fastest gaming GPU on the planet should stick to around $999 or less. Unfortunately, I suppose the value of the USD is going to crap. What's the goal of the Fed, btw? Please remind us. Since the beginning of the Fed, we've gone through the Great Depression and two World Wars and lost our very soul (mind, will, and emotion) to some group of satanic occultists who literally utilize technology to control the human mind. That's just my schizophrenic viewpoint of the matter.
I don't disagree. But if Nvidia was so intent on these prices, whether out of need or want (likely the latter), then at least higher-capacity RAM could have been used as a justification for the price increase. As it stands now, it looks like Nvidia is charging more for less, or at least for less than we expected from a generational leap. But at the end of the day you're not wrong, IMHO.