News Nvidia Reveals RTX 4060 Ti, 4060 with Prices Starting at $299


ottonis

Reputable
Jun 10, 2020
224
193
4,760
Maybe we should, you know, wait for reviews?
Of course, as always.
Obviously, the performance figures presented by Nvidia are achieved in best-case scenarios and may not represent the entire truth.
That being said: the combination of performance, efficiency and price of the 4060 seems just right to me. Even if independent testing reveals that this GPU shows a smaller performance increase over its predecessor on average, it will still be in the right category for me, unless independent testing reveals some very bad, disruptive surprises.
 

YSCCC

Respectable
Dec 10, 2022
588
478
2,260
Of course, as always.
Obviously, the performance figures presented by Nvidia are achieved in best-case scenarios and may not represent the entire truth.
That being said: the combination of performance, efficiency and price of the 4060 seems just right to me. Even if independent testing reveals that this GPU shows a smaller performance increase over its predecessor on average, it will still be in the right category for me, unless independent testing reveals some very bad, disruptive surprises.
I would say IF it had this cost/performance at the beginning of the 4000 series, it would be a fair price. The problem now is that those who got hyped over the new gen's performance and features have cooled down sufficiently, and since VRAM limits most of the lineup's performance, there's no real urge to upgrade unless you step into the $1,000+ category. So Nvidia won't see a massive sales increase; it will be more like how the GTX 1660 went: budget gamers who really need a card to replace their old stuff will buy it every now and then, with no sell-outs and no hype. As someone whose PC broke during the pandemic and who broke the bank once for the almost double-priced 3070 Ti, there is absolutely nothing tempting me to shell out a large chunk of money again, and I believe a lot of gamers will think the same. When the last gen is already limiting performance, more often it's the VRAM rather than raw GPU power, so upping to the 40 series just doesn't make sense.
 
  • Like
Reactions: SSGBryan

sherhi

Distinguished
Apr 17, 2015
80
52
18,610
PC GPUs lack VRAM. Period.

The PlayStation went from 8GB to 16GB of VRAM with the PS5. This has huge consequences for the size of the assets developers create.

Consoles are a $60 billion gaming market.
PC is a $34 billion gaming market.

Games are designed for consoles. Consoles set the technological bar, whether you like it or not. And 8GB PC GPUs are going to struggle for a whole console generation.

Game developers willing to redo all their assets for a PC port for people with 8GB GPUs are few and far between. 2023 is just the beginning of a new generation of games using lots of high-memory assets.

8GB PC GPUs are already struggling to handle console ports, and PS5 Pro dev kits with even more memory bandwidth are already in circulation.

No one should be buying an 8GB GPU at this point. No one; they are already outdated.


100% agree.
You know it, I know it... it seems the majority of content creators know it... maybe just some leftover sponsored news outlets remain to be taught by... time. And time is starting to show the results: first someone on YouTube expresses doubt and runs some tests: it's on the edge! But barely okay for the games of that time... and everyone slams that guy for how stupid he is. Then some 5 months pass, 2-3 new games come out, and the truth comes out with them.

I am still surprised the PCMR crowd never looks at consoles, maybe because consoles used to have outdated hardware. Now we see, for example, Digital Foundry showing more and more games not utilizing all CPU cores at launch, and it's just a matter of time before new games fully utilize those 8 cores, all while most people still say 6 cores are just fine for a new PC... yeah, for now maybe, but who buys a CPU for only 2-3 years?
 

Joseph_138

Distinguished
I don't like this trend I've been seeing where they think that additional cache memory allows them to cheap out on memory controller bandwidth. A 4060 Ti trying to push 16GB of texture memory through a 128-bit bus is going to be painfully slow. Look how crippled the 3060 8GB is compared to the 12GB version. That's not because it has less memory; it's because it uses the same 128-bit memory bus as the 3050 in place of the 192-bit bus on the 12GB card. The odds of having the information you need from a 16GB texture buffer ready to go in 32MB of cache are going to be low. You'll be going to texture memory a lot more than Nvidia would have you believe. That effective memory bandwidth number is worthless, because it only applies to each successful cache hit. The rest of the time, the lower number applies.
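
As a rough sketch of that "odds" argument (the numbers below are illustrative assumptions, not measured figures): for a purely random access pattern, the chance that any given request hits a 32MB cache is just cache size divided by working-set size, so the hit rates Nvidia quotes depend entirely on how much locality real game workloads have.

```python
# Chance that a random access into a large working set lands in a 32MB cache,
# and the rough bandwidth that implies when every miss goes to the 128-bit bus.
CACHE_MB = 32
VRAM_GBPS = 288.0  # RTX 4060 Ti: 128-bit bus at 18 Gbps GDDR6

def random_hit_rate(working_set_gb):
    return (CACHE_MB / 1024) / working_set_gb

def effective_gbps(hit_rate):
    # Treat cache hits as nearly free: only misses consume VRAM bandwidth.
    return VRAM_GBPS / (1.0 - hit_rate)

for ws in (0.5, 4, 16):  # GB of assets touched, purely hypothetical sizes
    h = random_hit_rate(ws)
    print(f"{ws:>4} GB working set: hit rate {h:.2%}, ~{effective_gbps(h):.0f} GB/s effective")
```

With no locality at all, the cache barely moves the needle; the quoted "effective bandwidth" gains only materialize when the access pattern keeps revisiting the same small slice of memory.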
 
Last edited:

ottonis

Reputable
Jun 10, 2020
224
193
4,760
I would say IF it had this cost/performance at the beginning of the 4000 series, it would be a fair price. The problem now is that those who got hyped over the new gen's performance and features have cooled down sufficiently, and since VRAM limits most of the lineup's performance, there's no real urge to upgrade unless you step into the $1,000+ category. So Nvidia won't see a massive sales increase; it will be more like how the GTX 1660 went: budget gamers who really need a card to replace their old stuff will buy it every now and then, with no sell-outs and no hype. As someone whose PC broke during the pandemic and who broke the bank once for the almost double-priced 3070 Ti, there is absolutely nothing tempting me to shell out a large chunk of money again, and I believe a lot of gamers will think the same. When the last gen is already limiting performance, more often it's the VRAM rather than raw GPU power, so upping to the 40 series just doesn't make sense.
I agree that there's little reason to upgrade from last Gen 3060 to 4060.
Even the 2060 (Super) seems good enough for me not to upgrade right now.
However, people still playing games on a 3GB 1060 will definitely see a substantial improvement.
 

Joseph_138

Distinguished
You know it, I know it... it seems the majority of content creators know it... maybe just some leftover sponsored news outlets remain to be taught by... time. And time is starting to show the results: first someone on YouTube expresses doubt and runs some tests: it's on the edge! But barely okay for the games of that time... and everyone slams that guy for how stupid he is. Then some 5 months pass, 2-3 new games come out, and the truth comes out with them.

I am still surprised the PCMR crowd never looks at consoles, maybe because consoles used to have outdated hardware. Now we see, for example, Digital Foundry showing more and more games not utilizing all CPU cores at launch, and it's just a matter of time before new games fully utilize those 8 cores, all while most people still say 6 cores are just fine for a new PC... yeah, for now maybe, but who buys a CPU for only 2-3 years?
When a console is new, it is current, but as it ages, it quickly becomes outdated. PCs are able to move with the latest technology; consoles can't, because they are a closed architecture. The hardware a console comes with when you buy it is all it's ever going to have. Nothing is upgradeable. Five or six years from now, you're going to be looking at the quality of the games on your console, comparing them to the latest PC versions, and thinking your console is crap.
 
  • Like
Reactions: atomicWAR

ilukey77

Reputable
Jan 30, 2021
833
339
5,290
I don't like this trend I've been seeing where they think that additional cache memory allows them to cheap out on memory controller bandwidth. A 4060 Ti trying to push 16GB of texture memory through a 128-bit bus is going to be painfully slow. Look how crippled the 3060 8GB is compared to the 12GB version. That's not because it has less memory; it's because it uses the same 128-bit memory bus as the 3050 in place of the 192-bit bus on the 12GB card. The odds of having the information you need from a 16GB texture buffer ready to go in 32MB of cache are going to be low. You'll be going to texture memory a lot more than Nvidia would have you believe. That effective memory bandwidth number is worthless, because it only applies to each successful cache hit. The rest of the time, the lower number applies.
So basically, the 16GB will be worthless over the 8GB!!
 

oofdragon

Distinguished
Oct 14, 2017
327
292
19,060
Instead of a new gen, Ada's cheaper cards turned out to be a new gimmick: fake frames and fake RT all around. Oh well... so to wrap it up, we are looking at the aging 2060 Super as still the best $200 choice for NV people, and the 6600 XT for AMD... but looking at eBay I'm already spotting some $250 6700 XT 12GB cards, and that's a winner, no contest. A $400 RX 6800 16GB > RTX 4060 Ti 16GB, fake frames can kiss my sss, and on top of that there's the $450 6800 XT, which plays anything 4K maxed out at 60fps and handles high-refresh gaming at lower resolutions like 1080p. At $600 there's the 6950 XT for high-refresh 1440p, and now at almost $900 there's the 7900 XTX for OLED monitors, no contest against the 4080: higher performance at a much lower price. And finally, the 4090 is still king, but for $1,600.

2023 chart
60fps 1080p = $250 6700XT12GB/$250 3060 12GB
144fps 1080p = $450 6800XT16GB/$475 3080 10GB
60fps 4K = $450 6800XT16GB/$600 4070 12GB
144fps 1440p = $600 6950 16GB/$800 4070Ti 12GB
165fps 3440p = $950 7900XTX 24GB
240fps 3440p = $1600 4090 24GB
144fps 4K = $1600 4090 24GB

If Minecraft is a big deal for you, like if you are building a PC for a child, go NV. If it's not, go AMD. Simple choice.

6700 XT = $250 RTX 4060Ti 12GB
6800 XT = $450 RTX 4070 16GB
6950 XT = $600 RTX 4070 Ti 16GB
7900 XTX = $950 RTX 4080 24GB
 
Last edited:
  • Like
Reactions: Amdlova

Loadedaxe

Distinguished
Jul 30, 2016
219
144
18,790
Those complaining about the VRAM on these new cards should really take a step back, consider other sources, and read reviews of "other" video cards.
Nvidia is not the only video card manufacturer, and I bet some of the other cards on the market right now may be better performers and even cheaper.
Let's wait for Jarred to get his hands on one and go from there.
 

sitehostplus

Honorable
Jan 6, 2018
404
163
10,870
Sure, if you want a 300W card instead of a 200W card and don't care about ray tracing or DLSS. Same old story from AMD. I expect the 6800 XT will win by ~20% (give or take) in rasterization, lose by ~15% in ray tracing, and lose by ~30% in games that support RT and DLSS (and not FSR2 — still a 15% loss if a game also supports FSR2, i.e. CP77).
So, in other words, nVidia is going to stomp the crap out of AMD again.

This is why nVidia can get away with requiring your soul to get either a 4080 or 4090. AMD is clearly not competitive, and clearly does not want to be.

I can almost guarantee you that if AMD or Intel were competitive, nVidia would magically find a way to lower video card prices by $800 overnight.
 

sitehostplus

Honorable
Jan 6, 2018
404
163
10,870
When a console is new, it is current, but as it ages, it quickly becomes outdated. PCs are able to move with the latest technology; consoles can't, because they are a closed architecture. The hardware a console comes with when you buy it is all it's ever going to have. Nothing is upgradeable. Five or six years from now, you're going to be looking at the quality of the games on your console, comparing them to the latest PC versions, and thinking your console is crap.
It's also one heck of a lot cheaper.

This should refute nVidia's claims about needing to charge so much for a video card. They simply do it because they can. And who can blame them?
 
Is it just me, or does the baseline 4060 have lower specs than the 3060 aside from its clock speeds and TFLOPS? SMs, tensor cores, shaders, and RT cores are all fewer than the 3060's... yes, the higher clock will help, but only so much, and without DLSS 3 (which is most things, as it has little support at the moment, like gen-1 ray tracing did) it actually looks worse than the last-gen card...
 
  • Like
Reactions: SSGBryan

qwertymac93

Distinguished
Apr 27, 2008
118
59
18,760
Running defense for Nvidia's decision to charge an entire $100 for 8GB more VRAM is completely unnecessary. The "clamshell mode" VRAM is not nearly as big a deal as the article makes it out to be; in fact, doubled-up VRAM used to be the norm. Going way back to the 8800 GT, you had the 256MB and "clamshell" 512MB versions that were only separated by ~$35. This very article shows the 3GB and 6GB 1060 cards being only $50 apart.

Nvidia didn't want there to be too big a gap in its product stack, so a mere $50-$75 premium over the 8GB model just wouldn't do! Had to do the full hundo to get that nice uniformity in pricing.

Is it just me, or does the baseline 4060 have lower specs than the 3060 aside from its clock speeds and TFLOPS? SMs, tensor cores, shaders, and RT cores are all fewer than the 3060's... yes, the higher clock will help, but only so much, and without DLSS 3 (which is most things, as it has little support at the moment, like gen-1 ray tracing did) it actually looks worse than the last-gen card...

Yeah we're pretty much guaranteed to see some games where the 3060 outperforms the 4060 without the DLSS3 handicap in play. Games that rely on memory bandwidth (and capacity) could show some serious gaps. The 4060 is a 3050 competitor.
 
Last edited:
May 19, 2023
14
10
15
Running defense for Nvidia's decision to charge an entire $100 for 8GB more VRAM is completely unnecessary. ...
Sure, and the same holds true for complaining about this, about the prices and names of the products, about their specs, etc.

The recent absurd amount of whining about the 4070 is laughable, given that the 4070 is now the best-selling card on the market.

What matters are the sales numbers. Only that.

Does anybody really believe that the 4060 for $299 will not sell well?

Customers will have a chance to SHOW whether an extra 8GB of VRAM is worth $100 or not.
 
Last edited:
Perhaps I should've said it is marketing BS. AFAIK, AMD never came up with an "effective bandwidth" number to hide the fact that they trimmed the bus width in half. Instead, they called it what it was: a cache in front of the framebuffer. It is a clever idea, and that is why Nvidia is using it, but it does have compromises.

The RX 6600 XT had a refined architecture and a massive boost in core clock speed to keep up with the 5700 XT, on top of the new Infinity Cache. It looks like Nvidia is doing something similar with the 4000 series.

I am far from the only one who is critical of the 4000 series. I think a lot of it has to do with naming and pricing. Nvidia is trying to sell DLSS 3 hard and I am not sure many people are buying it. It looks like a very modest upgrade at best compared to the 3060 Ti without the new features. Time will tell.

So, again, I should have said BS marketing slides. Sort of like when they doubled the CUDA cores on slides, which wasn't quite right either.
AMD has absolutely given "effective bandwidth" numbers on RDNA 2/3 chips. And again, it's not just marketing, it's engineering. When people look at specs and see a drop in bandwidth, they get worried. Looking only at bus width or bandwidth is as misguided as looking only at theoretical teraflops.

The RX 6600 XT has fewer cores at higher clocks to get 10.6 TFLOPS, and RX 5700 XT has 9.8 TFLOPS. The point isn't that they have similar compute, it's that the 6600 XT has 256 GB/s of bandwidth while the 5700 XT has 448 GB/s. How can it deliver similar performance with 43% less bandwidth? Infinity Cache. How does Nvidia deliver a big generational boost in performance with the RTX 4090 over the RTX 3090 Ti, even though they have the same GDDR6X configuration and bandwidth? With a much bigger L2 cache.

You can call BS on Nvidia's pricing. You can question how good DLSS 3 Frame Generation really is. You can complain about the lack of VRAM capacity. But the "effective memory bandwidth" figures are probably the least problematic aspect of the GPUs. The only real issue is that getting more effective bandwidth from a narrower bus means it's possible to end up with less VRAM because there aren't as many memory channels to go around.
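
To make the arithmetic concrete, here's a minimal sketch of the simplest possible model (it assumes cache hits consume no VRAM bandwidth at all, which is an idealization): if a fraction h of requests is absorbed by the cache, only (1 - h) of the traffic reaches VRAM, so effective bandwidth is roughly raw bandwidth divided by (1 - h). You can also run it backwards to see what hit rate a quoted "effective" figure implies.

```python
# Simplest effective-bandwidth model: only cache misses consume VRAM bandwidth.
def effective_bandwidth(raw_gbps, hit_rate):
    return raw_gbps / (1.0 - hit_rate)

def implied_hit_rate(raw_gbps, effective_gbps):
    return 1.0 - raw_gbps / effective_gbps

# RX 6600 XT (256 GB/s raw) matching the RX 5700 XT's 448 GB/s implies ~43%:
print(implied_hit_rate(256.0, 448.0))   # ~0.43

# RTX 4060 Ti: 288 GB/s raw vs the quoted ~554 GB/s effective implies ~48%:
print(implied_hit_rate(288.0, 554.0))   # ~0.48
```

The real behavior is messier (hit rates swing with resolution and workload), but it shows why a big cache can make a narrow bus act like a much wider one on average.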

Proof: [attached AMD product-spec screenshots listing "effective bandwidth" figures]

Honestly, I appreciate having the "effective bandwidth" data. It's AMD and Nvidia saying, in effect, this is the average hit rate of our L3/L2 caches. AMD didn't publish effective bandwidth data on the earlier RDNA 2 GPUs. Actually, it's a bit hit and miss right now. RX 6500 XT, RX 6700 10GB, and RX 6700 XT list effective bandwidth (along with the above five cards). RX 6900 XT, RX 6800 XT, RX 6800, RX 6600 XT, RX 6600, and RX 6400 do not.
 

Running defense for Nvidia's decision to charge an entire $100 for 8GB more VRAM is completely unnecessary. The "clamshell mode" VRAM is not nearly as big a deal as the article makes it out to be; in fact, doubled-up VRAM used to be the norm. Going way back to the 8800 GT, you had the 256MB and "clamshell" 512MB versions that were only separated by ~$35. This very article shows the 3GB and 6GB 1060 cards being only $50 apart.

Nvidia didn't want there to be too big a gap in its product stack, so a mere $50-$75 premium over the 8GB model just wouldn't do! Had to do the full hundo to get that nice uniformity in pricing.
Higher-speed VRAM makes it more difficult to do stuff like clamshell memory designs. The fact is that since the GTX 600 series, there have been very few GPUs with memory on both sides of the PCB. Offhand, I couldn't even name the last time we had an AMD consumer graphics card with VRAM on both sides.

We'll just have to wait and see if AMD does any "clamshell" offerings I guess. Because we don't know for sure exactly how much it increases the BOM. But like I said, a $50 increase in BOM translates to $100 retail in general. Was it a $50 increase, or only a $30 increase? I couldn't say. But there's also market differentiation and economics, so if Nvidia thinks it can charge $100 extra and people will pay it, the smart business decision is to charge $100 extra.

Keep in mind that the double VRAM will probably be most beneficial for AI workloads. There are things that you simply can't do in AI with less VRAM. Several tools (Whisper and Text Generation) end up VRAM and CPU constrained rather than GPU constrained. There will absolutely be some AI people that want more than 8GB but don't want to pay the $1200 for a 16GB 4080 or $1600 for a 24GB 4090 — especially if they can make do with a $500 4060 Ti 16GB.
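
As a rough illustration of why the extra capacity matters for AI work (the model sizes below are generic examples, not figures from this article): the weights alone of a 7-billion-parameter model in FP16 take about 14GB, which overflows an 8GB card but fits comfortably in 16GB.

```python
# Back-of-envelope VRAM needed just to hold model weights on the GPU.
# Activations, KV cache, and framework overhead all add more on top of this.
def weights_vram_gb(num_params, bytes_per_param):
    return num_params * bytes_per_param / 1e9

print(weights_vram_gb(7e9, 2))    # 14.0 -> FP16 7B model: too big for 8GB, fits in 16GB
print(weights_vram_gb(7e9, 1))    # 7.0  -> 8-bit quantized 7B: just squeezes into 8GB
print(weights_vram_gb(13e9, 2))   # 26.0 -> FP16 13B model: needs quantization even at 24GB
```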
 
  • Like
Reactions: adbatista

YSCCC

Respectable
Dec 10, 2022
588
478
2,260
When a console is new, it is current, but as it ages, it quickly becomes outdated. PCs are able to move with the latest technology; consoles can't, because they are a closed architecture. The hardware a console comes with when you buy it is all it's ever going to have. Nothing is upgradeable. Five or six years from now, you're going to be looking at the quality of the games on your console, comparing them to the latest PC versions, and thinking your console is crap.
Yeah, but a whole console now costs less than a decent graphics card does, and for a lot of people the general use of a PC has basically been replaced by smartphones or tablets, so the PC doesn't need to be replaced that often for most work. For gaming, by the time the console hardware is obsolete, they release the next gen, which you can then upgrade to; this works for most people. Upgrading a PC, with a system paired to match the graphics card, 5 years on likely costs as much as 5-6 consoles altogether.
I agree that there's little reason to upgrade from last Gen 3060 to 4060.
Even the 2060 (Super) seems good enough for me not to upgrade right now.
However, people still playing games on a 3GB 1060 will definitely see a substantial improvement.
But then, if they've held onto a 3GB 1060 this long, they likely don't have the budget for the 4060 Ti either; you need a newer CPU and PCIe platform to unleash it, definitely not one from the 1060 era. And even considering that, the market will be tiny for Nvidia. It just looks like karma for them this gen: bad pricing ruining what could have been a big upgrade after 3 years of last-gen products.
 

lsorice

Distinguished
Oct 27, 2013
12
1
18,515
3060 had 12GB for $299.
4060 only has 8GB for $299.

At $499, the 16GB 4060 Ti is out of reach for most consumers.

Looks like PC gaming is set for another year of struggling to run console ports.
The 3060's MSRP was $329, and it was nearly impossible to find at that price. It also had 8GB and 12GB variants.