News Nvidia RTX 4060 Specs Leak Claims Fewer CUDA Cores, VRAM Than RTX 3060


LokkenJP

Distinguished
Jun 17, 2014
27
3
18,545
Well, the GTX/RTX X060 lineup was always meant to be the mid-to-low segment, so I'm perfectly fine with these changes IF it means we are back to having a decent sub-$250 GPU option.

Of course, those changes won't make any sense if we maintain the current pricing tiers, but if the efficiency (FPS per dollar spent) grows substantially over the current 3060, it will be worth the tradeoff.

Right now the low-end segment is almost orphaned of any decent GPU option, even more so in the nVidia corner. Maybe this will change with the 40xx series.
 
I was thinking of maybe getting a 4060 as my 1060 is getting old... but if it's barely above a 3060? It had better be priced the same, because I'd rather save a few bucks and get a 3060 if I get 90% of the performance.

Definitely wait on reviews. The way I'm reading it, the 4060 may be a bit crippled compared even to the 12GB 3060. If that's the case, you might want to see what AMD and Intel have on offer at the time. I think Tom's or someone else had an article up showing the Intel Arc A770 being cut in price to $350 with 16GB of VRAM. Intel seems to be making progress, so their next gen might be a nice improvement.
 
  • Like
Reactions: atomicWAR

edzieba

Distinguished
Jul 13, 2016
434
426
19,060
Just looking at core count (or memory bandwidth or memory capacity) between generations and assuming performance based on "smaller number means worse" is not likely to be accurate, as we see with existing Ada GPUs:
                   3090 Ti      4080        delta
core count         10752        9728        90%
boost clock        1.86 GHz     2.52 GHz    135%
memory capacity    24 GB        16 GB       67%
memory bus width   384-bit      256-bit     67%
memory bandwidth   1008 GB/s    717 GB/s    71%
die area           628 mm^2     379 mm^2    60%
We see the 4080 performing roughly 35% above the 3090 Ti (anywhere from 25% to 50% in some outliers), discounting the effects of DLSS 3, despite the lower core count, lower memory bus width, lower memory bandwidth, and lower memory capacity.
And since the performance delta is close to the clock speed delta, it is a fairly direct example that, going from Ampere to Ada, the same performance is achieved with fewer cores and less memory capacity and bandwidth.
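For anyone who wants to reproduce those deltas, here is a minimal Python sketch using the public spec-sheet numbers quoted above (the ~35% uplift figure is the review average cited in the post, not a new measurement):

```python
# Sanity check of the 3090 Ti vs 4080 deltas from the table above
# (public spec-sheet figures, not my own measurements).
specs = {
    "core count":        (10752, 9728),
    "boost clock (GHz)": (1.86, 2.52),
    "memory (GB)":       (24, 16),
    "bus width (bit)":   (384, 256),
    "bandwidth (GB/s)":  (1008, 717),
    "die area (mm^2)":   (628, 379),
}

for name, (ampere, ada) in specs.items():
    print(f"{name:18s} {ada / ampere:5.0%}")

# The boost-clock delta alone (2.52 / 1.86 ~= 1.35x) roughly matches the
# ~35% average uplift reviews measured, which is the point of the post:
# per clock, Ada delivers the same work with fewer cores and less memory.
```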
 
Just 8GB VRAM in 2023, wow.

A game like Hogwarts Legacy uses more at 1080p.

And with consoles having 16GB shared VRAM, 8GB is a serious problem for any cross platform title.

This card better be around $200 or something, because it's already behind the curve.


It's only a problem if you're an ultra-settings freak.
 
  • Like
Reactions: KyaraM
Just looking at core count (or memory bandwidth or memory capacity) between generations and assuming performance based on "smaller number means worse" is not likely to be accurate, as we see with existing Ada GPUs:
                   3090 Ti      4080        delta
core count         10752        9728        90%
boost clock        1.86 GHz     2.52 GHz    135%
memory capacity    24 GB        16 GB       67%
memory bus width   384-bit      256-bit     67%
memory bandwidth   1008 GB/s    717 GB/s    71%
die area           628 mm^2     379 mm^2    60%
We see the 4080 performing roughly 35% above the 3090 Ti (anywhere from 25% to 50% in some outliers), discounting the effects of DLSS 3, despite the lower core count, lower memory bus width, lower memory bandwidth, and lower memory capacity.
And since the performance delta is close to the clock speed delta, it is a fairly direct example that, going from Ampere to Ada, the same performance is achieved with fewer cores and less memory capacity and bandwidth.
I agree with what you're saying, but this is nVidia, so this is how it'll pan out:

Performance will be higher, but they'll also ask for more money, so the $/frame either stays the same or gets worse. The key thing to note is that they stopped producing the 3060 12GB, so people will only be able to effectively compare it to the "new" 3060 8GB, which is noticeably slower than the 12GB version. That way they'll be able to market it "vs the 3060" with no shame and put it in all the slides.

I'm also thinking they won't be supplying this card to reviewers. I hope I'm wrong, especially on this last point. Alas, I won't give nVidia the benefit of the doubt.

Regards.
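To put numbers on the $/frame point, here is a quick sketch of the math; the prices and frame rates below are made-up placeholders (apart from the 3060's $329 launch MSRP), not leaked specs or benchmark results:

```python
# Hypothetical FPS-per-dollar comparison; the FPS figures and the 4060
# price are placeholders, not leaks or measurements.
cards = {
    "RTX 3060 12GB":        {"price": 329, "avg_fps": 60},
    "RTX 4060 8GB (guess)": {"price": 399, "avg_fps": 75},
}

for name, c in cards.items():
    print(f"{name:22s} {c['avg_fps'] / c['price']:.3f} FPS per dollar")

# ~25% more FPS for ~21% more money moves FPS/$ by only a few percent,
# which is the "value stays the same or barely improves" worry above.
```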
 

logainofhades

Titan
Moderator
Definitely wait on reviews. The way I'm reading it, the 4060 may be a bit crippled compared even to the 12GB 3060. If that's the case, you might want to see what AMD and Intel have on offer at the time. I think Tom's or someone else had an article up showing the Intel Arc A770 being cut in price to $350 with 16GB of VRAM. Intel seems to be making progress, so their next gen might be a nice improvement.

The A750 got a price cut down to $249, and it trades blows with the RX 6600 thanks to new driver updates. Hopefully Intel keeps this going and forces Nvidia to compete on price again.
 
  • Like
Reactions: KyaraM

InvalidError

Titan
Moderator
The A750 got a price cut down to $249, and it trades blows with the RX 6600 thanks to new driver updates. Hopefully Intel keeps this going and forces Nvidia to compete on price again.
One shop in Japan has discounted the A750 down to $150 for a limited time. Wonder if that could be a hint of things to come. I'd definitely buy one at that price.
 
  • Like
Reactions: cyrusfox and KyaraM
Jan 14, 2023
17
11
15
That, plus crippling it with a 192-bit bus, which has limited its potential at 4K. A very deliberate move by NVIDIA to try and push people onto the 4080.
That's avoidable... by not playing games at 4K, though.

While it's obvious GPUs are a rip-off, gamers are the ones who bought into the marketing for higher refresh rates and resolutions (or a combo of both). Running games above 1080p requires a lot of power, and people seemingly only buy hardware if it's winning benchmarks. Even consoles only run games at close to 4K at 30 fps, with performance modes at 60 fps (and a few at 120).

It also appears as though people are buying their PCs mainly to play enhanced console ports, which isn't smart, as those ports usually aren't good. Consoles have been the target for multiplatform games since the 360/PS3 era because those games sell like garbage on PC compared to the console versions. PC gamers generally don't want to pay full console price for games either. We're at a point where people are praising DLSS, an upscaling tech, when a few years ago PC gamers were mocking how consoles had to use tricks like checkerboarding and resolution scaling for 4K. Suddenly it's a must-have feature because it's now the only way to get playable FPS with RTX and higher resolutions in console ports (imagine that!).

If people were sensible, they would choose 1080p/144Hz or 1440p/60Hz.
 
  • Like
Reactions: KyaraM

logainofhades

Titan
Moderator
If people were sensible, they would choose 1080p/144Hz or 1440p/60Hz.

I would argue that a 1440p, 144Hz monitor is the sweet spot, as long as you get one with G-Sync/FreeSync support. I could never go back to 1080p after having been on 1440p for so long. I started out with a 60Hz Lenovo, and now I have 170Hz Acer Nitros with FreeSync Premium.
 

razor512

Distinguished
Jun 16, 2007
2,134
71
19,890
One issue with getting decent 1440p displays is that many are too big for the resolution.

For example, if you are a little nearsighted but can still see a decent amount of detail, 2560x1440 at 27 inches can often cause headaches, especially when doing detail-oriented work such as digital art or photo editing, because the pixel grid becomes distracting; it is like using a VR headset that is too low resolution.

Ideally, I would like two or three 24-inch 1440p displays with decent refresh rates and decent DCI-P3 coverage, but no one seems to make one.
 

AgentBirdnest

Respectable
Jun 8, 2022
271
269
2,370
I would argue that a 1440p, 144Hz monitor is the sweet spot, as long as you get one with G-Sync/FreeSync support. I could never go back to 1080p after having been on 1440p for so long. I started out with a 60Hz Lenovo, and now I have 170Hz Acer Nitros with FreeSync Premium.
I could never go back to 1080p after only about 12 minutes with my 1440p monitor. :-D
I don't have a problem going back to 60 FPS, or even ~48. But I can't go below 1440p again.
 

KyaraM

Admirable
Just 8GB VRAM in 2023, wow.

A game like Hogwarts Legacy uses more at 1080p.

And with consoles having 16GB shared VRAM, 8GB is a serious problem for any cross platform title.

This card better be around $200 or something, because it's already behind the curve.


My 3070Ti sits at 88% VRAM usage (7.5GB) on average at 1440p, with core utilization around 97% on average. Ultra settings, DLSS if I feel like it, but often not. Besides, that game is so stupidly unoptimized that it even eats over 20GB of system RAM. There is nothing in the game that should use all that hardware; something went majorly wrong in the code.

You are wrong again. It is used to render frames, and it does impact in-game FPS.

That 3060 12GB is beating the 3080 10GB and the 3070 8GB in-game, at 1080p, with higher FPS.

I suggest you stop now and admit your mistake before digging deeper.

(source: Hardware Unboxed)

My 3070Ti averages around 60 FPS with Ultra settings and RT Ultra in Hogsmeade. The worst drop was to 27 FPS, and that happened exactly once so far; otherwise it has never gone below maybe 37. Something is very wrong with HUB's data there, or the community patch does more than smooth out lag, which again implies bad programming, not VRAM actually being an issue. A friend with a 3060Ti reports similar findings. Even considering DLSS Quality at 1440p, since it renders at a lower internal resolution with that setting. Btw, in most places I can get up to 90 FPS even with RT.
Everything I have seen so far points to bad programming.

EDIT: The mod is explicitly for RT off and mostly improves frame times, so it shouldn't be why my card is faster than in the benchmark under the same circumstances.
 
Last edited:
  • Like
Reactions: AgentBirdnest

evdjj3j

Honorable
Aug 4, 2017
315
325
11,060
You are wrong again. It is used to render frames, and it does impact in-game FPS.

That 3060 12GB is beating the 3080 10GB and the 3070 8GB in-game, at 1080p, with higher FPS.

I suggest you stop now and admit your mistake before digging deeper.

(source: Hardware Unboxed)

Seems like RAM may not be the issue. The RX 6650 XT outperforms the 3080 but only has 8GB of RAM.
 

KyaraM

Admirable
Seems like RAM may not be the issue. The RX 6650 XT outperforms the 3080 but only has 8GB of RAM.
It isn't. See my post above; I get vastly higher FPS than HUB on my 3070Ti with everything, including RT, on Ultra. There is something weird going on with that test system, or the community patch vastly increases performance somehow instead of just fixing stutters. Also, I have seen a YouTuber who demonstrated that his 4090 outperformed his 7900XTX with Ultra settings, while in the quoted benchmark the AMD card is faster. The CPU was the same. Something is fishy about that benchmark. Given when it was released, it might be missing the Day 1 patch, for example.
 
  • Like
Reactions: Roland Of Gilead

AgentBirdnest

Respectable
Jun 8, 2022
271
269
2,370
My 3070Ti sits at 88% VRAM usage (7.5GB) on average at 1440p, with core utilization around 97% on average. Ultra settings, DLSS if I feel like it, but often not. Besides, that game is so stupidly unoptimized that it even eats over 20GB of system RAM. There is nothing in the game that should use all that hardware; something went majorly wrong in the code.

My 3070Ti averages around 60 FPS with Ultra settings and RT Ultra in Hogsmeade. The worst drop was to 27 FPS, and that happened exactly once so far; otherwise it has never gone below maybe 37. Something is very wrong with HUB's data there, or the community patch does more than smooth out lag, which again implies bad programming, not VRAM actually being an issue. A friend with a 3060Ti reports similar findings. Even considering DLSS Quality at 1440p, since it renders at a lower internal resolution with that setting. Btw, in most places I can get up to 90 FPS even with RT.
Everything I have seen so far points to bad programming.
Thanks for sharing your numbers! : )


I said this about DirectStorage in Forspoken, and I'll say it about VRAM in Hogwarts: this is ONE very badly optimized game. Let's wait for some more "new gen" games, and have a sample size of more than one or two (and from multiple sources), before making any definitive declarations about VRAM.
 
  • Like
Reactions: KyaraM

Geezer760

Distinguished
Aug 29, 2009
219
108
18,870
Why wasn't this discussed before all of this? That way people wouldn't have wasted their money on these scam cards. And honestly, what is the point of these high-end cards when, shopping for games on Steam, there is nothing but low-end trash being created these days, or a bunch of unfinished games filled with bugs that aren't even worth $20-$60? I'd rather just keep playing Skyrim and The Witcher 3. Game creators are getting very lazy with story writing, the games are filled with BUGS and never get patched or finished; you may be better off buying an Xbox than spending $600-$2000 on a graphics card.
 
Why wasn't this discussed before all of this? That way people wouldn't have wasted their money on these scam cards. And honestly, what is the point of these high-end cards when, shopping for games on Steam, there is nothing but low-end trash being created these days, or a bunch of unfinished games filled with bugs that aren't even worth $20-$60? I'd rather just keep playing Skyrim and The Witcher 3. Game creators are getting very lazy with story writing, the games are filled with BUGS and never get patched or finished; you may be better off buying an Xbox than spending $600-$2000 on a graphics card.
What exactly? Sorry, it wasn't clear to me.

There have been a lot of people being very vocal about nVidia not giving enough VRAM to very expensive models for a good while now. Their knee-jerk reaction was the 3060 12GB, but they've corrected that "mistake" with the 3060 8GB, lel.

Plus, and sorry to say, there's an argument to be made about people actually buying those cards even with a lot of people shouting "not enough VRAM", only to realize those complaints were right within just one generation of games. Well, the wizard kid game is probably the first "mainstream" entry to make it a thing. Previously it was via mods or those "mega texture packs" from the devs themselves to make 4K crispy.

Regards.
 
  • Like
Reactions: neblogai

oofdragon

Honorable
Oct 14, 2017
237
233
10,960
On the flip side, developers can keep optimizing their games to run perfectly fine with the best graphics at 1080p/60fps on an RTX 2060, because GPU performance is still the same compared to "future" generations, lol.
 

KyaraM

Admirable
Thanks for sharing your numbers! : )


I said this about DirectStorage in Forspoken, and I'll say it about VRAM in Hogwarts: this is ONE very badly optimized game. Let's wait for some more "new gen" games, and have a sample size of more than one or two (and from multiple sources), before making any definitive declarations about VRAM.
Gladly.
And thank you for actually reading, not just ignoring. Btw, I tried running around Hogsmeade with DLSS off, so native 1440p. I get an average of 31 FPS with everything, including RT, set to Ultra. So there is definitely something off about that benchmark HUB did. Very weirdly, I get more FPS during the day than at night. Huh... the only annoying part is the ever-present FPS drops you get with every card. During the day it's 36 FPS and far fewer drops. The only explanation I have is the weather...
 
  • Like
Reactions: AgentBirdnest

cyrusfox

Distinguished
One shop in Japan has discounted the A750 down to $150 for a limited time.
Sign me up! At that discount I would get two. I have been holding out for the A770 (for the 16GB of VRAM), but I am looking for the sub-$300 mark on it.
At $150, though, I wouldn't be picky, and realistically the A750 is perfectly serviceable for my workload (media/graphics/encoding, and occasional ancient gaming/emulation). I am personally glad they fixed DX9; it's good for Vulkan and a great fit for Intel.