News Nvidia RTX 4060 Specs Leak Claims Fewer CUDA Cores, Less VRAM Than RTX 3060

""The GeForce RTX 4060 may retain a 192-bit memory bus, which has been the norm since the GTX 1060.""

That seems unlikely. The GPU will sport a 128-bit bus interface for sure, same as the rumored RTX 4060 Ti SKU (also because it will come in an 8GB VRAM flavor). That gives the card 288 GB/s of bandwidth, all within a 115W reference TDP.
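For what it's worth, the 288 GB/s figure is just the standard bandwidth formula at work. A quick sanity check in Python (the 18 Gbps GDDR6 data rate is inferred from the 288 GB/s claim, not confirmed):

```
# Peak memory bandwidth = (bus width in bytes) x (effective data rate per pin)
bus_width_bits = 128    # rumored RTX 4060 bus width
data_rate_gbps = 18     # inferred GDDR6 effective rate, Gbps per pin

bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 288 GB/s
```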
 

cyrusfox

Distinguished
14% fewer cores, 2/3 the VRAM (12GB was generous on the 3060, though). This is what comes to mind:
[meme image]
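Putting rough numbers on that (the 4060 figures are from the leak, not confirmed; the 3060's 3584 CUDA cores are the shipping spec):

```
# Leaked RTX 4060 specs vs. the RTX 3060 -- rumored figures, not confirmed
cores_3060, cores_4060 = 3584, 3072
vram_3060, vram_4060 = 12, 8  # GB

print(f"cores: {1 - cores_4060 / cores_3060:.1%} fewer")    # -> 14.3% fewer
print(f"vram:  {vram_4060 / vram_3060:.0%} of the 3060's")  # -> 67%, i.e. 2/3
```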
 
Seems like an odd choice to make an underpowered 4060.
On the other hand, releasing a 4050 with the mobile GPU could be a really great option. I'd love to have a relatively inexpensive single-slot card that I could add to some of the underpowered workstations around my workplace. The only options for that are either insanely expensive or very old...
 

AgentBirdnest

Respectable
Less VRAM isn't really surprising. The 3060's 12GB was an interesting, somewhat bizarre choice, considering that the 3070 and OG 3080 only had 8 and 10 GB, respectively.

If it's true that it has 14% fewer cores than the 3060, though... dang, that's just sad. :-/ Higher clocks will probably make it perform similarly, but not at a cheaper price.

It feels like the entire RTX 40 series will be a mulligan. I hope it will be, and that Nvidia learns a lesson and gets back to actual progress next year, but I'm not gonna hold my breath.
 

InvalidError

Titan
Moderator
A game like Hogwarts Legacy uses more at 1080p.
Only if you push it to 1440p Ultra RT, where the RTX 3060 only does 26 FPS on average, which isn't really playable. If you lower details enough to get a 50+ FPS average, then 8GB GPUs like the 3070 Ti beat the 12GB RTX 3060, and VRAM is no longer the main limiting factor.

Seems like 8GB is just about right for the amount of detail that can be pushed with the 3060's compute power.

The worst part about the RTX 4060 will likely be its MSRP. I wouldn't be surprised if it were $500, which would be absolutely atrocious value per dollar for what should be 50-tier hardware.
 

PlaneInTheSky

Commendable
BANNED
How much VRAM usage a game or hardware monitoring software reports has no direct correlation with how much of the VRAM's buffer is actually used to render frames.

You are wrong again. It is used to render frames, and it does impact in-game FPS.

The 3060 12GB is beating the 3080 10GB and the 3070 8GB in-game at 1080p, with higher FPS.

I suggest you stop now and admit your mistake before digging deeper.

(source: Hardware Unboxed)

[chart: Hardware Unboxed 1080p benchmark results]
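Side note on methodology: most overlays report the allocated VRAM that NVML exposes, not the per-frame working set, which is part of why the two sides here can point at different numbers. A minimal query sketch in Python, assuming the pynvml package is installed:

```
# Reads device-wide VRAM allocation via NVML -- what most monitoring
# tools report. Allocated memory is not the same as the per-frame
# working set: a game can reserve more VRAM than it touches each frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"used: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```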
 

neblogai

Distinguished
The 3060 12GB is beating the 3080 10GB and the 3070 8GB in-game at 1080p, with higher FPS.

That FPS count just means you need to turn off RT or lower other settings, as this tier of GPU could not run well at these settings at good FPS even if it had enough VRAM (as the 3060 12GB shows). And then, without RT/Ultra settings, 8GB will be fine, even in such an outlier title as HL.
 

neblogai

Distinguished
8GB, 128-bit, less hardware, higher price. What could be worse? Waiting for the 4050 Ti 64-bit edition.

The 4060 Ti is worse: it is a more expensive and more capable GPU, but still with 8GB VRAM. The 4070 Ti 12GB at more than double the price of this 4060 is very questionable too. But 8GB for AD107 cards, as well as for Navi 23 and Navi 33 cards, is fine I think, given their lower price level and 1080p rendering focus.
 

Amdlova

Distinguished
The 4060 Ti is worse: it is a more expensive and more capable GPU, but still with 8GB VRAM. The 4070 Ti 12GB at more than double the price of this 4060 is very questionable too. But 8GB for AD107 cards, as well as for Navi 23 and Navi 33 cards, is fine I think, given their lower price level and 1080p rendering focus.
Still waiting for a proper 1080p graphics card. A little over a decade ago I was gaming on a 3870, 4870, 5870, 6970, 7870... gaming with 512MB was a long time ago, but now low-end graphics cards cost as much as enthusiast cards and still can't play a game at 1080p. I got a 1650 just for display output. What kind of card will I need tomorrow to play at 1080p, a 6060 Ti? lol
 
That FPS count just means you need to turn off RT or lower other settings, as this tier of GPU could not run well at these settings at good FPS even if it had enough VRAM (as the 3060 12GB shows). And then, without RT/Ultra settings, 8GB will be fine, even in such an outlier title as HL.


Regardless, it's a sign of the times. We PC gamers are a crusty old bunch, but we need to admit console specs have a huge impact on game design. 12GB may very well be the new recommended spec. Myself? I will not be purchasing any GPU with less than 16GB VRAM for this gen, if only for longevity purposes. It did me well with Pascal (8GB), and it will do me well this gen, whether it's AMD or Nvidia. Yes, there is more to the equation, but VRAM is a pretty big variable in that equation, and it's a one-time purchase.
 
Wow on the 3080 numbers. I bought one a month or two ago thinking it would last a couple of years or so. Guess I should have gotten the Xbox Series X or PS5 instead.

They'll charge more for the 4060 than the previous generation even though the 4060 may very well be a slower card. They'll charge more because the 4000 series has DLSS 3, which gives fake performance gains, and you won't be able to get DLSS 3 on the 3000 series cards they've been happy to keep selling to clear out their stock. Good move, Nvidia.

Should have held out for a 6800 XT or 6900 XT before buying another Nvidia product. I'll know for next time.
 
To be honest, I'm not really a Harry Potter fan, but the game looked cool, so I purchased it on my Xbox Series S. Even though the Series S is the lower-tier system, it really doesn't perform badly if you leave it in performance mode. I wanted to be comfortable playing from a recliner on the 65-inch.

The game does have a graphical hitch here and there on the Series S, but it's really not a bad experience. There are some games I prefer on PC, but I like playing a game like that on the larger screen. As a non-Harry Potter fan, I'm enjoying the game and probably have 8-10 hours in it so far.

On the PC side, though, you can see this situation will likely play out more and more. I just paid $600 for the 3080 a couple of months ago, around Christmas time, so you can see why I'm slightly annoyed with the VRAM usage. I can just imagine this situation will repeat itself as new games arrive. They'll probably issue patches, but as the 3000 series cards age, how many of those fixes will there be, say, a year from now?

Edit: it occurs to me that AMD will soon be releasing their new version of FSR. Since Nvidia has been restricting DLSS 3 to 4000 series cards, if the new FSR is anywhere near comparable and still works on Nvidia hardware as in the past, you could see folks like me with a higher-end Nvidia 3000 series card using FSR. Not that AMD is saintly compared with Nvidia… but you see the stuff Nvidia pulls and you feel like buying AMD cards. If Intel keeps up development, who knows, my next GPU could be an Intel Arc in a couple of generations.
 

YouFilthyHippo

Prominent
Someone needs to figure out a hack to get DLSS 3.0 on RTX 3000 series cards, as this would completely turn Nvidia's pricing scheme on its head. As far as Hogwarts goes, the PS5 has 16GB unified memory. 2GB of that is likely for the OS (which is quite a bit smaller than windoze 10). The minimum system requirements for Hogwarts say 16GB RAM. Suppose developers can make it work with only 12 on the PS5; that leaves 2GB left over. 1080p Hogwarts uses 9GB VRAM. Let's all hold a candlelight vigil and a moment of silence for consoles only 3 years after release.
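Spelling out that back-of-the-envelope budget (all inputs are the post's assumptions, not official PS5 figures):

```
# PS5 memory budget per the assumptions above -- not official figures
unified_gb = 16   # PS5 unified GDDR6 pool
os_gb = 2         # assumed OS reservation
game_gb = 12      # assumed footprint if devs trim the 16GB min spec to 12

print(f"{unified_gb - os_gb - game_gb} GB left over")  # -> 2 GB
```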

As far as GPUs today go, 8GB is still enough for most games, but for the most demanding games it's no longer enough. The RTX 5070 better have 12GB, and the 5060 and 5060 Ti better have 10GB. Actually, wait: Nvidia will launch DLSS 4.0 ONLY for those cards and charge 6 kidneys each. Something needs to be done about this DLSS pricing crap. Such a scummy company Nvidia is.
 

razor512

Distinguished
Easy fix, just don't play Hogwarts, lol.
The issue is not just that specific game; it is the trend going forward. With consoles having more VRAM, and the vast majority of games being built for consoles first before a PC version is made, nearly all of the development time is spent trying to take advantage of the hardware the console has to offer. This means that the Harry Potter game results will become more common in the years to come.

When the RTX 3080 first came out, I and many others called them out on the small amount of VRAM, as this was an expected outcome if you want to use a video card for more than 2 or so years. You will encounter games where the GPU is powerful enough to max them out, but the VRAM will bottleneck performance so much that you will be forced to use lower settings.

One issue with games that have to be scaled back from console-level graphics is that those scaled-back paths are not well optimized. The lower settings will often look really bad (far worse than many older games that ran far better).

The RTX 3000 series ended up being the shortest-lived generation of video cards.

PS: in ARK: Survival Evolved, the RTX 3060 12GB performs better than the RTX 3070 8GB when you have a large base and many tamed dinos. The RTX 3070 starts out much faster, but as soon as the number of visual assets increases significantly, it runs into a VRAM bottleneck.
 
And you can almost see the same thing with the new RTX 4070 Ti. Sure, it has 12GB of VRAM, but how long before games saturate that 12GB? If they'd given it 16 or 20GB, it could potentially be a 4K card. People should have an idea what they are buying, but when you spend $800+ on a card…. It wasn't long ago that $500-600 got you top of the line, or close to it.