News Gigabyte App Allegedly Confirms Nvidia RTX 4070 and RTX 4060 Memory Configs

InvalidError

Titan
Moderator
So, how many more GPU generations until Nvidia execs get off their yachts and private islands long enough to finally figure out that the crypto fad is over, and GPU mining is dead?
Which generation are we counting from, and which are we counting to?
The entire industry got borked by crypto and covid during the 3k series.
The 4k series got borked by being designed and priced on the expectation that crypto would go on.
The 5k series will likely get borked by goals set on many of the same assumptions that borked the 4k series, and it may be too far along in development to correct.
The 6k series should be the one where Nvidia will be forced to admit its unrealistic expectations have crashed and burned, though it may still try its hand at inflated prices for the performance it delivers one more time, albeit not as brazenly.
So the 7k series is where I would expect Nvidia to have mostly cleansed itself of covid-crypto greedflation and returned to more normal greed.

If we count inclusively from the moment greedflation set in to the moment it is mostly gone, that would be five generations (3k through 7k) for me.
 

doughillman

Distinguished
MLID dropped the $749.99 rumor, so I'd take that with a truckload of salt. He throws everything he can at the wall hoping something sticks, though I do enjoy his vids on occasion...

I quit watching his junk a long time ago. He spends way too much time crowing about the very few things he gets correct and never again mentioning the preponderance of stuff that he's whiffed on. As you say, he just throws everything he can out there. Even a blind squirrel finds a nut now and then.

Just make up your own rumors in your head. You'll likely be right about as often as he is.
 

atomicWAR

Glorious
Ambassador
I quit watching his junk a long time ago. He spends way too much time crowing about the very few things he gets correct and never again mentioning the preponderance of stuff that he's whiffed on. As you say, he just throws everything he can out there. Even a blind squirrel finds a nut now and then.

Just make up your own rumors in your head. You'll likely be right about as often as he is.
I only watch now to see how laughably wrong he is and so I know when an outlet picks up his junk....
 

razor512

Distinguished
8GB of VRAM is simply not enough; the 4060 would be a horrible purchase, especially as more games come out that are developed first for consoles such as the PS5. 8GB of VRAM means games will have to be run at lower settings than the GPU could otherwise handle, just to avoid having system memory used as VRAM.
 

Giroro

Splendid
Which generation are we counting from, and which are we counting to?
The entire industry got borked by crypto and covid during the 3k series.
The 4k series got borked by being designed and priced on the expectation that crypto would go on.
The 5k series will likely get borked by goals set on many of the same assumptions that borked the 4k series, and it may be too far along in development to correct.
The 6k series should be the one where Nvidia will be forced to admit its unrealistic expectations have crashed and burned, though it may still try its hand at inflated prices for the performance it delivers one more time, albeit not as brazenly.
So the 7k series is where I would expect Nvidia to have mostly cleansed itself of covid-crypto greedflation and returned to more normal greed.

If we count inclusively from the moment greedflation set in to the moment it is mostly gone, that would be five generations (3k through 7k) for me.
I would start at 4k and not include the off-year "Super" refreshes. So: 4k is borked; 4k Super is super borked. With 5k, Nvidia doubles down on the 4k strategy, which is probably where things fail particularly noticeably (they'll do something outrageous like pushing $800+ for an RTX 5060, or preventing third parties from making high-end cards). With 5k Super, Nvidia sticks to 5k pricing, but the company, or at least its investors, react to multiple quarters of missed revenue targets. Intel will finally get its act together, or maybe the PlayStation 6 / Xbox Series S IV will be making waves. By RTX 6k, they might finally at least test the waters by offering some kind of mainstream-priced card. Maybe this is when they get around to making some kind of GTX 1750 or equivalent. More likely, they'll crap out a highly nerfed, feature-stripped RTX 6060 for ~$400 (their lowest-priced card in 3+ years) and call it a day. By RTX 6k Super, we may see some real effort to win back mainstream and midrange gamers.

So I'm calling it: 5-6 years, and 3 generations, until they take any action to acknowledge that the overwhelming majority of gamers are not tech-influencer millionaires.

Or, Nvidia will just continue to not care about gamers and go all-in on big data, workstations, "content creation", and prosumer AI.
 

Ogotai

Reputable
If you're buying the xx60 tier, you're generally not maxing settings anyway.
Heh, that's funny, as that's what I do: max out the settings except AF, which I set to 8x instead of 16x. I was using a 1060, and now a 3060, and I still get max eye candy in every game. But to be fair, the games I currently play aren't exactly GPU intensive either :)
 

TheAlmightyProo

Distinguished
8GB of VRAM is simply not enough; the 4060 would be a horrible purchase, especially as more games come out that are developed first for consoles such as the PS5. 8GB of VRAM means games will have to be run at lower settings than the GPU could otherwise handle, just to avoid having system memory used as VRAM.

I've used as much as 13-14GB with a 6800XT at 3440x1440 over the last 22 months (that's across current and recent AAA games for that period). Sure, that's a minority of games played overall, and average usage afaik is around 10-12GB, but it won't stay there for much longer. Newer cards with 8GB might just suffice for 1080p now and going into next gen... But then again, my old 1070 had been seeing its 8GB maxed out in contemporary AAAs at 1080p some time before 2021 (GDDR5, sure, but 6/6X aren't multiple times faster either), and that was when knowledgeable people said 4GB was fine for 1080p...

On the other hand, my gf's PC (5600X/3070 @ 3440x1440) runs many of the same games OK, 20-25 fps lower on average vs the 90-100 I get with my 6800XT, but it's a work-first, gaming-second rig. My Legion 5 (6800H/3070Ti @ 1440p) gets tweaked settings, and likely capped as well, with AAA games to keep thermals and fan noise balanced; for that, anything over 60 fps at ultra and under 75-80°C is a bonus. In those two cases I'm OK with lower VRAM caps due to those caveats; it's what you make of it.

For a 6800XT upgrade with some longevity, even at the same res, I'd be looking at no less than 20GB right now and for the next couple of years at least, if only to cover as many games and optimisation outliers as possible (and the number of the latter seems to be increasing). So only a 7900XT/XTX will do for now, with the latter having a big enough bump to be worth the price. The 4090, currently starting at around £500 more than the most expensive 7900XTX (which in turn matches the cheaper 4080s in price), is well out of reach. However, this 6800XT is still killing it (ditto my 5800X), so I might wait for RDNA4 and spend my pennies on useful, necessary irl stuff until then.
 

KyaraM

Admirable
8GB 4060 or 12GB 4070? Yawn. Been thinking more about a 6800XT, which can be found under 600 bucks now. Benchmarks show it close to a 4070Ti. Think I'm going to save a little more cash for one of those.
In which parallel universe is a 20-40% (rasterizing/RT) difference "close"?
https://www.techpowerup.com/review/palit-geforce-rtx-4070-ti-gamingpro-oc/31.html

And this doesn't even count power efficiency yet, or the other stuff Nvidia has over AMD (I know, I know, those things don't count, right...), at which point the 4070Ti outright murders the 6800XT. You simply get more frames for lower power consumption on the 4070Ti, which quickly earns back the extra you spend on the card. But hey, nothing new that this is only a valid argument when it's AMD that is more efficient...
 
For me at least, if I can have the other card under $600 and the 4070Ti is going to cost $800, it's going to take a while to make up $200+ in energy use (rough payback math sketched after the videos below). My average bill for the entire house is usually under $200. Your point is well taken: the 4070Ti is in many cases faster, and it's certainly faster in ray tracing if you care about that. Take a look at the videos below, though; I think one of them may have been what I was looking at. As far as DLSS goes, AMD hasn't brought out FSR 3 yet, so that could rebalance things a little. Me personally, the 6800XT is close enough for far less. If I'm spending $800, I'm probably getting the 7900XT with more VRAM. But for a value pick, I like the 6800XT.


https://youtu.be/wRCVRsjs0aE

https://youtu.be/lSy9Qy7sw0U
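
The payback question is easy to sanity-check. A rough sketch, using the $200 price gap from above plus an assumed ~100W gaming-power delta and $0.30/kWh electricity rate (both placeholder assumptions, swap in your own numbers):

```python
# Hours of gaming before the pricier-but-more-efficient card's power
# savings cover its extra purchase price. All inputs are assumptions.
def payback_hours(price_delta_usd: float, power_delta_w: float,
                  usd_per_kwh: float) -> float:
    savings_per_hour = power_delta_w / 1000 * usd_per_kwh  # $ saved per hour
    return price_delta_usd / savings_per_hour

h = payback_hours(price_delta_usd=200, power_delta_w=100, usd_per_kwh=0.30)
print(f"~{h:.0f} hours, i.e. ~{h / (20 * 52):.1f} years at 20 h/week")
# -> ~6667 hours, ~6.4 years: efficiency alone takes a long time to
#    close a $200 price gap.
```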
 

Joseph_138

Distinguished
We have 3GB DRAM chips; 3GB GDDR6/6X/7 cannot be very far behind. We may very well start seeing 12GB on 128 bits, or 9GB on 96 bits, next year or possibly sooner.

The bandwidth on those memory controllers would render all that memory useless, like it does on the 8GB 3060 and the 3050. Dropping from the 192-bit memory bus on the 12GB 3060 to 128-bit on the 8GB 3060 is crippling. Merely losing 4GB of memory can't account for the performance drop alone. My 2060 Super is faster, and that has the same amount of memory, but on a 256-bit bus.
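
Quick back-of-envelope for anyone who wants the numbers (the per-pin data rates are each card's published GDDR6 speed; the little helper function is just illustrative):

```python
# Peak theoretical GDDR6 bandwidth: bus width (bits) / 8 * per-pin rate (Gbps).
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

cards = [
    ("RTX 3060 12GB, 192-bit @ 15 Gbps", 192, 15.0),
    ("RTX 3060 8GB,  128-bit @ 15 Gbps", 128, 15.0),
    ("RTX 3050,      128-bit @ 14 Gbps", 128, 14.0),
    ("RTX 2060 Super, 256-bit @ 14 Gbps", 256, 14.0),
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# -> 360, 240, 224, and 448 GB/s respectively: the 8GB 3060 sits right
#    next to the 3050 and far below the 2060 Super.

# Capacity side of the quoted post: GDDR chips are 32 bits wide, so a
# 128-bit bus takes 4 chips and a 96-bit bus takes 3. With 3GB chips:
for bus in (128, 96):
    chips = bus // 32
    print(f"{bus}-bit bus: {chips} chips x 3GB = {chips * 3}GB")
# -> 12GB on 128 bits, 9GB on 96 bits.
```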
 

InvalidError

Titan
Moderator
The bandwidth on those memory controllers would render all that memory useless, like it does on the 8GB 3060 and the 3050.
It is far from useless: local VRAM is still much faster and lower latency than having to fetch assets from system memory. So instead of seeing performance dive straight off a cliff once the active data set passes 8GB and system memory has to cover the difference, you only get a steady decline as memory controller load rises.
 

Joseph_138

Distinguished
It is far from useless: local VRAM is still much faster and lower latency than having to fetch assets from system memory. So instead of seeing performance dive straight off a cliff once the active data set passes 8GB and system memory has to cover the difference, you only get a steady decline as memory controller load rises.
If you turn up the settings on your games to where memory usage is pushing up near 8GB, then you are putting a load on the memory controller. It can't pass through all that information like a 192-bit or 256-bit memory controller can. That's why the 8GB 3060 is barely faster than the 3050, and far slower than the 12GB card. I reiterate my point: if the 8GB 3060 is barely faster than the 3050, then it's pointless to have 8GB on a 128-bit memory controller, because the controller is bottlenecking the GPU to the point where you aren't getting all the performance you could be getting from it. They would have been better off releasing a 6GB 3060 using the same 192-bit controller than releasing the steaming pile of poop that the 8GB version is.
 

InvalidError

Titan
Moderator
If you turn up the settings on your games to where memory usage is pushing up near 8GB, then you are putting a load on the memory controller.
If you push an RTX 3050 beyond 8GB, your FPS will go from 80 to 20 once it starts going to system memory. If you give it 3GB chips for 12GB total, still on 128 bits, it might dip from 80 to 60 instead, because everything would still be in local VRAM at ~50ns access time and 200+GB/s of bandwidth, instead of 150+ns of latency at only 32GB/s from system memory over PCIe 4.0 x8.
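
To put the same point in toy-model form (all numbers below are illustrative assumptions matching the ones above; real games are not purely bandwidth-bound):

```python
# Toy model: a frame's asset traffic is split between local VRAM and
# system RAM over PCIe 4.0 x8. Assumed peak bandwidths, not measurements.
VRAM_BW = 224.0  # GB/s, local GDDR6 on a 128-bit bus
SYS_BW = 32.0    # GB/s, system memory over PCIe 4.0 x8

def fps(gb_per_frame: float, spill: float) -> float:
    """FPS if memory traffic were the only limit; 'spill' is the
    fraction of traffic served from system memory."""
    frame_time = (gb_per_frame * (1 - spill) / VRAM_BW
                  + gb_per_frame * spill / SYS_BW)
    return 1.0 / frame_time

for spill in (0.0, 0.1, 0.3, 0.5):
    print(f"{spill:.0%} spilled: ~{fps(2.8, spill):.0f} fps")
# -> 80, 50, 29, 20 fps: even a modest spill to system memory craters
#    the frame rate, while keeping everything in local VRAM does not.
```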
 
So, how many more GPU generations until Nvidia execs get off their yachts and private islands long enough to finally figure out that the crypto fad is over, and GPU mining is dead?

Is it 3 generations? I bet it's 3 generations.

If there is no crypto, then gamers just have to buy GPUs at crypto-season prices. They will do this until gamers accept it as the norm :p
 
Which generation are we counting from, and which are we counting to?
The entire industry got borked by crypto and covid during the 3k series.
The 4k series got borked by being designed and priced on the expectation that crypto would go on.
The 5k series will likely get borked by goals set on many of the same assumptions that borked the 4k series, and it may be too far along in development to correct.
The 6k series should be the one where Nvidia will be forced to admit its unrealistic expectations have crashed and burned, though it may still try its hand at inflated prices for the performance it delivers one more time, albeit not as brazenly.
So the 7k series is where I would expect Nvidia to have mostly cleansed itself of covid-crypto greedflation and returned to more normal greed.

If we count inclusively from the moment greedflation set in to the moment it is mostly gone, that would be five generations (3k through 7k) for me.

Another mining craze will probably happen somewhere within those five generations. At least, some miners are counting on another craze happening.
 
8GB of VRAM is simply not enough; the 4060 would be a horrible purchase, especially as more games come out that are developed first for consoles such as the PS5. 8GB of VRAM means games will have to be run at lower settings than the GPU could otherwise handle, just to avoid having system memory used as VRAM.

The 4060 will be marketed as a 1080p GPU for medium settings.