News Edward Snowden slams Nvidia's RTX 50-series 'F-tier value,' whistleblows on lackluster VRAM capacity

While I agree with both the Tom's review and Snowden, honestly, the only reason I read the article was out of curiosity. I don't give his opinions more weight on the issue than I do, say, Miley Cyrus.
I do idly wonder just why he should care. He shouldn't be able to legally get his hands on any modern RTX card.
So, I guess what I'm saying is your clickbait actually worked this time. But don't keep doing that. I'll stop reading this site if it becomes "random celebrity not known for hardware expertise slams hardware manufacturer."
 
The performance will always be regarded as lackluster since the 40- and 50-series are using the same node. Nvidia can fix the VRAM for various cards by using 3 GB modules. A 24 GB 5080 can happen on the same 256-bit bus.

Maybe the worst thing coming is an 8 GB 5060 Ti.

While I agree with both the Tom's review and Snowden, honestly, the only reason I read the article was out of curiosity. I don't give his opinions more weight on the issue than I do, say, Miley Cyrus.
I do idly wonder just why he should care. He shouldn't be able to legally get his hands on any modern RTX card.
So, I guess what I'm saying is your clickbait actually worked this time. But don't keep doing that. I'll stop reading this site if it becomes "random celebrity not known for hardware expertise slams hardware manufacturer."
Here's your Cool Story award 🏆
 
The real reason that Nvidia constrains VRAM capacity, despite it being such an unpopular move and despite more VRAM not increasing prices that much (relatively speaking), is that they want to segment the consumer and professional markets.

The two main things differentiating Quadro and GeForce in most applications are VRAM capacity and power/thermal design. And they of course like to sell Quadro cards at an even more ridiculous price than their top-end GeForce cards.
 
He ain't wrong.

Games want more VRAM as time goes on, especially titles that require ray tracing (the Indiana Jones game wants 12 GB for 1440p).

And if you use frame gen and the like, it uses even more, plus whatever background usage... it adds up.

The cost of an extra 8 GB of VRAM is chump change compared to the profit they make off a single GPU.
Yeah, I've been debugging the new Spider-Man 2 on PC, which loves to crash more than run, and found that the game is constantly looking for about 31 GB+ of VRAM in the background at only 1080p high settings, lol.
At least I found that every single crash .dmp I have, and the ones people sent me, all give the same reason for crashing back to the desktop: the game's .exe is constantly trying to release a resource that didn't belong to it. I'm assuming it actually does belong to it, but they didn't code it to grab the proper permissions from the OS, and that's 95% of the crash issues. I'm shocked they haven't been able to figure out which resource isn't being given the proper permissions and fix it fairly quickly, since all 34 crash dumps I've gone through show the same crash 😂
But yes, as you were saying, games are constantly wanting as much VRAM as they can get nowadays, and the 5080 is a slap in the face to the consumer.
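If anyone wants to batch-triage their own dumps the same way, here's a minimal sketch. It assumes cdb.exe (from the Windows SDK's Debugging Tools) is on the PATH and the minidumps sit in a local "dumps" folder; the folder name and the FAILURE_BUCKET_ID heuristic are just illustrative, not anything from the game's own tooling:

```python
import subprocess
from collections import Counter
from pathlib import Path

CDB = "cdb.exe"           # assumes the Windows SDK debugger is on PATH
DUMP_DIR = Path("dumps")  # illustrative folder full of .dmp files

def failure_bucket(dump: Path) -> str:
    """Run '!analyze -v' on one minidump and pull out its failure bucket."""
    out = subprocess.run(
        [CDB, "-z", str(dump), "-c", "!analyze -v; q"],
        capture_output=True, text=True, errors="replace",
    ).stdout
    for line in out.splitlines():
        if line.startswith("FAILURE_BUCKET_ID:"):
            return line.split(":", 1)[1].strip()
    return "UNKNOWN"

# If nearly every dump lands in the same bucket, it's one recurring bug.
counts = Counter(failure_bucket(d) for d in DUMP_DIR.glob("*.dmp"))
for bucket, n in counts.most_common():
    print(f"{n:3d}  {bucket}")
```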
 
The performance will always be regarded as lackluster since the 40- and 50-series are using the same node. Nvidia can fix the VRAM for various cards by using 3 GB modules. A 24 GB 5080 can happen on the same 256-bit bus.

Maybe the worst thing coming is an 8 GB 5060 Ti.

Here's your Cool Story award 🏆
Exactly.

And using 24 Gbit GDDR7 chips on a 192-bit bus gives us 18 GB. So, an informed critique would've actually suggested that the regular RTX 5070 should have 18 GB. I mean, don't we all wish it had more? ...but assuming Nvidia was going to stick with the 192-bit interface at that tier.
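That arithmetic is easy to sanity-check: each GDDR7 chip hangs off a 32-bit channel, so bus width divided by 32 gives the chip count, and chip count times per-chip capacity gives total VRAM. A quick sketch (the function name and the configurations listed are just illustrative):

```python
def vram_gb(bus_width_bits: int, gb_per_chip: int) -> int:
    """Total VRAM with one chip per 32-bit channel (the normal, single-sided layout)."""
    channels = bus_width_bits // 32   # each GDDR7 chip occupies a 32-bit channel
    return channels * gb_per_chip

print(vram_gb(192, 2))  # 12 -> RTX 5070 as shipped (2 GB / 16 Gbit chips)
print(vram_gb(192, 3))  # 18 -> same 192-bit bus with 3 GB / 24 Gbit chips
print(vram_gb(256, 2))  # 16 -> RTX 5080 as shipped
print(vram_gb(256, 3))  # 24 -> the hypothetical 24 GB 5080
```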
 
The real reason that Nvidia constrains VRAM capacity, despite it being such an unpopular move and despite more VRAM not increasing prices that much (relatively speaking), is that they want to segment the consumer and professional markets.

The two main things differentiating Quadro and GeForce in most applications are VRAM capacity and power/thermal design. And they of course like to sell Quadro cards at an even more ridiculous price than their top-end GeForce cards.
Well, yes. But the workstation cards usually have double the number of DRAM chips (i.e. memory on both sides of the PCB), for double the capacity. So, if they put 24 GB on the RTX 5080, the workstation version could still provide value in having 48 GB.

I'm sure the RTX 5080 Super will have 24 GB.
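The clamshell point is the same arithmetic with two chips per 32-bit channel (one on each side of the PCB), doubling capacity without widening the bus. A sketch of that, again with illustrative numbers:

```python
def vram_gb(bus_width_bits: int, gb_per_chip: int, chips_per_channel: int = 1) -> int:
    """Total VRAM; chips_per_channel=2 models a clamshell (double-sided) board."""
    return (bus_width_bits // 32) * chips_per_channel * gb_per_chip

print(vram_gb(256, 3))     # 24 -> a 24 GB GeForce-class 5080 with 3 GB chips
print(vram_gb(256, 3, 2))  # 48 -> the clamshell workstation variant
```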
 
While I agree with both the Tom's review and Snowden, honestly, the only reason I read the article was out of curiosity. I don't give his opinions more weight on the issue than I do, say, Miley Cyrus.
I do idly wonder just why he should care. He shouldn't be able to legally get his hands on any modern RTX card.
So, I guess what I'm saying is your clickbait actually worked this time. But don't keep doing that. I'll stop reading this site if it becomes "random celebrity not known for hardware expertise slams hardware manufacturer."
I will admit when I saw the headline with Snowden's name I thought it must be someone else who happens to have the same name.
Nope.
Clickbait.
I am surprised that a site that I thought was a dependable source of useful information would stoop to that. I hope it does not continue.
 
I'm sorry, why does Edward Snowden's opinion matter when it comes to hardware any more than, say, my dentist's opinion?
Well, he is a tech guy and a pretty smart one. So, I think he's probably at least as informed as most of us. Presumably, that's not true of your dentist.

Other than that, no reason that I'm aware of why his opinion would count for any more than other tech celebs (that don't specialize in AI or related hardware).

What I'm more curious about is whether this was an attempt by Snowden to pander, or if he regularly posts opinions on all kinds of stuff and this one just got plucked from his feed, as a way to try and highlight what most other people are saying about the 5080.
 
Limiting $400+ cards to 8 GB is incredibly stingy when AMD's RX 480 offered that much VRAM on a competitive midrange card that launched for just $240, more than 8 years ago. Even on Nvidia's side, the 1060 was offering 6 GB for about the same price at that time, and the 1070 offered 8 GB for less than $400. For recent cards, they likely realized that many compute and AI workloads require lots of VRAM, and are limiting it to upsell customers to higher-end hardware than they might otherwise need.
 
I already assume that... the base 5070 is coming with 12 GB.

So the 5060 Ti will likely have 12 GB and the base 5060 8 GB.

Nvidia straight up stopped making the 60-tier GPU a good buy way back with the 30 series.
16 GB and 8 GB for the 5060 Ti.

https://www.tomshardware.com/pc-com...16gb-models-2016-wants-its-vram-capacity-back

It's also possible that we see 16 GB, 8 GB, and 12 GB 5060 Tis in the card's lifespan. Nothing stopping that from happening. 24 GB is possible but highly improbable.
 
Limiting $400+ cards to 8 GB is incredibly stingy when AMD's RX 480 offered that much VRAM on a competitive midrange card that launched for just $240, more than 8 years ago. Even on Nvidia's side, the 1060 was offering 6 GB for about the same price at that time, and the 1070 offered 8 GB for less than $400.
I'd suggest that's partly the fault of the DRAM industry for not making higher-density chips. I'd be curious to know whether the gains we've seen in GDDR performance have at all come at the expense of density or cost-efficiency.

BTW, the Ellesmere cards came with either 4 GB or 8 GB. Do you happen to know whether that was by using low vs. high-density chips, or did they reach 8 GB by doubling up the number of chips per channel (i.e. "clamshell" configuration)?
 
He's not wrong. There's no reason why Nvidia isn't putting on more RAM other than "planned obsolescence", as well as purposely gimping the cards so they aren't as attractive for AI and such.
From what I've seen, the jump from the 40 series to the 50 series is much like the jump from the 10 series to the 20 series. Rasterization wasn't much improved, but it came with some new 'shiny things'.
 
He's right. And it's why we should all be rooting for Intel and AMD. This is the Intel Pentium days all over again. Eventually, AMD was able to compete … but it seems like AMD, since purchasing ATI, has always been playing catch-up.
 
it seems like AMD, since purchasing ATI, has always been playing catch-up.
AMD has been able to catch Nvidia on a few occasions. For instance, their Fury X was neck-and-neck with the GTX 980 Ti.

More recently, their RX 6950 XT actually beat the RTX 3090 Ti on rasterization performance, which I'm surprised so many people don't seem to know. I guess because it's from when GPUs were super-expensive (if you could even find one in stock), due to the pandemic X crypto craze X scalpers. So, maybe some weren't paying much attention to the reviews & benchmarks.
 
There's no reason why Nvidia isn't putting on more RAM other than "planned obsolescence"
I'm getting tired of people blaming GPU manufacturers for not having "enough" VRAM when the problem is that game devs are shoving bloatware RT features into games. ALSO, VRAM isn't the only factor that makes a GPU "good".
... much like the jump from the 10 series to the 20 series.
That isn't exactly true...
https://www.youtube.com/watch?v=iGUejCAupSg

The 20 series was still a massive leap in game performance, plus they upped the VRAM on their budget cards that generation.
 
AMD has been able to catch Nvidia on a few occasions. For instance, their Fury X was neck-and-neck with the GTX 980 Ti.

More recently, their RX 6950 XT actually beat the RTX 3090 Ti on rasterization performance, which I'm surprised so many people don't seem to know. I guess because it's from when GPUs were super-expensive (if you could even find one in stock), due to the pandemic X crypto craze X scalpers. So, maybe some weren't paying much attention to the reviews & benchmarks.
Yes, however a problem I've been seeing with AMD is that they don't really try to make a GPU on their own; they try to base it off Nvidia. Oftentimes it seems like AMD isn't even trying to create GPUs better than Nvidia's, just GPUs that perform similarly (in gaming, at least) for a fair bit less.
 
He's not wrong. There's no reason why Nvidia isn't putting on more RAM other than "planned obsolescence", as well as purposely gimping the cards so they aren't as attractive for AI and such.
From what I've seen, the jump from the 40 series to the 50 series is much like the jump from the 10 series to the 20 series. Rasterization wasn't much improved, but it came with some new 'shiny things'.
Now, I don't want to be simping for Nvidia on this one, but I think what happened has more to do with Samsung fudging it up with GDDR7.

There were supposed to be 3 GB modules around already, but they ran behind schedule. My headcanon tells me the 5080 should have been a 24 GB card; we had all these traces popping up, like 24 GB packaging and such.

But things happened and we got what we got.

This is literally out of an MSI video, which they've since fixed:

[Image: screenshot from the MSI video]