News AMD Goads Nvidia Over Stingy VRAM Ahead of RTX 4070 Launch


InvalidError

Titan
Moderator
Look what has happened over the last 15 years. With CPUs, AMD became more competitive and started gaining market share. With GPUs, the more competitive they are, the more market share they lose to Nvidia.
With CPUs, AMD had to heavily discount the first two generations before gaining any real market traction with Zen 2. For GPUs, merely competing on "performance per dollar" without comparable brand and feature recognition, while Nvidia promotes itself so heavily through PC game developers, is an uphill battle that AMD appears incapable of overcoming by playing coy with pricing, and it will be even harder now that there is a third kid on the block. If Intel sorts out its drivers and gets serious with a market-share push on BMG, which appears to be aimed at ~2X the A750's performance for ~$300, AMD could be in serious GPU trouble at the mid-range.
 

DSzymborski

Curmudgeon Pursuivant
Moderator
It pretty much always was if your goal is to beat out the "console peasants."

Yup, this is what just blows my mind in these threads. While arguing over which $800-$1,000 GPU is better, you have people fighting each other for the title of working-class hero.

The high end of this hobby has always been for the well-off and really, in the context of the average person on this planet, the low end has been too. Many would do well to can the performative mock noblesse oblige; we're watching Goliath brawl Goliath and betting on the fight while wearing tuxedos in the skybox. Let's be honest about what this is!
 

blppt

Distinguished
"That's sadly not how it works. "

I know that. I was offering an attempt at equivalency, since consoles have to use their 16GB of RAM for both video and game data. In an ideal situation, the PC can keep most of its 16GB of main memory for the game engine and its 16GB of VRAM for graphics. Of course, it's not THAT cut and dried, but overall, a standard gaming PC with 16GB of main memory and 16GB of VRAM is going to have more available memory than the console, even with its larger OS footprint.

Of course, "ideal" rarely if ever happens IRL, but the fact remains that PCs do not run their game engines in Video RAM. So, automatically there is going to be more resources free on a PC videocard that has as much RAM as the entire unified console setup.

And if you think texture pop-in on PCs is bad with 16GB of main RAM and 16GB of video RAM, check out the draw distance and texture detail of a cross-platform game on the latest consoles.
 

mhmarefat

Distinguished
Which works well if you're the kind of person who just wants to play games and doesn't chase the highest preset at 60 FPS all the time. Heck, I remember when I started playing games on a PC, I was fine with 10 FPS at times.

But I feel like the expectation for a lot of people is "I want my 1080p 120 FPS uber quality for $300." And even if we tempered the FPS expectation, the only time I can recall that ever being delivered was with the 8800 GT.
Thanks to the advancement of technology (itself owed to the masses who supported the industry throughout the years), PC gaming has reached a level where 1080p 120 FPS or 1440p 60 FPS could have been EASILY affordable for the majority of people today, if only greedy corporations had any common sense. But no! What if, God forbid, they lost even 1% of the market? They had to come up with this completely irrelevant, incomplete technology (RT) to justify higher prices for years to come and guarantee market share gains. Years later they are still continuing down this path thanks to companies like CD Projekt, whose disgraceful "Cyberpunk 2077" is now getting the shiny new "Path Tracing" to keep the masses deluded with tech that is generations behind and to justify big corporations' predatory pricing behavior. Today we have ended up with 70-series cards starting at $600, justified by fake needs such as RT. This is a disaster and a failure of the PC gaming industry. The only "people" who are happy are the likes of 4090 owners.
Technology is here to serve humanity, not the other way around.
 

blppt

Distinguished
"m confused - you suggest doing 1:1 comparisons isn't possible and then you suggest a PC with 16+16 GB would be the rough equivalent of a console with 32 GB!??!? "

I was attempting to draw a very loose parallel, hence the disclaimer. Of course it isn't 32GB of unified memory.

The point is, no matter how you look at it, since you cannot run the main game engine in VRAM, that 16GB of VRAM is free to store as much graphics data as the console has available for its entire game. It's not a 32GB framebuffer, but likewise, you cannot dedicate all 16GB in the console to graphics.
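To put that loose parallel into rough numbers, here is a back-of-the-envelope sketch in Python; the OS reserve figures are illustrative assumptions, not published specs.

```python
# Rough, illustrative memory-budget comparison (numbers are assumptions,
# not official specs): a console with one 16 GB unified pool vs. a PC
# with 16 GB system RAM plus 16 GB VRAM.

def console_budget(unified_gb=16.0, os_reserve_gb=2.5):
    """Whatever the game gets must cover BOTH engine data and graphics."""
    return unified_gb - os_reserve_gb  # single pool shared by CPU and GPU work

def pc_budget(system_gb=16.0, vram_gb=16.0, os_and_background_gb=4.0):
    """Engine/game data lives in system RAM; the GPU gets its VRAM to itself."""
    engine_budget = system_gb - os_and_background_gb
    graphics_budget = vram_gb  # not shared with the game engine
    return engine_budget, graphics_budget

if __name__ == "__main__":
    print(f"Console, game total (engine + graphics): {console_budget():.1f} GB")
    engine, gfx = pc_budget()
    print(f"PC, engine budget: {engine:.1f} GB, graphics budget: {gfx:.1f} GB")
    # Even with a heavier OS footprint, the 16 GB card alone roughly matches
    # the console's entire game allocation, before counting system RAM at all.
```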
 

razor512

Distinguished
A PC needs more system RAM than a console because it runs a fully featured OS along with third-party background tasks. Beyond that, resource management for a game running on a PC is different, since developers have to account for a wide range of hardware; they can't decide how much to load at a time on the assumption that everyone has an SSD offering at least 7.5GB/s read speeds. Since having plenty of system RAM is not rare, it is easier to simply make better use of it to avoid potential issues, even if the person decides to use a QLC SATA SSD.

On a console, the hardware is fully characterized, so it is easier to optimize system memory use.
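As a rough illustration of why PC titles tend to keep more of the working set resident in RAM, here's some quick arithmetic; the drive speeds and asset size below are assumptions picked for easy math, not measurements.

```python
# Toy arithmetic: how long does it take to stream a level's assets from
# different storage tiers? All figures are illustrative assumptions.

ASSET_SET_GB = 6.0  # hypothetical working set a level needs to pull in

drives_gbps = {
    "console-class NVMe (~7.5 GB/s assumed)": 7.5,
    "mid-range NVMe (~3.5 GB/s assumed)": 3.5,
    "QLC SATA SSD (~0.5 GB/s assumed)": 0.5,
}

for name, gbps in drives_gbps.items():
    seconds = ASSET_SET_GB / gbps
    print(f"{name}: ~{seconds:.1f} s to load {ASSET_SET_GB:.0f} GB")

# A console developer can assume the fast case and stream aggressively;
# a PC developer has to plan for the slow case, so pre-loading more into
# plentiful system RAM is the safer design.
```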
 

oofdragon

Distinguished
The only reason Nvidia is selling more is the RT gimmick trend, which they started themselves, just like the other times they pulled this kind of marketing nonsense by paying developers to implement "tricks" that only their cards would run well. Today it's Cyberpunk with "RT"; yesterday it was The Witcher with "HairWorks", from the same developer, by the way. AMD is probably having a hard time figuring out how to handle RT code without infringing patents; on the raster side it's superior to Nvidia. Anyone who played Hogwarts Legacy with "RT" can attest it isn't better graphically; it's just a gimmick, to give one example. Duke Nukem 3D had mirror reflections back then, and now it's an "RT" feature; it's sad, really. I'm cheering for AMD to sell better this gen and find a way to beat Ngreedia at their own game.
 

tamalero

Distinguished
I know of no consumers taking issue with GPU price cuts, but I do know the two recently released AMD GPUs aren't selling well atm.
Neither are some Nvidia models.
The difference is that Nvidia has the power to strong-arm, a.k.a. fanaticism.

I still find it egregious that the MSRPs are also BS, with most makers offering only a handful of models at MSRP and everything else priced way above it.
 
Thanks to the advancement of technology (itself owed to the masses who supported the industry throughout the years), PC gaming has reached a level where 1080p 120 FPS or 1440p 60 FPS could have been EASILY affordable for the majority of people today, if only greedy corporations had any common sense. But no! What if, God forbid, they lost even 1% of the market? They had to come up with this completely irrelevant, incomplete technology (RT) to justify higher prices for years to come and guarantee market share gains. Years later they are still continuing down this path thanks to companies like CD Projekt, whose disgraceful "Cyberpunk 2077" is now getting the shiny new "Path Tracing" to keep the masses deluded with tech that is generations behind and to justify big corporations' predatory pricing behavior. Today we have ended up with 70-series cards starting at $600, justified by fake needs such as RT. This is a disaster and a failure of the PC gaming industry. The only "people" who are happy are the likes of 4090 owners.
Technology is here to serve humanity, not the other way around.
While I will say entertainment should be a right (as in, everyone should have access to some form of entertainment), any specific form of it is not. Especially when the performance requirements call for a team of highly intelligent people solving problems that would leave the average person here (myself included, for the most part) with their head spinning.

If Cyberpunk 2077 in all its glory is inaccessible to you, too bad. You don't have a right to it. Or rather, there's no need for you to have it. There are plenty of other games out there that are accessible to you and could keep you entertained as long as you have some electronic device.

EDIT: I kind of glossed over the post, but I saw this bit and mentally facepalmed:

They had to come up with this completely irrelevant incomplete technology (RT)
You do realize that ray tracing was developed in the 1970s and has been considered the holy grail of graphics rendering ever since, because it physically simulates light? NVIDIA did not just "come up" with something; it was there all along. Hell, NVIDIA wasn't even the first to demonstrate real-time ray tracing in hardware; they were at least third (Intel with Larrabee and PowerVR with the GR6500 both demonstrated it before NVIDIA had anything).

So yeah, completely irrelevant.
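For anyone curious what "physically simulates light" means at its most basic, here is a deliberately tiny Python toy, nothing like how RTX hardware or any production renderer actually works: it casts one ray per pixel at a single sphere, finds the intersection analytically, and shades it with a simple diffuse term, printing the result as ASCII art. Scene setup and light direction are made-up values for illustration.

```python
# Minimal toy ray tracer: one sphere, one light, one ray per pixel.
# Purely illustrative; real-time GPU ray tracing is vastly more involved.
import math

WIDTH, HEIGHT = 40, 20                      # tiny ASCII "framebuffer"
SPHERE_C, SPHERE_R = (0.0, 0.0, 3.0), 1.0   # sphere center and radius (assumed scene)
LIGHT_DIR = (-0.5, 0.7, -0.5)               # direction toward the light (assumed)

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def hit_sphere(origin, direction):
    # Solve |origin + t*direction - center|^2 = r^2 for the nearest t > 0
    # (direction is unit length, so the t^2 coefficient is 1).
    oc = tuple(o - c for o, c in zip(origin, SPHERE_C))
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - SPHERE_R * SPHERE_R
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

light = normalize(LIGHT_DIR)
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # Map the pixel to a point on an image plane at z = 1 and cast a ray.
        x = (i + 0.5) / WIDTH * 2.0 - 1.0
        y = 1.0 - (j + 0.5) / HEIGHT * 2.0
        d = normalize((x, y, 1.0))
        t = hit_sphere((0.0, 0.0, 0.0), d)
        if t is None:
            row += " "                                                    # ray missed
        else:
            p = tuple(t * c for c in d)                                   # hit point
            n = normalize(tuple(pc - cc for pc, cc in zip(p, SPHERE_C)))  # surface normal
            shade = max(0.0, sum(nc * lc for nc, lc in zip(n, light)))    # diffuse term
            row += " .:-=+*#%@"[min(9, int(shade * 9.999))]
    print(row)
```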
 

Nicholas Steel

Distinguished
Nvidia and AMD are asking extremely high prices for GPUs; the bare minimum to ask is that the cards can keep up with consoles that have 16GB of memory.

A PC GPU released in 2023 should have at least 16GB VRAM, no ifs or buts.

It's especially the texture pop-in that looks horrendous on 8GB and 12GB cards.

Games are made for current-gen consoles that have 16GB of memory. Developers have no interest in redoing all the textures for PC GPUs that lack VRAM, so you get horrendous texture pop-in and massive frame drops on PCs when the GPU runs short of VRAM.

Current consoles have 16GB of memory that games can then decide how to split between RAM and VRAM. This has been a practice for both Sony and Microsoft since the Xbox 360 (PS3 used a fixed split).

The PS4/Xbox One and newer consoles then added separate RAM for the operating system to exclusively utilize (usually 1 or 2GB).
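A rough sketch of what that flexible split looks like in practice; the game's 13.5GB share and the split ratios below are assumptions for illustration, not Sony's or Microsoft's actual figures.

```python
# Illustrative only: how a game on a unified-memory console might carve its
# share of the pool into "engine" (RAM-like) and "graphics" (VRAM-like)
# budgets. The 13.5 GB game share is an assumed figure, not an official one.

GAME_SHARE_GB = 13.5  # whatever is left for the game after the OS reserve

def split_pool(graphics_fraction):
    """The game decides how much of its share goes to graphics data."""
    gpu_side = GAME_SHARE_GB * graphics_fraction
    cpu_side = GAME_SHARE_GB - gpu_side
    return cpu_side, gpu_side

for frac in (0.5, 0.65, 0.8):
    engine_gb, graphics_gb = split_pool(frac)
    print(f"{int(frac * 100)}% to graphics -> engine: {engine_gb:.1f} GB, "
          f"graphics: {graphics_gb:.1f} GB")
```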
 

DaveLTX

Commendable
Current consoles have 16GB of memory that games can then decide how to split between RAM and VRAM. This has been a practice for both Sony and Microsoft since the Xbox 360 (PS3 used a fixed split).

The PS4/Xbox One and newer consoles then added separate RAM for the operating system to exclusively utilize (usually 1 or 2GB).

I'm confused; current consoles take the OS RAM from the same pool of RAM that the SoC accesses, yet you say the Xbox One and newer consoles added separate RAM?
 
Reading everything here, I'm kind of amused by some of the angles taken; not necessarily copium, but such a hard defense of something we should all* agree on... I remember when the Fury card launched with "only" 4GB of VRAM and everyone lost their minds saying it would not be enough down the line for such a halo product, and now I'm reading plenty of people saying "nah, it's fine" when there's hard evidence newer games just can't offer a quality experience with 10GB or less at the price ranges both AMD and nVidia want us to pay.

Back then my original argument was also "it won't have the power to process much more data than 4GB anyway", and I can confidently say I was wrong.

Much like with system RAM, having more VRAM is never a bad thing for the GPU; it is, in fact, very desirable. That's why the 3090 launched with 24GB using dual-sided modules while the 3080 got less than half of that. Let's not kid ourselves and just agree that nVidia knows exactly what they're doing, and it's not to please the masses. Anyway, on the VRAM topic: whatever the GPU needs to use, it should be able to allocate close to all of it locally rather than reaching over PCIe into system RAM to fetch it, including cycling textures in from streaming and such.

Oh welp. History and lessons and repeating stuff and all that.

Regards.
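On the "keep it local rather than reach over PCIe" point above, a quick bandwidth comparison; the interface figures are rough theoretical numbers and the spill size is an assumption.

```python
# Ballpark bandwidth comparison: touching data resident in VRAM vs. having
# to pull it across PCIe from system RAM. Numbers are rough interface-level
# figures for illustration; real effective throughput is lower and varies.

BANDWIDTH_GBPS = {
    "GDDR6/GDDR6X local VRAM (rough ballpark)": 700.0,
    "PCIe 4.0 x16 link (theoretical)": 32.0,
}

SPILLED_TEXTURES_GB = 2.0  # assumed amount of texture data that didn't fit in VRAM

for path, gbps in BANDWIDTH_GBPS.items():
    ms = SPILLED_TEXTURES_GB / gbps * 1000.0
    print(f"{path}: ~{ms:.1f} ms to move {SPILLED_TEXTURES_GB:.0f} GB")

# Re-fetching even a couple of GB over PCIe costs tens of milliseconds,
# which is why running short on VRAM tends to show up as stutter and
# texture pop-in rather than a gentle slowdown.
```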
 

hannibal

Distinguished
I think anyone who buys a GPU with 8GB VRAM is making a mistake. 12/16GB is bare minimum IMO.

My 1080 Ti in 2017 had 11GB... not sure why anyone would think 8GB would cut it with today's AAA titles.

For low-res, low-quality gaming, 8GB is more than enough. Even 4GB can be enough. But if you want to play AAA games? Sure!
Then 16GB or more is the way to go. Some experts who use the new engines have said that 16GB is the minimum and 32GB would be nice to have!

But not all games belong to that category, and playing at 1080p or even 720p at low settings is still gaming.
 

edzieba

Distinguished
As always: ignore slap-fights between marketers, and buy based on actual real-world performance rather than Big Specsheet Numbers. If a card with x GB of vRAM performs better than a card with 4+x GB vRAM, then it's the better card in all realities other than my-number-is-bigger forum arguments.
 

InvalidError

Titan
Moderator
As always: ignore slap-fights between marketers, and buy based on actual real-world performance rather than Big Specsheet Numbers. If a card with x GB of vRAM performs better than a card with 4+x GB vRAM, then it's the better card in all realities other than my-number-is-bigger forum arguments.
That depends on the likelihood of the card with x+4GB coming out on top a little while down the line when VRAM usage goes up. The 8GB RTX 3070 was generally faster than the RX 6800 at launch, and much faster with RT, but today, now that VRAM usage has gone up slightly, the RTX 3070 performs like crap at detail levels the RX 6800 can still handle just fine, including RT. In many cases, the RTX 3070's 8GB can barely keep up with the 12GB RTX 3060.

The RTX 3070 had the absolute bare minimum amount of VRAM necessary to make it work at launch, with practically zero future-proofing headroom.

If I were going to spend $400+ on a GPU, which I doubt I ever will, I definitely wouldn't want it to fall flat on its face only two years down the road simply because it is already 2-4GB short on VRAM.
 

Elusive Ruse

Estimable
VRAM is indeed part of the equation when talking about performance; acting like it's just a nominal value is ridiculous, and that attitude is partly enabled by the fact that most reviewers still cling to old releases from years ago when benchmarking new GPUs.
 

Elusive Ruse

Estimable
Reading everything here, I'm kind of amused by some of the angles taken; not necessarily copium, but such a hard defense of something we should all* agree on... I remember when the Fury card launched with "only" 4GB of VRAM and everyone lost their minds saying it would not be enough down the line for such a halo product, and now I'm reading plenty of people saying "nah, it's fine" when there's hard evidence newer games just can't offer a quality experience with 10GB or less at the price ranges both AMD and nVidia want us to pay.

Back then my original argument was also "it won't have the power to process much more data than 4GB anyway", and I can confidently say I was wrong.

Much like with system RAM, having more VRAM is never a bad thing for the GPU; it is, in fact, very desirable. That's why the 3090 launched with 24GB using dual-sided modules while the 3080 got less than half of that. Let's not kid ourselves and just agree that nVidia knows exactly what they're doing, and it's not to please the masses. Anyway, on the VRAM topic: whatever the GPU needs to use, it should be able to allocate close to all of it locally rather than reaching over PCIe into system RAM to fetch it, including cycling textures in from streaming and such.

Oh welp. History and lessons and repeating stuff and all that.

Regards.
You don't need VRAM, you think you do but you don't.
 
You don't need VRAM, you think you do but you don't.
My own experience begs to differ.

I have a few screenshots I've shared on the Discord showing how VRChat will just gobble up all the VRAM it can in complex, populated worlds. People with 3080s, 3080 Tis, and anything else with less than 16GB suffer dearly. Funnily enough, people with the 12GB 3060 have a better overall experience than people with 3080s.

EDIT: https://www.reddit.com/r/VRchat/comments/xjjpup/vrchat_vram_usage/


Regards.
 

FunSurfer

Distinguished
AMD, take recent games like Forspoken, Hogwarts Legacy, Returnal, The Last of Us Part I, and other games that use more than 8GB of VRAM, and run minimum-FPS benchmarks like you did before to strengthen your claims. What are you babbling about with these old games?
 

Elusive Ruse

Estimable
My own experience begs to differ.

I have a few screenshots I've shared on the Discord showing how VRChat will just gobble up all the VRAM it can in complex, populated worlds. People with 3080s, 3080 Tis, and anything else with less than 16GB suffer dearly. Funnily enough, people with the 12GB 3060 have a better overall experience than people with 3080s.

EDIT: https://www.reddit.com/r/VRchat/comments/xjjpup/vrchat_vram_usage/


Regards.
https://www.youtube.com/watch?v=Q3oTv2yUpiw
 

InvalidError

Titan
Moderator
WoW Classic was one of those things I thought I wanted to see, but I quit three days in due to how unbearably slow, tedious, and overcrowded everything was. I'd been through that ~10 times from late BC (when I started playing) through WotLK, and it didn't take me long to realize I didn't have the patience to go through it again in an even more painful raw-vanilla form.
 