News AMD Goads Nvidia Over Stingy VRAM Ahead of RTX 4070 Launch


DaveLTX

I would love to buy a high-VRAM AMD GPU to go with my R9, but I have doubts about driver stability in productivity apps (Adobe CC), which is awful as it is with Nvidia. Anyone have any experience to share?

Actually, it's an Adobe CC problem; the Adobe CC suite has been growing increasingly unstable.
 
If anything, NVIDIA's strategy of continuing with monolithic dies, or of tying L2 cache to the memory controller, is biting them in the ass. GDDR6 has been around for five years, and for nearly that whole time the highest-capacity chip has been 2GB. Samsung finally came out with a 4GB GDDR6W variant, but that was only announced last November.

So that makes me wonder if the mid-cycle refreshes for Ada will have GDDR6W support, or if that'll get shoved onto the next thing.
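
To put rough numbers on that 2GB-per-chip limit: GDDR6 chips hang off 32-bit channels, so the bus width basically fixes how many chips (and how much VRAM) a card can carry. A quick back-of-the-envelope sketch, with the densities and the clamshell doubling treated as assumptions rather than quoted specs:

```python
# Rough sketch of how bus width and chip density cap VRAM.
# GDDR6 chips sit on 32-bit channels, so chip count = bus width / 32.
# Densities and clamshell doubling here are illustrative, not official specs.

def max_vram_gb(bus_width_bits: int, gb_per_chip: int = 2, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32
    return chips * gb_per_chip * (2 if clamshell else 1)

for bus in (128, 192, 256, 384):
    normal = max_vram_gb(bus)
    doubled = max_vram_gb(bus, clamshell=True)
    print(f"{bus}-bit bus: {bus // 32} chips -> {normal} GB, or {doubled} GB clamshell")
```

That's why a 192-bit card tops out at 12GB with 2GB chips unless the vendor doubles up chips in clamshell mode, or higher-density parts like GDDR6W actually show up.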
 

Elusive Ruse

Yup. And with all those fancy mental gymnastics AMD is getting outsold 6:1. If they had the better product, it would be the other way around.
Most of it is outdated mentality of consumers; all AMD has to do is consistently produce competitive GPUs and the market share will come. Just as they managed to do so with their CPUs.
 
Most of it is outdated mentality of consumers; all AMD has to do is consistently produce competitive GPUs and the market share will come. Just as they managed to do so with their CPUs.

Agree. Can't overstate market momentum enough. The average consumer doesn't like to step out of their comfort zone; corporations know this and capitalize on it. NVIDIA drivers have been utter <Mod Edit> for years (compared to the good old stuff) but hey, only AMD has driver issues! People don't want to change and they think of all kinds of reasons not to. I'm doing it myself. I want to try AMD this go 'round, but I keep seeing some weird issues in game forums I frequent. I also see lots of NVIDIA issues, and I'VE HAD NVIDIA ISSUES. But I'm still a little tepid about going AMD. It's stupid and I know it.
 
The claim:

[image]

The reality (TH 9-game, TPU 25-game averages):

[images: average FPS charts at 3840x2160]

Yeah, all that extra RAM on the 7900XTX and XT really makes it perform so much better than the paltry 16GB on the 4080 and 12GB on the 4070 Ti...

And on Newegg the 7900 XT sells for $800-$850 while the 4070 Ti is $820-$900. That's not exactly a large cost difference, so there's no compelling reason to switch to AMD. Now if AMD actually competed on price instead of matching Nvidia's price gouging, and put it at $699, there'd be a compelling reason.
 

InvalidError

Nvidia and AMD are asking extremely high prices for GPUs; the bare minimum to ask is that they can keep up with consoles that have 16GB of VRAM.

A PC GPU released in 2023 should have at least 16GB VRAM, no ifs or buts.
For PC gaming to survive beyond trivial games, there need to be decent $200-300 GPUs to get people who already own PCs for other reasons on board without busting the bank. Putting more than 8GB on those is going to make a relatively large dent in the parts budget for a GPU that likely won't have the processing power to push the sort of detail that would require more than 8GB in the first place.
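
Rough math on that dent, purely as a sketch; the $/GB figure is an assumed ballpark for GDDR6, not a quoted price:

```python
# Back-of-the-envelope: share of a budget card's price eaten by extra VRAM.
# GDDR6_USD_PER_GB is an assumed ballpark, not a quoted spot price.
GDDR6_USD_PER_GB = 3.5

for card_price in (200, 300):
    for extra_gb in (4, 8):
        cost = extra_gb * GDDR6_USD_PER_GB
        share = cost / card_price
        print(f"${card_price} card, +{extra_gb} GB: ~${cost:.0f} "
              f"({share:.0%} of retail, before the wider bus/PCB it may also need)")
```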

Not everyone is obsessed with RT-Ultra-Psycho-Nightmare-4k300. Plenty of people want something just good enough to be comfortably playable at medium-ish settings for the least amount of money possible.
 

RedBear87

It's interesting you mention Nvidia's performance in Stable Diffusion near the end of the article, because I'm half suspecting that Nvidia being stingy with VRAM is directly tied to Stable Diffusion and AI in general; they might want to boost sales of their high-end and professional GPUs for those applications...
NVIDIA drivers have been utter <Mod Edit> for years (compared to the good old stuff) but hey, only AMD has driver issues!
Except for the recent high CPU usage after closing a game, caused by new drivers and quickly fixed, I can't really see how Nvidia drivers have been <Mod Edit> in recent years. Nvidia is generally quick with their certified drivers and hotfixes. AMD, on the other hand, spent a couple of months without updating the drivers for their older GPUs because the few folks who work on the drivers needed to focus exclusively on fixing the recently released RDNA3. Honestly, it's embarrassing for a company that wants to compete with the market leader, and it might turn off customers who expect decent support for their GPUs even when they're no longer the latest and greatest.
 

OneMoreUser

Two things: one, the consoles do not have access to all 16GB of RAM for games, and two, that is their entire RAM pool, game and video RAM combined.

There isn't a direct 1:1 comparison since the console version will always be more optimized and streamlined, but I'd wager the average gaming PC has 16GB main RAM, if not more, to go with the 16GB VRAM, which would be the rough equivalent of 32GB for a console.
I'm confused - you suggest doing 1:1 comparisons isn't possible and then you suggest a PC with 16+16 GB would be the rough equivalent of a console with 32 GB!??!?

No idea about the Xbox, but the PS5 does some tricks where it pulls data directly from the SSD to the GPU. That lets them do more with less memory, and I bet that's why The Last of Us isn't flying on PC unless you have plenty of VRAM.
Look up the PS5 launch and how a game like Ratchet & Clank was made to take big advantage of the tech.
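
Very rough sketch of how the two setups compare; the console OS reservation is an approximation, not an official figure:

```python
# Rough memory-budget comparison; the OS reservation and split are approximations.
CONSOLE_TOTAL_GB = 16.0
CONSOLE_OS_RESERVE_GB = 2.5          # approximate system reservation
console_game_pool = CONSOLE_TOTAL_GB - CONSOLE_OS_RESERVE_GB

print(f"Console: one unified pool of ~{console_game_pool:.1f} GB shared by game code, "
      "CPU-side data and graphics")

PC_SYSTEM_RAM_GB = 16
PC_VRAM_GB = 12
print(f"PC: {PC_SYSTEM_RAM_GB} GB system RAM + {PC_VRAM_GB} GB VRAM in separate pools; "
      "textures only count once they've been copied over PCIe into VRAM")
```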
 

OneMoreUser

<SNIP>
Not everyone is obsessed with RT-Ultra-Psycho-Nightmare-4k300. Plenty of people want something just good enough to be comfortably playable at medium-ish settings for the least amount of money possible.
Trouble is, even now some games may run but look like crap on 8GB cards, because 8GB has been sort of the minimum since 2020/2021. Check the recent video by Hardware Unboxed.

So in the future the issue will only get worse for those with low-memory cards, i.e. Nvidia "mid-range" ones.

There is no law that says a decent gaming PC can't be a $1K+ investment. In fact, in the early days there was no such thing as a $1K PC; even a PC with no audio and no 3D hardware cost $2K or more, so most people gamed instead on what were called home computers, or on gaming consoles.
 
It's interesting you mention Nvidia's performance in Stable Diffusion near the end of the article, because I'm half suspecting that Nvidia being stingy with VRAM is directly tied to Stable Diffusion and AI in general; they might want to boost sales of their high-end and professional GPUs for those applications...

Except for the recent high CPU usage after closing a game, caused by new drivers and quickly fixed, I can't really see how Nvidia drivers have been <Mod Edit> in recent years. Nvidia is generally quick with their certified drivers and hotfixes. AMD, on the other hand, spent a couple of months without updating the drivers for their older GPUs because the few folks who work on the drivers needed to focus exclusively on fixing the recently released RDNA3. Honestly, it's embarrassing for a company that wants to compete with the market leader, and it might turn off customers who expect decent support for their GPUs even when they're no longer the latest and greatest.

There have been several weird multi-display support issues going back years, as well as Vsync issues across multi-monitor setups. Also issues with GPU acceleration in browsers and Discord most recently. Not to mention the "I will never return to idle clocks again" bug that constantly crops up (AMD gets that one too, probably a power profile issue in Windows). In the Olde Days things seemed a little more stable; now it seems there's an issue every other release. Just because you never encounter them does not mean they don't exist. Some of us have our thumb a little closer to the pulse.
 

InvalidError

Trouble is, even now some games may run but look like crap on 8GB cards, because 8GB has been sort of the minimum since 2020/2021. Check the recent video by Hardware Unboxed.
You snipped a supremely important part of my post: $200-300.

If you want to push a steady 60+ fps out of an RTX 3050 or RX 6650, the GPU will be tapped out long before the 8GB of VRAM most of the time, and slapping 16GB of VRAM on it won't help much. Intel's A750 should be twice as powerful and may enter the performance territory where 12GB could make a relevant difference, but drivers are too inconsistent to really tell how much of the performance difference between the A750 and A770, beyond the 1/8th shader trim, is due to 8GB vs 16GB rather than just driver weirdness.
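
For reference, the 1/8th figure is just the public core counts:

```python
# Quick arithmetic behind the "1/8th shader trim": the A770 has 32 Xe-cores
# and the A750 has 28 (public specs), so the trim is 4/32 of the shaders.
a770_cores, a750_cores = 32, 28
trim = (a770_cores - a750_cores) / a770_cores
print(f"A750 vs A770: {a770_cores - a750_cores} of {a770_cores} Xe-cores cut "
      f"-> {trim:.1%} (~1/8) fewer shaders")
```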
 

PlaneInTheSky

I'd wager the average gaming PC has 16GB main RAM, if not more, to go with the 16GB VRAM, which would be the rough equivalent of 32GB for a console.

That's sadly not how it works.

For gaming, the assets taking up space are overwhelmingly textures.

You cannot avoid storing these textures in VRAM. They're useless sitting in system memory; they need to be in VRAM, in a format the GPU can sample directly, before the GPU can use them.

If you start swapping textures back and forth between system memory and VRAM, you get these texture pop-ins shown in that HWO video.
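
A rough sense of scale, with illustrative numbers only (uncompressed RGBA8 vs BC7 block compression, plus the ~1/3 overhead of a mip chain):

```python
# Rough texture footprint math; sizes are illustrative.
def texture_mb(side_px: int, bytes_per_pixel: float, with_mips: bool = True) -> float:
    base = side_px * side_px * bytes_per_pixel / (1024 ** 2)
    return base * (4 / 3 if with_mips else 1.0)   # a full mip chain adds ~1/3

# Uncompressed RGBA8 is 4 bytes/pixel; BC7 block compression is 1 byte/pixel.
for side in (2048, 4096, 8192):
    rgba = texture_mb(side, 4)
    bc7 = texture_mb(side, 1)
    print(f"{side}x{side}: ~{rgba:.0f} MB as RGBA8, ~{bc7:.0f} MB as BC7 (with mips)")
```

A few dozen unique high-resolution materials at those sizes and an 8GB pool fills up fast, which is where the swapping and pop-in starts.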

A possible solution is a unified architecture on PC similar to consoles.

PC gaming hardware increasingly looks like it's becoming a niche for the wealthy due to the high cost of overlapping components.
 

healthy Pro-teen

Yup. And with all those fancy mental gymnastics AMD is getting outsold 6:1. If they had the better product, it would be the other way around.
If they had a better product in every way, I'd say they would still be outsold 3:1. The RX 6600 is better in every way than the RTX 3050; do you know which GPU sold more?
 

InvalidError

It pretty much always was if your goal is to beat out the "console peasants."
The only reason I ever touched PC gaming is because 20+ years ago almost any odd $150 GPU could handle current games decently well with carefully chosen compromises, and that was still generally the case up to the GTX 1650 Super and RX 580 8GB. Wish I had gotten either one of those just before the crypto boom, when they could still be had new for sub-$150.
 

razor512

I think the lack of VRAM in current game titles is a problem, but I am not sure if the main reason is that we are not given enough VRAM or simply that game developers are not optimising games as they used to. Is it impossible to fit into an 8GB VRAM buffer? I don't think that is the case. Of course we can solve the problem with more VRAM, but the more you have, the less optimisation will be done by the developers. Eventually the problem will catch up with you even with 16GB of VRAM. Game titles in 2022 pretty much ran well, and with decent image quality, on 8GB of VRAM. There are a few titles that may need more than 8GB, but dropping the texture quality from Ultra to the next highest setting typically solves the problem. Moving to 2023, suddenly VRAM requirements jumped 50% or more, while visually games don't really look much better. So it just screams poor optimisation to me.

Most of the optimization focuses on GPU compute-related tasks, but there aren't many ways to reduce VRAM usage without compromising on texture quality or texture variety. If a game is designed first for the consoles and the consoles offer more VRAM, then the goal of the development team is to utilize the console's hardware to the fullest, and since the hardware is the same for everyone, there are some optimizations they can do there that PC can't do. For example, the performance of every component is known, so they can minimize how much of the game engine and other non-GPU-related data needs to be loaded, since they know how fast things can be loaded.
More VRAM means more unique textures, as well as use of some newer features of modern game engines, which make increased use of 8K+ textures that get loaded into VRAM; the GPU then dynamically generates resampled versions based on the PPI needed. Thus a texture can look great while you're wall hugging, without being a massive drain on GPU compute when the same wall is seen from the other side of the room.

Most of what has gone on is utilizing more VRAM to allow higher-quality assets to be used while minimizing the performance impact.
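
That "resampled versions based on the PPI needed" behaviour is essentially mipmapping. A minimal sketch of the idea, simplified to ignore anisotropy and texture streaming:

```python
import math

# Minimal sketch of mip selection: the GPU samples a lower-resolution copy of
# a texture when the surface covers fewer screen pixels, so an 8K source
# texture only pays its full cost when you're right up against the wall.
def mip_level(texture_px: int, screen_px_covered: int) -> int:
    return max(0, round(math.log2(texture_px / max(screen_px_covered, 1))))

for covered in (8192, 2048, 512, 64):
    level = mip_level(8192, covered)
    print(f"surface spans ~{covered} px on screen -> sample mip level {level} "
          f"({8192 >> level}x{8192 >> level})")
```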
 
Sometimes, that's exactly what you need to do. Time will tell if AMD gets egg on their face (again).
Doing so might entertain some people on the internet, but it usually doesn't really help them in sales.
Most of it is outdated mentality of consumers; all AMD has to do is consistently produce competitive GPUs and the market share will come. Just as they managed to do so with their CPUs.
Look at what has happened over the last 15 years. With CPUs, AMD became more competitive and started gaining market share. With GPUs, the more competitive they get, the more market share they seem to lose to Nvidia.
 
The only reason I ever touched PC gaming is because 20+ years ago almost any odd $150 GPU could handle current games decently well with carefully chosen compromises, and that was still generally the case up to the GTX 1650 Super and RX 580 8GB. Wish I had gotten either one of those just before the crypto boom, when they could still be had new for sub-$150.
Which works well if you're the kind of person who just wants to play games and doesn't chase the high preset at 60 FPS all the time. Heck, I remember when I got started playing games on a PC, I was fine with 10 FPS at times.

But I feel like the expectation for a lot of people is "I want my 1080p 120 FPS uber quality for $300." And even if we tempered the FPS expectation, the only time I can recall that ever being delivered was with the 8800 GT.

Really funny when a "consumer" takes issue with reduction in prices.
The only time they do is when they just bought the card.
 