News Report: Big Navi Engineering Sample Points to 16 GB GDDR6 Memory


nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
I feel AMD adds more RAM than is required to some of their GPUs as a pure marketing gimmick, even if the card has no hope of ever using it all.
I think it's a calculated move to make the card attractive to cryptocurrency miners. It's an insurance policy against being badly beaten by Nvidia. The card will still sell at a high price point even if gaming performance isn't anywhere near the RTX 3070's.
12GB should be good for years to come. By the time we've thoroughly saturated 8GB, a new generation will already have been released.
Yeah. E.g., say five years in the future you want to use a hypothetical AMD 16GB card: even if that's still excessive VRAM at, say, 1080p or 1440p, the card will be held back by other things.

Guys, in case you haven't noticed yet: we already saturate even 10GB of VRAM gaming at 4K in some games, and 8GB of VRAM is simply not an option today AT ALL.

Take COD: Warzone, for example. With the highest settings at 4K it uses more than 10GB of VRAM, reaching 10,800MB, and on some maps it uses more than 11GB, so the 2080 Ti drops to 10fps from spilling into system RAM.

COD at 1440p already uses more than 9GB of VRAM...

And this is NOW, not a few years in the future.

AMD is actually doing it RIGHT, and Nvidia is cutting VRAM to corner AMD on pricing!

The RTX 3070 is faster than the RTX 2080 Ti, and the 11GB is already being used! How on earth is Nvidia releasing the RTX 3070 with only 8GB of VRAM?

The answer: cornering AMD. Many uneducated buyers (and this thread proves it) won't notice this at all, because most of them never tried an RTX 2080 Ti to begin with, never tried maximum settings, and are happy with 8GB of VRAM...

Here is COD: Warzone VRAM usage:

1440p: >9GB
4K: >10GB

View: https://www.youtube.com/watch?v=YLyUKJ9I_oo

Now imagine what 2021 games will need!

Even the 3080 with 10GB of VRAM is a DISASTER; COD needs more than 10GB at 4K!

AMD IS DOING IT RIGHT (16GB VRAM); NVIDIA IS CHEATING YOU ALL.
 

escksu

Reputable
BANNED
Aug 8, 2019
877
353
5,260
Guys, in case you haven't noticed yet: we already saturate even 10GB of VRAM gaming at 4K in some games, and 8GB of VRAM is simply not an option today AT ALL. [...] AMD IS DOING IT RIGHT (16GB VRAM); NVIDIA IS CHEATING YOU ALL.

Actually, 8GB and 10GB are workable.

Remember how the Fury got by with just 4GB? AMD did a lot of work on swapping textures in and out of the 4GB buffer so it wouldn't impact performance.

Nvidia can do the same.
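To be fair, the general idea there is just a residency cache with an eviction policy. Here's a toy Python sketch of an LRU texture budget; it's only an illustration of the concept, not AMD's actual driver logic, and the texture names and sizes are made up:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU residency cache: keeps hot textures inside a fixed VRAM
    budget and evicts the least-recently-used ones back to system RAM."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.resident = OrderedDict()  # texture name -> size in MB, coldest first

    def request(self, name, size_mb):
        if name in self.resident:              # hit: mark as recently used
            self.resident.move_to_end(name)
            return "hit"
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, freed = self.resident.popitem(last=False)  # evict coldest
            self.used_mb -= freed
        self.resident[name] = size_mb          # upload over PCIe (the slow part)
        self.used_mb += size_mb
        return "miss (uploaded)"

cache = TextureCache(budget_mb=4096)  # a Fury-sized 4GB budget
for tex, mb in [("terrain", 1500), ("buildings", 1200), ("skins", 900), ("fx", 1100)]:
    print(tex, cache.request(tex, mb))
```

The expensive part in real life is the PCIe upload on a miss, which is why drivers work hard to predict which textures stay hot.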
 
Guys, in case you haven't noticed yet: we already saturate even 10GB of VRAM gaming at 4K in some games, and 8GB of VRAM is simply not an option today AT ALL. [...] AMD IS DOING IT RIGHT (16GB VRAM); NVIDIA IS CHEATING YOU ALL.
VRAM used does not mean VRAM required. So, although interesting, that video doesn't prove anything, as it could just be the game grabbing whatever VRAM is available. Until we see actual benchmarks using a 3080, we don't know whether 10GB will actually cause any performance impact.
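For anyone who wants to see the allocated-vs-required distinction for themselves: NVIDIA's NVML reports how much VRAM is currently allocated on the device, which is an upper bound on what a game actually needs frame to frame. A minimal Python sketch, assuming the pynvml bindings are installed (e.g., pip install nvidia-ml-py3) and an NVIDIA GPU is present:

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Poll total device memory allocation once per second while a game runs.
# Note: this is memory *allocated* on the device, not memory the game
# strictly needs; engines often grab extra as a cache.
for _ in range(10):
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"used {info.used / 2**20:.0f} MiB of {info.total / 2**20:.0f} MiB")
    time.sleep(1)

pynvml.nvmlShutdown()
```

The number it prints is allocation, not need; a game can grab most of the card as a cache without ever being bottlenecked by it.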
 

olin9

Distinguished
Feb 20, 2008
403
1
18,865
I have had mostly AMD GPUs. My 5700 XT Red Devil is fast and meets my needs, but the drivers are so bad there is no way I am getting the next one. My PC crashes 10 times a day. With my 3700X CPU and my old GPU, an R9 390X, it never crashed.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
VRAM used does not mean VRAM required. So, although interesting, that video doesn't prove anything, as it could just be the game grabbing whatever VRAM is available. [...]

Doesn't prove anything? I told you: in some cases even 11GB is all used up and the game drops to 10fps from lack of VRAM, even on an RTX 2080 Ti.

Second, the RTX 3070 is definitely a no with only 8GB of VRAM.

And this is not about benchmarks; this is about testing maximum VRAM usage, and no benchmark or review will test this. They all want to keep Nvidia happy.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
Actually, 8GB and 10GB are workable. Remember how the Fury got by with just 4GB? AMD did a lot of work on swapping textures in and out of the 4GB buffer so it wouldn't impact performance. Nvidia can do the same.

I am talking about 4K gaming at max detail. "Possible" is one thing; playing at max detail without any problems is another...

The RTX 2080 Ti is already slowing down on some maps; no texture swapping helped... And it's funny how you say 8 and 10 as if there weren't a huge difference between them...

And still, AMD is doing it RIGHT with 16GB!
 
Doesn't prove anything? I told you: in some cases even 11GB is all used up and the game drops to 10fps from lack of VRAM, even on an RTX 2080 Ti. [...]
At what point in that video does it show the drop to 10fps?
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
At what point in that video does it show the drop to 10fps?

The video shows the memory usage; the drop to 10fps happens, and some people have reported it in forums.

I am giving you advice: don't buy 8-10GB VRAM cards that are aimed at 4K gaming. Take it or don't; it's your choice. And don't be shocked when Nvidia releases an RTX 3070 with 16GB of VRAM soon, and then say no one warned you. Lenovo already leaked an upcoming RTX 3070 with 16GB of VRAM by mistake. Nvidia just wants to milk unaware buyers.
 
The video shows the memory usage; the drop to 10fps happens, and some people have reported it in forums. [...]
I'm not saying you are wrong, but equally you are making bold statements without providing reliable evidence. VRAM usage does not mean VRAM required, and I'm not taking something you read and repeated from another forum as fact. I was hoping you were trying to show a video of the exact issue. It's going to be very interesting to see the reviews, and I expect there will be a lot of focus on 4K. Assuming there is a problem, that would be a massive mistake by Nvidia.

I'm going to try to secure a 3080 on the 17th. I predominantly game at 1440p, but on occasion I use a 4K 120Hz TV. If the benchmarks show a VRAM problem at 4K, it will be going back, which I can do in the UK at no cost.
 
Doesn't prove anything? I told you: in some cases even 11GB is all used up and the game drops to 10fps from lack of VRAM, even on an RTX 2080 Ti.
You posted one instance of a game whose engine has an option to fill VRAM anyway (probably for shader caching) to prove that somehow the GeForce 30 series is a "disaster" because NVIDIA couldn't be bothered to add more VRAM. Last I checked, Call of Duty is not the be-all and end-all of gaming.

Let's wait until independent benchmarks with a variety of games come out before claiming a disaster.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
I'm not saying you are wrong, but equally you are making bold statements without providing reliable evidence. [...] I'm going to try to secure a 3080 on the 17th. I predominantly game at 1440p, but on occasion I use a 4K 120Hz TV. [...]

The RTX 3080 is okay because it is at least 10GB, and very few games will need more than 10GB. But avoid any, ANY, 8GB card...
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
You posted one instance of a game whose engine has an option to fill VRAM anyway (probably for shader caching) [...] Let's wait until independent benchmarks with a variety of games come out before claiming a disaster.

We don't need to wait; the RTX 2080 Ti is already here. My advice to you: avoid any card that comes with only 8GB of VRAM if you intend to play at 4K with maximum settings.
 

Shadowclash10

Prominent
May 3, 2020
184
46
610
Guys, in case you haven't noticed yet: we already saturate even 10GB of VRAM gaming at 4K in some games, and 8GB of VRAM is simply not an option today AT ALL.
I have a 2060 Super with 8GB and I can run COD at 1440p just fine, thank you very much. It IS an option. Is it ideal? No. Is 16GB just right? No. 12GB is the sweet spot for anything on the level of a 3070 or 3080; 16GB is ridiculously excessive. You are just pointing out the outliers. E.g., the 2080 Ti can do 4K60 fine. Can it run Metro Exodus at 4K Ultra consistently? No. Can it run Metro at 4K60 High consistently? Yes. Can it run most games at 4K60 Ultra? Yes. When aiming for 100% every time, you will need to either be satisfied 95% of the time or pony up $$$.
 

King_V

Illustrious
Ambassador
Maybe. For example, if we look at the 1060 6GB vs the 580 8GB: very similar performance, but has there ever been a scenario with playable FPS where the extra 33% of RAM on the 580 was beneficial? Not that I have seen.

I still think it's more to do with giving their GPUs a selling point over Nvidia that can be used for marketing. Just my opinion though.

I don't know if it's ever benefited from having 8GB vs 6GB specifically, but maybe the architecture makes it difficult to offer a 4GB model and a 6GB model, rather than a 4GB model and an 8GB model (see the arithmetic sketch below).

There are most definitely instances, for example, where the 3GB of the lower 1060 model cripples it. It's outperformed by the RX 570 4GB, and sometimes that 3GB means that, in a couple of games, the 1050 Ti with 4GB will equal (or slightly exceed) the performance of the 1060 3GB.
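For what it's worth, the pairing of capacities with bus widths is mostly chip arithmetic: each GDDR5 chip contributes 32 bits of bus width and commonly comes in 4Gb (0.5GB) or 8Gb (1GB) densities, so the bus width fixes the chip count and therefore the capacity options. A quick Python illustration (standard catalog densities, nothing specific to these cards):

```python
# Each GDDR5 chip is 32 bits wide; common densities are 4 Gb (0.5 GB)
# and 8 Gb (1 GB) per chip. Bus width therefore fixes the chip count,
# and the chip count fixes which total capacities are possible.
def capacity_options(bus_width_bits, densities_gb=(0.5, 1.0)):
    chips = bus_width_bits // 32
    return {f"{chips} x {d} GB chips": chips * d for d in densities_gb}

print(capacity_options(256))  # RX 580, 256-bit: 8 chips -> 4 GB or 8 GB
print(capacity_options(192))  # GTX 1060, 192-bit: 6 chips -> 3 GB or 6 GB
```

So a 6GB option on the 580's 256-bit bus would need mixed chip densities, which vendors generally avoid.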
 
New consoles have 8-10GB of VRAM for developers to play with, so we PC gamers will likely need about 16-20GB when those console ports start hitting shelves.
That could very well be. There's gotta be a reason why the RX 5700 XT has 8GB of GDDR6, because unlike the RX 580, the 5700 XT CAN use it all.
Actually, 8GB and 10GB are workable. Remember how the Fury got by with just 4GB? AMD did a lot of work on swapping textures in and out of the 4GB buffer so it wouldn't impact performance. Nvidia can do the same.
The thing is that the Fury could swap the textures because HBM1 was so insanely fast, with its gargantuan 4096-bit VRAM bus. Even HBM2 in the Radeon VII has only half of that, so I think it might have been unique to the Fury line.

Case in point: I was able to run Superposition at 4K Optimized even though it warned me that it needed more than 4GB of VRAM. The HBM managed to handle it, and nothing else would have; otherwise Superposition wouldn't have said anything. There would either be a big performance hit or a crash with anything else that had 4GB or less of GDDR5 or GDDR6. Even GDDR6X can't compare with HBM on width, because it only has a 384-bit VRAM bus; that's less than a tenth the bus width of HBM. It's just not the same thing, and this is probably the only case in which HBM offered a real advantage over GDDR6, because games weren't designed to operate with a VRAM bus that wide and couldn't take advantage of it. This is why I agree that HBM variants aren't useful on gaming cards. It's kind of like how the FX CPU was badly underused because games didn't (and most still don't) use all 8 cores.

Releasing new tech that is so over-the-top that no software will use it is a complete waste of time and money. That was the 8-core CPU back in 2011, and it's HBM back in 2015 (and even now).
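For rough context on that bus-width comparison: peak theoretical bandwidth is bus width times per-pin data rate, so a narrow-but-fast bus can claw back most of the raw numbers. A quick sketch; the data rates below are approximate launch specs quoted from memory, used only for illustration:

```python
# Peak theoretical bandwidth (GB/s) = bus width in bits / 8 * per-pin rate (Gbps).
# Figures are approximate launch specs, for illustration only.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

cards = [
    ("R9 Fury X (HBM1, 4096-bit @ ~1 Gbps)",    4096, 1.0),
    ("RTX 3080 (GDDR6X, 320-bit @ ~19 Gbps)",    320, 19.0),
    ("RTX 3090 (GDDR6X, 384-bit @ ~19.5 Gbps)",  384, 19.5),
]
for name, bus, rate in cards:
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.0f} GB/s")
# ~512, ~760, ~936 GB/s: GDDR6X recovers most of the raw bandwidth despite
# the much narrower bus, even if the Fury's per-clock width was unique.
```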
 
I have had mostly AMD GPUs. My 5700 XT Red Devil is fast and meets my needs, but the drivers are so bad there is no way I am getting the next one. My PC crashes 10 times a day. [...]
If you're running a Ryzen system, I know EXACTLY how to solve your problem. Just go to amd.com and update your chipset drivers, because that will fix almost all of your problems.

There's a guy from Portugal who has a YouTube channel called "Ancient Gameplays". He has a video that explains exactly how to fix all of the driver issues on the RX 5700 XT, and I checked it out because, like all the review sites, I never experienced any of them. I found out from his video that it's because I (like all the review sites) always update my chipset drivers (and BIOS, for that matter), while most people don't even know that chipset drivers exist and need to be updated. This should be incredibly helpful to you:
View: https://www.youtube.com/watch?v=F1dQoJtkI-c

I hope that this helps you to love your RX 5700 XT as much as I love mine! :D