News: RTX 5070 allegedly delayed until early March to counter AMD RX 9070 launch

8 years after the 11 GB 1080 Ti, it's nothing short of disappointing that we're witnessing the release of 12 GB GPUs, like the 5070.

With AAA gaming becoming more demanding with each passing day, 16 GB should be a bare minimum for graphics cards, IMHO.
 
8 years after the 11 GB 1080 Ti, it's nothing short of disappointing that we're witnessing the release of 12 GB GPUs, like the 5070.

With AAA gaming becoming more demanding with each passing day, 16 GB should be a bare minimum, IMHO.
On the other hand, the first game directly impacted by not having 12GB of vRAM available (at UHD render resolution) - The Last Circle - was only just released, whilst the first GPU with 12GB of vRAM came out a decade ago (2015's GTX Titan X). Real-world vRAM requirements in games have just not scaled up that much over time, and per-die DRAM bandwidth has continued to grow without needing to be brute-forced with wider bus widths. Widening the bus requires more DRAM dies, and more dies means more capacity by default, because leading-edge DRAM dies are only made down to a certain minimum density.
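To put rough numbers on that coupling: each GDDR die sits on a 32-bit channel, so the bus width fixes the die count and, with it, the capacities a card can ship with. A back-of-the-envelope sketch (the 2 GB and 3 GB die densities are illustrative assumptions, not tied to any particular SKU):

```python
# Rough sketch: how bus width pins down the possible VRAM capacities.
# Each GDDR6/GDDR7 die exposes a 32-bit interface; the die densities here
# (2 GB and 3 GB) are assumed for illustration.
def possible_capacities(bus_width_bits: int, die_sizes_gb=(2, 3)) -> list[int]:
    dies = bus_width_bits // 32  # one die per 32-bit channel
    return [dies * size for size in die_sizes_gb]

for bus in (128, 192, 256, 384):
    print(f"{bus}-bit bus -> {possible_capacities(bus)} GB")
# 128-bit -> [8, 12], 192-bit -> [12, 18], 256-bit -> [16, 24], 384-bit -> [24, 36]
```

Clamshell mode (two dies sharing a channel) can double those figures, which is how 16GB variants of 128-bit cards exist, but the baseline coupling stands.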
 
Looks like Nvidia wants to try to steal some of AMD's thunder. But considering the state of the launched 50 series products, I find it more likely Nvidia might end up helping AMD rather than hurting them. The 50 series cards have been so disappointing. Unless AMD really botched their GPUs, this may well play out in their favor. Plus, hearing rumors AMD might actually launch a higher-end SKU now, I suspect their 'refresh' chips might have more oomph than Nvidia would like.
 
On the other hand, the first game directly impacted by not having 12GB of vRAM available (at UHD render resolution) - The Last Circle - was only just released, whilst the first GPU with 12GB of vRAM came out a decade ago (2015's GTX Titan X). Real-world vRAM requirements in games have just not scaled up that much over time, and per-die DRAM bandwidth has continued to grow without needing to be brute-forced with wider bus widths. Widening the bus requires more DRAM dies, and more dies means more capacity by default, because leading-edge DRAM dies are only made down to a certain minimum density.
Hardly. 12GB of VRAM even at 1440P has been problematic since the launch of the 40 series cards, and even longer at 4K/UHD. Granted, it's not all games, but it's too many to be ignored. The simple truth is 12 GB is not enough VRAM for modern gaming. Games like Ratchet & Clank: Rift Apart can exceed 12GB of VRAM at 1080P with max settings. Honestly, 16GB should be the bare minimum cards launch with in this day and age, IMO.
 
Games like Ratchet & Clank: Rift Apart can exceed 12GB of VRAM at 1080P with max settings.

Yep. And the Resident Evil 4 remake requires 13.73 GB of VRAM at max 1080p settings.

That, along with 2-3 more games, was the very reason I decided to get rid of my 4070 Ti back in 2023 and buy a 4090.

I was getting CTDs with Direct3D fatal errors due to my card's insufficient VRAM.

Haven't encountered any problems ever since.

Memory bandwidth sure is helpful, but it's not enough to save you by itself.
 
Hardly. 12GB of VRAM even at 1440P has been problematic since the launch of the 40 series cards, and even longer at 4K/UHD. Granted, it's not all games, but it's too many to be ignored. The simple truth is 12 GB is not enough VRAM for modern gaming. Games like Ratchet & Clank: Rift Apart can exceed 12GB of VRAM at 1080P with max settings. Honestly, 16GB should be the bare minimum cards launch with in this day and age, IMO.
I have a 3060 12GB. A comment I read this week stated that 8 GB of VRAM was not enough, or barely enough, to run at 1080p. Since that is my monitor setup, I fired up the most demanding game in my library, Shadow of the Tomb Raider, to see what it was demanding. I wanted to see for myself what the score was.

This was a game which did run on my previous 1060 6GB, but nowhere near max settings. It was a game of compromises between decent refresh rates and eye candy. Lots of sacrifices. I would rather have the game be playable, with decent refresh rates and no stuttering, than have lots of eye candy with the FPS yo-yoing, which is noticeable.

So I fired up the 3060 and turned everything up to max at 1080p. My monitor is limited to 60Hz, so that is my refresh rate. It did run at max and at 60 FPS. Memory usage was at nearly 7 GB, though. And that at 1080p.
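For anyone who wants to log this themselves rather than eyeball an overlay, something like the following works - a quick sketch using the pynvml bindings (nvidia-ml-py package; polling interval arbitrary). Bear in mind it reports what the driver has handed out, not what the game strictly needs:

```python
# Quick sketch: poll overall VRAM allocation once a second while a game runs.
# Requires the nvidia-ml-py package. Reports driver-level allocation,
# not what the game strictly needs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```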

More and more people are wanting to move up to 1440p. I am one of them. With the upcoming 5060 and its 8 GB, I think that is not enough for me. But then again, I am going AMD this time around no matter what the specs say. Linux is easier to deal with on an AMD card.

Methinks the 5060, with its 8GB of VRAM, will barely handle 1440p, at sub-60 FPS. Lots of people are going to be disappointed.
 
I have a 3060 12GB. A comment I read this week stated that 8 GB of VRAM was not enough, or barely enough, to run at 1080p. Since that is my monitor setup, I fired up the most demanding game in my library, Shadow of the Tomb Raider, to see what it was demanding. I wanted to see for myself what the score was.

This was a game which did run on my previous 1060 6GB, but nowhere near max settings. It was a game of compromises between decent refresh rates and eye candy. Lots of sacrifices. I would rather have the game be playable, with decent refresh rates and no stuttering, than have lots of eye candy with the FPS yo-yoing, which is noticeable.

So I fired up the 3060 and turned everything up to max at 1080p. My monitor is limited to 60Hz, so that is my refresh rate. It did run at max and at 60 FPS. Memory usage was at nearly 7 GB, though. And that at 1080p.

More and more people are wanting to move up to 1440p. I am one of them. With the upcoming 5060 and its 8 GB, I think that is not enough for me. But then again, I am going AMD this time around no matter what the specs say. Linux is easier to deal with on an AMD card.

Methinks the 5060, with its 8GB of VRAM, will barely handle 1440p, at sub-60 FPS. Lots of people are going to be disappointed.
The XX60 series has been a 1080p-class card for at least the last decade. They can be used at 1440p, but they aren’t intended for it.

When the 2060 Super came out, everyone said to skip the 2070 for 1440p and buy the 2060 Super instead. The same thing happened with the 3060 Ti. The problem is, they very quickly became inadequate for 1440p gaming.

The RTX 5060 is intended for 1080p. It'll do that adequately. You want more VRAM so that you can play at 1440p? Buy a 5070. Want even more? Buy a 5070 Ti. Or an AMD card; it's up to you.
 
I just can't help but feel that the 90X0 lineup is going to be smooth and polished. AMD has patiently waited and been rewarded with news after news of Nvidia falling on its face.
 
I just can't help but feel that the 90X0 lineup is going to be smooth and polished. AMD has patiently waited and been rewarded with news after news of Nvidia falling on its face.
Never underestimate AMD's ability to mess up. I'm cautiously optimistic, but if they release the 9070 XT at $700, they've missed the point.

That being said, I'm still pretty happy with my 7800 XT.
 
Yep. And the Resident Evil 4 remake requires 13.73 GB of VRAM at max 1080p settings.

That, along with 2-3 more games, was the very reason I decided to get rid of my 4070 Ti back in 2023 and buy a 4090.

I was getting CTDs with Direct3D fatal errors due to my card's insufficient VRAM.

Haven't encountered any problems ever since.

Memory bandwidth sure is helpful, but it's not enough to save you by itself.
I still wait for the day my 4070 Ti will magically just stop being able to play newly released games at 1440p ultra settings. So far, it hasn't happened. Maybe you did something wrong or something was wrong with your specific card, but it's complete, utter bulls that it doesn't game well at 1440p. But what do I know. I just own the card and use it every day for demanding titles. I told you that before and you are still lying through your teeth, though, and the sheep believe you, so what does it even matter.
 
I told you that before and you are still lying through your teeth, though, and the sheep believe you, so what does it even matter.

Like I told you in the past, I respect both your view and the fact that you chose this particular GPU.

However, I'm not gonna disregard my personal experience on the matter, just because you think I'm lying.

You're still being disrespectful, even though I've never insulted you, not once.

Have a good one.
 
I have a 3060 12GB. A comment I read this week stated that 8 GB of VRAM was not enough, or barely enough, to run at 1080p. Since that is my monitor setup, I fired up the most demanding game in my library, Shadow of the Tomb Raider, to see what it was demanding. I wanted to see for myself what the score was.

This was a game which did run on my previous 1060 6GB, but nowhere near max settings. It was a game of compromises between decent refresh rates and eye candy. Lots of sacrifices. I would rather have the game be playable, with decent refresh rates and no stuttering, than have lots of eye candy with the FPS yo-yoing, which is noticeable.

So I fired up the 3060 and turned everything up to max at 1080p. My monitor is limited to 60Hz, so that is my refresh rate. It did run at max and at 60 FPS. Memory usage was at nearly 7 GB, though. And that at 1080p.

More and more people are wanting to move up to 1440p. I am one of them. With the upcoming 5060 and its 8 GB, I think that is not enough for me. But then again, I am going AMD this time around no matter what the specs say. Linux is easier to deal with on an AMD card.

Methinks the 5060, with its 8GB of VRAM, will barely handle 1440p, at sub-60 FPS. Lots of people are going to be disappointed.
Remember that reported vRAM consumption is not required vRAM consumption. Any game engine worth its salt will stuff every texture it can find into vRAM until vRAM is full, because empty vRAM is wasted vRAM, and evicting cached textures to overwrite them with working data (e.g. a framebuffer) carries no more penalty than storing that same data in empty vRAM. Ideally you should never see your card's vRAM anything other than full to the brim, though depending on how the game world is segmented the engine may run out of textures to load.
Required vRAM, by contrast, covers not just opportunistically cached textures but the textures and other data (e.g. framebuffers and manipulated geometry) whose absence directly impacts performance. This is where The Last Circle fell: the memory in active use for rendering went above 12GB.

The problem for testing is that the driver does not report the distinction between opportunistically cached data and data in active use (that's a distinction made at the engine level), so just going by "x GB of vRAM in use" does not provide a useful metric for assessing potential performance impact.
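For a concrete illustration of that gap outside of a game, PyTorch's caching allocator behaves the same way as those engines - a sketch, assuming a CUDA build of PyTorch is installed:

```python
# Sketch: driver-reported allocation vs. memory actually backing live data,
# using PyTorch's caching allocator as a stand-in for an engine's texture cache.
import torch

x = torch.empty(1024, 1024, 256, device="cuda")  # ~1 GiB of working data
del x  # the tensor is gone, but the allocator keeps the block cached

active = torch.cuda.memory_allocated()   # bytes backing live tensors: ~0
reserved = torch.cuda.memory_reserved()  # bytes held from the driver: ~1 GiB
print(f"active: {active / 2**30:.2f} GiB, reserved: {reserved / 2**30:.2f} GiB")
# nvidia-smi and similar tools only ever see the 'reserved' figure, which is
# why "x GB in use" overstates what the application actually requires.
```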