Question: Should I buy an RTX 3060 Ti?

Apr 22, 2023
Hi, I'm currently saving up to purchase an RTX 3060 Ti in the near future, but I'm not sure which model fits my build, or whether another GPU would be better for me. My specs are:

CPU: Intel Core i7-7700
GPU: Asus - Nvidia GTX 1050
Motherboard: ASRock Fatal1ty Z270 Gaming K4
RAM (DDR4): 2x4GB Kingston HyperX + 2x8GB G.SKILL Trident Z RGB
CPU fan: Corsair iCUE H115i RGB PRO XT
PSU: Seasonic Prime GX-750

I also intend to upgrade my CPU and motherboard later, so I need a GPU that will be usable for the next 4-5 years. Is it safe to buy the RTX 3060 Ti this year? (Note: I usually use my computer for casual gaming, 3D modelling and video editing.) Thank you very much!
 
The thing about the 3060 Ti is the 8GB of VRAM. For casual gaming you may be OK if you are willing to turn settings down, but many new games coming out want more than 8GB.

Personally I wouldn't go any lower than the AMD RX 6700 XT or 6750 XT. But for not a lot more you can get a 6800 or 6800 XT.

The 6700 XT is a 12GB card; the other two are 16GB. Newer games are starting to use more than 8GB of VRAM. If you start with the 8GB card, you may find yourself making compromises sooner rather than later. A card like the 6800 XT would perform similarly to an RTX 4070, have more VRAM, and last a bit longer. Of course you'll pay more now, but that would be a safer bet in my opinion.
 
While AMD offers raw performance on par with Nvidia for less money, Nvidia offers much greater performance in the handful of games that support ray tracing and DLSS. Meanwhile, Nvidia suffers in a handful of games due to lack of VRAM. The performance gains from DLSS when ray tracing is enabled are where the extra money comes in on Nvidia's premium price. So you will have to ask yourself whether it is worth it to you.

I would personally wait until later in the summer to purchase a mid-range GPU such as the RTX 4060 or 4060 Ti. DLSS 3 should be worth the wait if you are interested in ray tracing and DLSS and you aren't discouraged by 8GB of VRAM. AMD might have their mid-range out by then as well.
 

IDProG

Distinguished
Hi, I'm currently saving up to purchase an RTX 3060 Ti in the near future, but I'm not sure which model fits my build, or whether another GPU would be better for me. My specs are:

CPU: Intel Core i7-7700
GPU: Asus - Nvidia GTX 1050
Motherboard: ASRock Fatal1ty Z270 Gaming K4
RAM (DDR4): 2x4GB Kingston HyperX + 2x8GB G.SKILL Trident Z RGB
CPU fan: Corsair iCUE H115i RGB PRO XT
PSU: Seasonic Prime GX-750

I also intend to upgrade my CPU and motherboard later, so I need a GPU that will be usable for the next 4-5 years. Is it safe to buy the RTX 3060 Ti this year? (Note: I usually use my computer for casual gaming, 3D modelling and video editing.) Thank you very much!
The answer is NO. 8GB VRAM GPUs overall are on a timer.

No matter how powerful the GPU is, if it has 8GB of VRAM or less, it will be unusable in 2-3 years unless you want to play at 1080p Low settings.

My two cents:
Do not buy a 3060 Ti.
Do not buy a 4060 or 4060 Ti.
Buy a 16GB VRAM GPU. THAT will last for 4-5 years.
 

sansnom11

Distinguished
The answer is NO. 8GB VRAM GPUs overall are on a timer.

No matter how powerful the GPU is, if it has 8GB of VRAM or less, it will be unusable in 2-3 years unless you want to play at 1080p Low settings.

My two cents:
Do not buy a 3060 Ti.
Do not buy a 4060 or 4060 Ti.
Buy a 16GB VRAM GPU. THAT will last for 4-5 years.
Unusable in 2-3 years?

Lmao

Technology doesn't move that fast. The 3060 Ti currently runs most AAA titles at 1440p and 60 FPS on high settings.

In 3 years you will still easily max out most games at 1080p.
 

IDProG

Distinguished
Unusable in 2-3 years?

Lmao

Technology doesn't move that fast. The 3060 Ti currently runs most AAA titles at 1440p and 60 FPS on high settings.

In 3 years you will still easily max out most games at 1080p.
Spec-wise, yes, you are correct. The 3060 Ti is capable of 1440p gaming.

However, its VRAM is lacking. The moment developers drop support for the PS4/Xbox One (which is around 2 years from now), 16GB of VRAM will be required for 1440p and 4K gaming.

You just bought a 3060 Ti, so I understand why you feel the need to go against what I said.
 
To add to what was already said, new games such as The Last of Us and Hogwarts Legacy are already having issues at 1080p, much less 1440p. Hogwarts Legacy was shown to use about 10GB of VRAM at 1080p with things turned up. I would expect that trend to continue.

In 3-4 years you may be doing well to get 1080p medium if publishers don't optimize their games.
 
To add to what was already said, new games such as The Last of Us and Hogwarts Legacy are already having issues at 1080p, much less 1440p. Hogwarts Legacy was shown to use about 10GB of VRAM at 1080p with things turned up. I would expect that trend to continue.

In 3-4 years you may be doing well to get 1080p medium if publishers don't optimize their games.
I have a 3060 Ti, have played Hogwarts Legacy, and am currently playing TLOU at max settings at 1080p, and I have not had a single issue. I do understand, though, that only 8GB of VRAM was a massive oversight from Nvidia.
 

IDProG

Distinguished
I have a 3060 Ti, have played Hogwarts Legacy, and am currently playing TLOU at max settings at 1080p, and I have not had a single issue. I do understand, though, that only 8GB of VRAM was a massive oversight from Nvidia.
Oh, believe me, my friend, it is NOT an oversight.

Nvidia knew exactly what they were doing. They prepared a 16GB version of the 3070 and a 20GB version of the 3080, but scrapped them because most of the buyers were going to be crypto miners anyway, who didn't need a lot of VRAM.
 

KyaraM

Admirable
I would save up a little more and get a 4070, or maybe a 6800 XT if you don't care about RT and energy consumption. They aren't that much more expensive and should last you a while.

On the VRAM topic, both my friend's 3060 Ti and my old 3070 Ti run everything quite well even with RT, including Hogwarts Legacy, at 1440p. With my 4070 Ti, I naturally have even fewer issues. TLoU is horribly unoptimized, including its memory use (20% of VRAM reserved for the OS/CPU? The heck? At least make the effort not to straight up copy-paste your console memory usage to the PC port!), and is most definitely NOT anything anyone should look at for future predictions. Hogwarts Legacy also fixed its memory issues quite a while ago, which apparently still hasn't registered with people...
 

boju

Titan
Ambassador
However, its VRAM is lacking. The moment developers drop support for the PS4/Xbox One (which is around 2 years from now), 16GB of VRAM will be required for 1440p and 4K gaming.

4K ultra plus the highest AA settings possible might scratch 10GB today. In a couple of years that might change, but that's the thing about PC gaming vs. consoles: you can adjust things without really impacting image quality all that much. Antialiasing, for example, can be a VRAM hog, and if you have a 1440p or higher-res monitor of around 27" it's not really needed. I don't use it.

So I doubt 16GB will be a requirement for 4K, let alone 2.5K, anytime soon, or even in the next few years for any title, unless you want to max everything. We'll see.

I guess it's hard for young folk these days to really compare the graphics of today between settings; it's just a full-bore-or-nothing mentality. Not like the old days and what we had to deal with, which still looked great for its time, aka Wolfenstein 3D and Doom lol. Compare them to now...
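To put rough numbers on the antialiasing point, here is a back-of-the-envelope sketch in C++. The bytes-per-pixel figures (RGBA8 color, 32-bit depth) and the idea of costing only the basic color/depth targets are simplifying assumptions for illustration, not measurements from any real game:

```cpp
#include <cstdio>
#include <initializer_list>

// Rough VRAM cost of just the color + depth render targets, with MSAA.
// Assumes 4 bytes/pixel color (RGBA8) and 4 bytes/pixel depth; real
// engines add G-buffers, shadow maps, post-processing targets, etc.
static double targetMiB(int width, int height, int samples) {
    const double bytesPerPixel = 4.0 /*color*/ + 4.0 /*depth*/;
    return width * height * samples * bytesPerPixel / (1024.0 * 1024.0);
}

int main() {
    struct Res { const char* name; int w, h; };
    for (Res r : {Res{"1080p", 1920, 1080}, Res{"1440p", 2560, 1440},
                  Res{"4K", 3840, 2160}})
        for (int s : {1, 4, 8})
            std::printf("%-5s %dx MSAA: %7.1f MiB\n",
                        r.name, s, targetMiB(r.w, r.h, s));
}
```

Even at 4K with 8x MSAA that works out to roughly half a gigabyte for the basic targets alone, which is why dialing AA back is one of the cheaper image-quality compromises.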
 

Found a good read about Hogwarts Legacy where they went through a lot of stuff about RT, VRAM, etc. I will say, when I look on Newegg the cheapest 3060 Ti is $389. If I look at the RX 6800 (non-XT) with 16GB of VRAM, there's one for $469. But for $529 you can have a 6800 XT, also with 16GB of VRAM.

For me that's the thing: $389 for a 3060 Ti on the way out, or spend a little more for cards that should be stronger and last a little longer. I think for $399 you can get an Intel Arc A770 16GB as well. While Intel isn't quite at Nvidia's level, they seem to keep improving.
 

boju

Titan
Ambassador
It's 2023 and Nvidia is still trying to sell us a $450 GPU with only 8GB of VRAM to play the same games that a $400 console plays with 16GB. :poop:

That doesn't matter. Thermal constraints mean you'll see the same performance levels you'd expect from laptops. The only thing consoles have going for them in terms of performance is optimisation; otherwise the devs will be spanked not only by the player base but by the console manufacturer, plus lawsuits. PC hardware is so much more capable if allowed the time, but these days time is money, so quality is out the door most times. This is why I wait at least a month before buying a new game.
 

KyaraM

Admirable

Found a good read about Hogwarts Legacy where they went through a lot of stuff about RT, VRAM, etc. I will say, when I look on Newegg the cheapest 3060 Ti is $389. If I look at the RX 6800 (non-XT) with 16GB of VRAM, there's one for $469. But for $529 you can have a 6800 XT, also with 16GB of VRAM.

For me that's the thing: $389 for a 3060 Ti on the way out, or spend a little more for cards that should be stronger and last a little longer. I think for $399 you can get an Intel Arc A770 16GB as well. While Intel isn't quite at Nvidia's level, they seem to keep improving.
I am very confused by those stats. My 4070 Ti shows 10GB reserved and 8.5GB used at 1440p with RT enabled. Here they want to see 14GB used? What? The game plays very smoothly, with no stuttering. PCGH also measured 13GB of VRAM use at 4K, instead of the 14GB here. I think that test misses some optimizations, seeing as it is from February 9th. My old 3070 Ti also produces about 10 FPS more at 1440p than measured here... as does my 4070 Ti. Minimum.
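For anyone who wants to check the "reserved vs. used" split on their own machine, Windows exposes something close to it through DXGI. A minimal sketch, assuming Windows 10+ and the first enumerated GPU (error handling mostly omitted; build with MSVC, links dxgi.lib):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;  // first GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;  // needs Windows 10+

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    // LOCAL = dedicated VRAM; NON_LOCAL would be shared system memory.
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("Budget (what the OS lets this process use): %.2f GiB\n",
                info.Budget / 1073741824.0);
    std::printf("CurrentUsage (actually committed)         : %.2f GiB\n",
                info.CurrentUsage / 1073741824.0);
}
```

Budget typically sits a bit below the physical VRAM because the OS and other processes hold some back, which is roughly the "reserved" slice that overlays report.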
 

IDProG

Distinguished
4K ultra plus the highest AA settings possible might scratch 10GB today. In a couple of years that might change, but that's the thing about PC gaming vs. consoles: you can adjust things without really impacting image quality all that much. Antialiasing, for example, can be a VRAM hog, and if you have a 1440p or higher-res monitor of around 27" it's not really needed. I don't use it.

So I doubt 16GB will be a requirement for 4K, let alone 2.5K, anytime soon, or even in the next few years for any title, unless you want to max everything. We'll see.

I guess it's hard for young folk these days to really compare the graphics of today between settings; it's just a full-bore-or-nothing mentality. Not like the old days and what we had to deal with, which still looked great for its time, aka Wolfenstein 3D and Doom lol. Compare them to now...
It's easy to talk about hardware requirements when you're not working in game development.

You can watch these two podcasts:
https://youtu.be/Isn4eLTi8lQ

https://youtu.be/tmfHxJT1I3I
 

boju

Titan
Ambassador
It's easy to talk about hardware requirements when you're not working in game development.

You can watch these two podcasts:

Sorry for the delay, I had to find time to watch these, and thanks man, this was quite interesting.

I knew being a developer wasn't easy, but I do feel for this guy, and he's probably not alone. VRAM 'doesn't' really need to be at a certain point; I mean, 8GB could be fine, but with the time constraints and budgeting he describes, I can see where he's coming from. If only targeted systems had more VRAM to work with, it could save a lot of development time, and that would improve personal well-being as well. So on one hand you're waiting for the public to catch up in system specs, while at the same time the company you work for is screaming deadlines at you. Quote from the dude: "There's only so much we can do with so little time we have." That's the crazy corporate world we live in, huh.

It looks like we're getting there, though; I'm not so sure it'll happen within 3-5 years, but it might. The things they want to do: different textures for a whole range of objects, different rocks or grass, nothing ever the same. A UE5 dev reckons 16 to 20GB of VRAM for something like that at 1080p medium. I dunno man, I think maybe even 20GB might not be enough for an average-size open-world game with that kind of detail, and that's only at 1080p lol. DirectStorage might help with that, but as said there it's not a silver bullet, though it could help.
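To sanity-check numbers like that 16-20GB figure, here is a rough sketch of what fully unique, mipmapped, block-compressed texturing costs. The ~1 byte-per-texel rate is the actual BC7 rate (16 bytes per 4x4 block), but the material and map counts are made-up placeholders purely for illustration:

```cpp
#include <cstdio>

// Approximate size of one block-compressed texture with a full mip chain.
// BC7 packs 4x4 texels into 16 bytes (~1 byte/texel); mips add about 1/3.
static double textureMiB(int size, double bytesPerTexel = 1.0) {
    return static_cast<double>(size) * size * bytesPerTexel * (4.0 / 3.0)
           / (1024.0 * 1024.0);
}

int main() {
    // Hypothetical scene where nothing is reused -- counts are made up.
    const int materials = 2000;            // assumed number of unique materials
    const int mapsPerMaterial = 3;         // e.g. albedo + normal + roughness
    const double perTexture = textureMiB(4096);  // 4K source textures
    const double totalGiB = materials * mapsPerMaterial * perTexture / 1024.0;
    std::printf("One 4K BC7 texture with mips: %.1f MiB\n", perTexture);
    std::printf("Fully unique scene          : %.0f GiB\n", totalGiB);
}
```

Two thousand unique 4K materials already lands well past 100GiB, so at that level of uniqueness everything hinges on streaming only what's visible; no plausible VRAM pool holds it all.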
 

IDProG

Distinguished
Sorry for the delay, I had to find time to watch these, and thanks man, this was quite interesting.

I knew being a developer wasn't easy, but I do feel for this guy, and he's probably not alone. VRAM 'doesn't' really need to be at a certain point; I mean, 8GB could be fine, but with the time constraints and budgeting he describes, I can see where he's coming from. If only targeted systems had more VRAM to work with, it could save a lot of development time, and that would improve personal well-being as well. So on one hand you're waiting for the public to catch up in system specs, while at the same time the company you work for is screaming deadlines at you. Quote from the dude: "There's only so much we can do with so little time we have." That's the crazy corporate world we live in, huh.

It looks like we're getting there, though; I'm not so sure it'll happen within 3-5 years, but it might. The things they want to do: different textures for a whole range of objects, different rocks or grass, nothing ever the same. A UE5 dev reckons 16 to 20GB of VRAM for something like that at 1080p medium. I dunno man, I think maybe even 20GB might not be enough for an average-size open-world game with that kind of detail, and that's only at 1080p lol. DirectStorage might help with that, but as said there it's not a silver bullet, though it could help.
No, DirectStorage will not help with that. What DirectStorage does is make it possible to stream assets directly from storage to the GPU without involving the CPU.

Normally, a PC cannot transfer assets from storage to VRAM directly. They have to go to DRAM first, then to VRAM. So if the game needs 16GB of data, that 16GB has to go to DRAM first, then to VRAM. That is why you need some crazy amount of DRAM like 24GB or 32GB; 12-20GB of it is "reserved" for the game's graphical assets. The DRAM acts as a temporary staging buffer for the assets, because:
1. Streaming data from storage to the GPU directly isn't possible
2. The storage is too slow to reliably stream assets
3. The GPU does not have enough VRAM to store all of the assets at once, so some kind of smart asset-switching algorithm needs to be implemented

DirectStorage only solves #1. It does not solve #2, meaning you still need an SSD with a consistent 3,000MB/s read speed. It does not solve #3, meaning you still need a GPU with as much VRAM as the consoles' RAM, or more.

Don't get me wrong, DirectStorage helps tremendously. The processing unit that moves the assets from storage to DRAM, then to VRAM, is the CPU. The one that decompresses the assets is also the CPU. That's a lot of work, and this might be shocking, but the CPU is not that powerful, especially compared to the GPU. Look up the TFLOPS of a CPU; they're not that high.
What DirectStorage also does is use the GPU to decompress the assets instead of the CPU.

That said, if what Nvidia said about RTX I/O is true and it works as intended, THAT can help reduce the amount of VRAM GPUs need to have.
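As a concept-only sketch of the two paths described above: the types and functions below are hypothetical stand-ins (a vector pretending to be VRAM), not the real DirectStorage or RTX I/O APIs, but they show where the DRAM staging copy and the CPU work drop out:

```cpp
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <iterator>
#include <vector>

// ---- Hypothetical stand-ins, NOT the real DirectStorage API ----
struct GpuBuffer { std::vector<std::uint8_t> vram; };  // pretend VRAM allocation

void gpuUpload(GpuBuffer& b, const std::uint8_t* src, std::size_t n) {
    b.vram.assign(src, src + n);                       // models the DRAM -> VRAM copy
}
void gpuEnqueueDirectRead(GpuBuffer& b, const char* path) {
    // Models storage -> VRAM with no DRAM staging buffer in between.
    std::ifstream f(path, std::ios::binary);
    b.vram.assign(std::istreambuf_iterator<char>(f),
                  std::istreambuf_iterator<char>());
}
void gpuDecompress(GpuBuffer&) { /* models RTX I/O-style GPU decompression */ }

// Classic path: the CPU reads into a DRAM staging buffer, decompresses,
// then copies into VRAM -- this staging memory is the "reserved" DRAM.
void loadAssetClassic(const char* path, std::size_t bytes, GpuBuffer& dst) {
    std::vector<std::uint8_t> staging(bytes);          // DRAM staging buffer
    std::ifstream(path, std::ios::binary)
        .read(reinterpret_cast<char*>(staging.data()),
              static_cast<std::streamsize>(bytes));
    // ...CPU-side decompression of `staging` would happen here...
    gpuUpload(dst, staging.data(), bytes);
}

// DirectStorage-style path: the read goes straight to the GPU's queue and
// the GPU decompresses. The asset still has to fit in VRAM (#3) and the
// drive still has to keep up (#2); only the CPU round-trip (#1) is gone.
void loadAssetDirect(const char* path, GpuBuffer& dst) {
    gpuEnqueueDirectRead(dst, path);
    gpuDecompress(dst);
}

int main() {
    GpuBuffer a, b;
    loadAssetClassic("asset.bin", 1024, a);  // "asset.bin" is a placeholder path
    loadAssetDirect("asset.bin", b);
    std::printf("classic: %zu bytes, direct: %zu bytes\n",
                a.vram.size(), b.vram.size());
}
```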
 

Colif

Win 11 Master
Moderator
Meanwhile, Nvidia suffers in a handful of games due to lack of VRAM. The performance gains from DLSS when ray tracing is enabled are where the extra money comes in on Nvidia's premium price
That handful of games will grow as developers stop making games that run on consoles with less than 16GB of unified RAM. So if the OP wants a GPU that lasts 4 to 5 years, the window is shrinking for 8GB cards; 12GB isn't much better, really.

RT still needs VRAM, so having better RT doesn't help if you can't use it for lack of VRAM.

If you're willing to reduce settings, games will still run. But do you want to reduce them on a card you just bought?
 

Karadjgne

Titan
Ambassador
Blah. You cannot compare AMD GB to Nvidia GB on a 1:1 basis like they are the same thing. They are not. They don't work the same or communicate the same; really, there's very little in common between the two. It's been that way for more years than I can remember.

Essentially AMD runs at about 3/4 of what Nvidia does, so an 8GB Nvidia card isn't much different from a 12GB AMD card in ability as far as RAM goes.

The 3060 Ti is roughly on par in performance with an RX 6700, with some titles showing advantages to one over the other, which can depend a lot on whether the game is DLSS-friendly or RAM-happy, etc.
 
