Question Should i buy an RTX 3060 Ti ?

Page 2 of a thread on the Tom's Hardware community forums.

qb176

Prominent
Apr 22, 2023
Hi, I'm currently saving up to buy an RTX 3060 Ti, but I'm not sure which model fits my build, or whether another GPU would be a better choice. My specs are:

CPU: Intel Core i7-7700
GPU: Asus - Nvidia GTX 1050
Motherboard: ASRock Fatal1ty Z270 Gaming K4
RAM (DDR4): 2x4GB Kingston HyperX + 2x8GB G.SKILL Trident Z RGB
CPU fan: Corsair iCUE H115i RGB PRO XT
PSU: Seasonic Prime GX-750

I also intend to upgrade my CPU and motherboard later, so I need a GPU that will stay usable for the next 4-5 years. Is it safe to buy the RTX 3060 Ti this year? (Note: I mostly use my computer for casual gaming, 3D modelling and video editing.) Thank you very much!
 

Karadjgne

Titan
Ambassador
List of games that play or look like trash at 1080p due to only 8GB of VRAM on the RTX 3070.

If you can afford a 3070 based pc, why is 1080p a consideration? At that power level, 1440p would be the better option for anything other than competitive esports where fps is king and graphics quality is a secondary consideration.

The issue I have with every review stating that a particular game plays/looks like trash is that reviewers do only one thing: the Ultra, High, and Medium presets. The one thing not a single reviewer does is tailor the settings, be they in-game or global. For instance, my Skyrim currently has over 170 mods, and it looked like crap: it glitched, stuttered, froze, turned skin solid black, etc. Then I turned off ambient occlusion, set [Display] iTintTextureResolution=2048 in the game's .ini file, and got a buttery smooth, glitch-free, solid 60fps. There are generally ways to make almost any game semi-perfect; reviewers just don't go looking for them, they review the game as is, stock.
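For reference, the tweak described above would sit in the game's .ini roughly like this (a sketch; whether it belongs in Skyrim.ini or SkyrimPrefs.ini depends on the setup):

```ini
[Display]
; Cap the tint-mask texture resolution; with heavy mod lists the
; default can cause black faces and stutter, as described above.
iTintTextureResolution=2048
```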
 

IDProG

Distinguished
If you can afford a 3070 based pc, why is 1080p a consideration?
Because it does not have enough VRAM to play next-gen games at 1080p.

The copium.
"Amd is 3/4 of Nvidia"
"Reviewers do not tailor the settings"

No.
The first statement was purely your assumption. I am not going to say that it is because of fanboyism, but it is pretty clear that to objective eyes, they operate basically the same way. More VRAM is always better.

As for the second,
One, most people are not LowSpecGamer. The only settings that they are going to change are Low/Medium/High/Ultra settings, maybe RT, too.
Two, if it is necessary to mess with .ini settings for the base game to work properly, then the problem is probably not the game, but your GPU's VRAM.
 

KyaraM

Admirable
Because it does not have enough VRAM to play next-gen games at 1080p.

The copium.
"Amd is 3/4 of Nvidia"
"Reviewers do not tailor the settings"

No.
The first statement was purely your assumption. I am not going to say that it is because of fanboyism, but it is pretty clear that to objective eyes, they operate basically the same way. More VRAM is always better.
Except there kinda exists evidence to the contrary?
View: https://imgur.com/gallery/pTPuJLW


It has been known for years that Nvidia has a more efficient memory interface. Note how AMD cards show higher usage even than Nvidia cards with 16GB and more VRAM. It's not simply because lower-end Nvidia cards run out of memory; if that were the reason, 12GB Nvidia cards would have stuttering etc. too, which isn't the case. Their usage is in line with bigger Nvidia cards and they have no issues. Btw, the resolution is 4K.
 

IDProG

Distinguished
Except there kinda exists evidence to the contrary?
View: https://imgur.com/gallery/pTPuJLW


It has been known for years that Nvidia has a more efficient memory interface. Note how AMD cards show higher usage even than Nvidia cards with 16GB and more VRAM. It's not simply because lower-end Nvidia cards run out of memory; if that were the reason, 12GB Nvidia cards would have stuttering etc. too, which isn't the case. Their usage is in line with bigger Nvidia cards and they have no issues. Btw, the resolution is 4K.
That's called RTX IO, Nvidia's attempt to justify the VRAM skimping of their GPUs. It's not because Nvidia has a more efficient memory interface.

It only managed to cut 2GB of VRAM at best. It is useful, I never said it wasn't, but it will not solve the VRAM problem.
 

Karadjgne

Titan
Ambassador
The 3/4 thing has always been an approximation. Nvidia's interface has always been better: a 3GB Nvidia card had roughly the same effective throughput as a 4GB AMD card on a frame-to-frame basis. Even the 12GB RX 6700 XT gets roughly the same fps as the 8GB 3060 Ti, which is also within spitting distance of the older 2070 Super (8GB) and 1080 Ti (11GB); all of them perform similarly at 1080p despite the different VRAM amounts.

Afaik, most 1080p games don't use enough VRAM to trouble an 8GB Nvidia card; there's simply not enough bandwidth saturation. You need to bump up to 4K to approach those numbers, which is where the lower VRAM of the older cards suffers, especially if settings such as the frame-buffering count are left at their default of 3.
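The buffering point can be put in rough numbers. A sketch of the arithmetic (assuming plain RGBA8 color buffers; real engines add depth buffers, G-buffers and more, so this is a floor, not a total):

```python
# Back-of-the-envelope VRAM cost of swap-chain buffering.
# Assumes RGBA8 (4 bytes/pixel); the default of 3 means triple buffering.
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 2**20

print(f"1080p x3: {framebuffer_mib(1920, 1080):.1f} MiB")  # ~24 MiB
print(f"4K    x3: {framebuffer_mib(3840, 2160):.1f} MiB")  # ~95 MiB
```

The buffers themselves are small next to textures; the pressure at 4K comes from everything else scaling up alongside them.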

Game settings are not just GPU-bound; almost as many are CPU-bound, and disabling or lowering things such as bloom, floating-point damage, or ambient occlusion in clouds can and does have a large effect on the CPU while having very little effect visually and no effect on the GPU.

Reviewers do not change global settings from default, and do not use anything other than default presets, because doing so to make the game run smoother or look better would invalidate any comparison between different CPUs or GPUs. Default settings are always used so that the comparison is like for like. It's incumbent on the user to make any changes, to satisfy the user's requirements according to the user's PC and taste.

RTX IO didn't exist before the introduction of the RTX cards; the GTX cards don't have it or use it, so claiming that it's an Nvidia thing to cope-splain gimping VRAM on cards doesn't hold water. Nvidia has historically had lower VRAM requirements on a card-to-card basis vs AMD in an equitable performance range: the 3/6GB GTX 1060 vs the 4/8GB RX 480, etc.
 

IDProG

Distinguished
The 3/4 thing has always been an approximation. Nvidia's interface has always been better: a 3GB Nvidia card had roughly the same effective throughput as a 4GB AMD card on a frame-to-frame basis. Even the 12GB RX 6700 XT gets roughly the same fps as the 8GB 3060 Ti, which is also within spitting distance of the older 2070 Super (8GB) and 1080 Ti (11GB); all of them perform similarly at 1080p despite the different VRAM amounts.

Afaik, most 1080p games don't use enough VRAM to trouble an 8GB Nvidia card; there's simply not enough bandwidth saturation. You need to bump up to 4K to approach those numbers, which is where the lower VRAM of the older cards suffers, especially if settings such as the frame-buffering count are left at their default of 3.

Game settings are not just GPU-bound; almost as many are CPU-bound, and disabling or lowering things such as bloom, floating-point damage, or ambient occlusion in clouds can and does have a large effect on the CPU while having very little effect visually and no effect on the GPU.

Reviewers do not change global settings from default, and do not use anything other than default presets, because doing so to make the game run smoother or look better would invalidate any comparison between different CPUs or GPUs. Default settings are always used so that the comparison is like for like. It's incumbent on the user to make any changes, to satisfy the user's requirements according to the user's PC and taste.

RTX IO didn't exist before the introduction of the RTX cards; the GTX cards don't have it or use it, so claiming that it's an Nvidia thing to cope-splain gimping VRAM on cards doesn't hold water. Nvidia has historically had lower VRAM requirements on a card-to-card basis vs AMD in an equitable performance range: the 3/6GB GTX 1060 vs the 4/8GB RX 480, etc.
I give up on this particular forum post. Go ahead and believe whatever you want instead of the truth.

Just don't complain about "poor optimization" in 2 years when the 8GB VRAM GPUs struggle to play next-gen games at 1080p due to the low VRAM.

There is a saying: "It's easier to fool someone than to convince him that he's being fooled."
 

Karadjgne

Titan
Ambassador
Just don't complain about "poor optimization" in 2 years when the 8GB VRAM GPUs struggle to play next-gen games at 1080p due to the low VRAM.
I'll take that bet, and see you in 2 years. There's just one small thing you are forgetting about: what people actually own. Game devs have 1 underlying priority: the game must be sellable. It's got to have generally low requirements while also satisfying the PCs on steroids. Bring out a game today that's only playable on 8GB+ cards and it's not going to sell well worldwide, because the vast majority of people who play don't own 3070s or better; they own 1060- to 3060-class cards.
 

IDProG

Distinguished
Game devs have 1 underlying priority: the game must be sellable.
Games must be sellable to consoles. Many game developers have said that PCs are an afterthought, since they always assume that if consoles can run it, PCs can.
Bring out a game today that's only playable on 8GB+ cards and it's not going to sell well worldwide, because the vast majority of people who play don't own 3070s or better; they own 1060- to 3060-class cards.
I do wonder about that.

By the way, this is a game that's full of political controversies on top of requiring more than 8GB VRAM.

About 1060, read this
 

boju

Titan
Ambassador
I can see both points. When there's a need, requirements will follow, once games leave the second-class console realm behind, however long that takes. Say two years before we start seeing more titles with VASTLY more textures than anything current (and it may take even longer for titles to be made with such detail); by then, graphics card makers would probably cater more to the masses and what they can afford, including making lower-tier cards suitable.
 

KyaraM

Admirable
Games must be sellable to consoles. Many game developers have said that PCs are an afterthought, since they always assume that if consoles can run it, PCs can.

I do wonder about that.

By the way, this is a game that's full of political controversies on top of requiring more than 8GB VRAM.

About 1060, read this
I literally know people with 960s who play this game... they even have some settings on high. That card has 4GB VRAM. No, you absolutely don't need more than 8GB to play this game.

But if the PC is only an afterthought and devs don't even attempt to optimize for it, then it's no wonder we get horrifically optimized trash games. That is, however, absolutely no excuse, and pure laziness. Don't defend it.
 

boju

Titan
Ambassador
I literally know people with 960s who play this game... they even have some settings on high. That card has 4GB VRAM. No, you absolutely don't need more than 8GB to play this game.

But if the PC is only an afterthought and devs don't even attempt to optimize for it, then it's no wonder we get horrifically optimized trash games. That is, however, absolutely no excuse, and pure laziness. Don't defend it.

It's not really about laziness, I believe; I think it's more about publishers pushing devs too hard. If only they had more freedom, or time, rather than having to rely on shortcuts, which is one of the reasons for poor optimisation. One way of doing that is to expand what is possible by letting go of limitations, the limitation here being VRAM, since it also acts as storage.

See the first video in post #18 interviewing a UE5 dev; he expresses well what's going on, why optimisations often don't pan out, and what to probably expect in future if a few things go their way. In any case, the things coming our way will impress us in a big way.

The only argument here is over pushing these requirements too soon. I get the notion of buyer beware, but this might be premature when the events haven't eventuated yet. Fair warning for the future is as good as it gets.
 

Karadjgne

Titan
Ambassador
Games must be sellable to consoles.
Show me any console that has the graphical power of a 3070 or better. They don't exist. Even the PS5 is only roughly equivalent to an RX 5700 XT, or in Nvidia terms a 2070, which is closer to a 3060. Not exactly what I'd call 8GB-demanding cards.

Hogwarts Legacy. DUH! It's Harry Potter. You are now into the realm of a massive worldwide phenomenon. People are buying the game based on the name alone. If the characters were no-name toons in some castle somewhere with some magic powers, it would be nowhere near as popular.
 

IDProG

Distinguished
Even the PS5 is only roughly equivalent to an RX 5700 XT, or in Nvidia terms a 2070, which is closer to a 3060. Not exactly what I'd call 8GB-demanding cards.
I think I am seeing a pattern here. You do not know that rasterization performance and VRAM usage are COMPLETELY UNRELATED things.

Let me print it out for you one more time. Pay attention.

Rasterization Performance
and
VRAM usage
are
COMPLETELY UNRELATED things.

They do not have a correlation. The massive problem with your comment is that you tie the 3070's performance to the fact that it has 8GB of VRAM.
No, they're completely unrelated. Nvidia could make a 4090 with 2GB of VRAM if they wanted to; that doesn't mean 2GB of VRAM is enough.

Consoles might not be as powerful as the 3070, but the PS5 and Xbox Series X embarrass the 3070 so badly in games simply because they have more RAM.

Hogwarts Legacy. DUH! It's Harry Potter. You are now into the realm of a massive worldwide phenomenon. People are buying the game based on the name alone. If the characters were no-name toons in some castle somewhere with some magic powers, it would be nowhere near as popular.
Then there is no convincing you. Next, I can give you the Resident Evil 4 Remake and you'll just say, "But, but, RE4 is the most popular RE game".
 
I wouldn't get that card. It has only 8GB of VRAM and there are already games where that's not enough VRAM:

The Last of Us Part 1: 1080p/1440p High, RT Off:

Hogwarts Legacy: 1080p/1440p Ultra:

"With the RTX 3070 we could notice the occasional jarring frame stutter, but the bigger issue is the image quality. The 3070 is constantly running out of VRAM and when it does the game no longer runs into crippling stuttering that lasts multiple seconds, instead all the textures magically disappear and then reappear at random. What you're left with is a flat horrible looking image and this occurs every few seconds. Even when standing completely still, the RTX 3070 keeps cycling textures in and out of VRAM as it simply can't store everything required for a given scene."
- Credit to Techspot

The RTX 3060 Ti costs at least $380. If I were you, I'd pay $10 less and get the RX 6750 XT for $370 because it's 10% faster on average (according to the TechPowerUp GPU Database) and it has 12GB of VRAM instead of only 8 so you won't have any VRAM limitations.

I hope that this changes your mind because the RTX 3060 Ti, RTX 3070 and RTX 3070 Ti are just bad choices due to their lack of VRAM.
 
I give up on this particular forum post. Go ahead and believe whatever you want instead of the truth.

Just don't complain about "poor optimization" in 2 years when the 8GB VRAM GPUs struggle to play next-gen games at 1080p due to the low VRAM.

There is a saying: "It's easier to fool someone than to convince him that he's being fooled."
There's a Doobie Brothers song that explains this perfectly:
 

Karadjgne

Titan
Ambassador
The massive problem with your comment is that you correlate 3070's performance to the fact that it has 8GB VRAM.
No. You are stuck on VRAM as a qualifying number. The 3070 has better boost clock speeds, a higher RT core count, higher tensor core count, and higher CUDA core count, as well as more VRAM. And it all has to balance out, or the card would be as useless as a 4090 with 2GB of RAM; it literally would not have the throughput to make use of the full card.

That was basically proven with the 3080 Ti and 3090: for gaming, the 3090's excess VRAM was useless.
 
No. You are stuck on VRAM as a qualifying number. The 3070 has better boost clock speeds, a higher RT core count, higher tensor core count, and higher CUDA core count, as well as more VRAM. And it all has to balance out, or the card would be as useless as a 4090 with 2GB of RAM; it literally would not have the throughput to make use of the full card.

That was basically proven with the 3080 Ti and 3090: for gaming, the 3090's excess VRAM was useless.

View: https://youtu.be/alguJBl-R3I
 

Colif

Win 11 Master
Moderator
Show me any console that has the graphical power of a 3070 or better. They don't exist. Even the PS5 is only roughly equivalent to an RX 5700 XT, or in Nvidia terms a 2070, which is closer to a 3060. Not exactly what I'd call 8GB-demanding cards.
It's nothing to do with the power of the processor; it's that they can't load the assets because they don't have enough VRAM. The video above shows that. If it had 16GB we wouldn't be having this discussion: the 8GB 3070 can't load the same games, while the professional version with 16GB plays them fine.

No one is arguing a PS5 is more powerful than a 3070; it's a different problem. The PS5 has 16GB of unified memory, meaning it can give the GPU up to 15GB of VRAM to use. Many devs have swapped from making PS4 games that used 8GB to PS5 games that use up to 16GB. The market is moving and new games are growing, some only playable at 1080p medium with an 8GB card, regardless of how powerful the card is: it can't load the bigger texture files.

The RX 7600, with only 8GB, would be in the same boat. It's nothing to do with how powerful the processor is.
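The texture-size point above can be sanity-checked with quick arithmetic (a sketch with assumed numbers; real games stream mips and use block compression such as BC7 at roughly 1 byte per pixel, about 4x smaller than this):

```python
# Uncompressed footprint of one square RGBA8 texture including its mip chain.
# A full mip chain adds about one third on top of the base level.
def texture_mib(size, bytes_per_pixel=4, mips=True):
    base = size * size * bytes_per_pixel
    total = base * 4 / 3 if mips else base
    return total / 2**20

one_4k = texture_mib(4096)
print(f"one 4K texture: {one_4k:.1f} MiB")               # ~85 MiB uncompressed
print(f"100 of them:    {100 * one_4k / 1024:.2f} GiB")  # ~8.3 GiB
```

Even with 4x block compression, a few hundred 4K textures resident at once lands in the same ballpark as an 8GB card's whole budget, which is the scenario being argued about here.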
 

Karadjgne

Titan
Ambassador
And many people don't have $700+ to spend on a GPU when that's half or more of the total budget for the PC. Devs have to be able to sell the game to a wide market, and gimping a game to 12GB+ cards would be astronomically stupid for anything other than something labeled Harry Potter.

And that Techspot conclusion is in what context? Ultra with RT at max? Or low with RT off? 3D buffering at 3x or 1x?

And as I said, the PS5 roughly equates to an RX 5700 XT graphically. In other words, you get roughly the same fps, same picture, same ability, regardless of actual RAM usage.

PC gamers with a 3080-class card or better are a very small percentage of the gamer market. There will always be simple games like CSGO and complex games like HPL, but the vast majority of games are going to be tailored for the people in the middle, the 4-6GB crowd, not the high end; otherwise you cut out too much of an already highly competitive market, which means you make no money. That means you can put out a game like Far Cry that will play on just about anything, but also put out things like an HD texture pack for those at the high end who can actually use it.

And I know people with a 3070 who play HPL a lot and have zero issues with VRAM, glitches, or the other anomalies described by Techspot, so I don't take that particular conclusion as gospel.
 

Colif

Win 11 Master
Moderator
It depends what you want to play. If you're happy playing older games, there are thousands of games to play, but the newer ones will keep asking more of the PC. That's because the main market now is consoles, and the PC has long since been removed from importance; we're lucky if games work.

I am not talking about Harry Potter. There are lots of games now wanting more than 12GB. Have you seen the new Jedi Survivor, which eats a 4090 for breakfast, or some of the games coming based on Unreal Engine 5? Some devs don't care about creating games that work on an old PC. Big AAA companies might, but others aren't so fussed: small teams making Unreal 5 games.

Some of them look amazing; the game in here where you play as a cop is so realistic.
 

KyaraM

Admirable
It depends what you want to play. If you're happy playing older games, there are thousands of games to play, but the newer ones will keep asking more of the PC. That's because the main market now is consoles, and the PC has long since been removed from importance; we're lucky if games work.

I am not talking about Harry Potter. There are lots of games now wanting more than 12GB. Have you seen the new Jedi Survivor, which eats a 4090 for breakfast, or some of the games coming based on Unreal Engine 5? Some devs don't care about creating games that work on an old PC. Big AAA companies might, but others aren't so fussed: small teams making Unreal 5 games.

Some of them look amazing; the game in here where you play as a cop is so realistic.
Can we please stop using obviously buggy games as examples for anything but bad programming? Please?
Btw, they already fixed that. There was a Day 1 patch that reduced VRAM usage drastically. As did Hogwarts Legacy, for that matter.
 
Everyone can argue whatever they want. For 400 dollars or more, there's no way I can recommend an 8GB GPU. Not all games are optimized; some are, but some new ones aren't. I would dare say that going forward, many AAA titles may not be, or may take a long time to get patched. For that matter, will they care about patching for 2-5 year old cards? I would say get a card with the most VRAM and most horsepower you can.

To be honest though before I went 3060ti or even 3070, I think it would be worth having a hard look at this.


Intel is still a new player but they've been improving.

This is from a month ago it appears.

View: https://www.youtube.com/watch?v=2XUNoDno1z4


It looks like in many cases the 3060 Ti is still ahead, but not always by a lot. I assume performance will keep improving as drivers mature. In VRAM-heavy games you might appreciate the extra VRAM. Worth a look anyway.

The other consideration is the RX 6800 (non-XT), open-box on Newegg for $419.

 

7medd

Reputable
Nov 14, 2020
It's worse than I thought.

List of games that play or look like trash at 1080p due to only 8GB of VRAM on the RTX 3070.

04:48 - The Last of Us Part 1
05:56 - Hogwarts Legacy
06:48 - Resident Evil 4
07:21 - Forspoken
08:04 - A Plague Tale: Requiem
09:10 - The Callisto Protocol

I doubt some results here. I have a 3070 with an 11700K, and I run TLOU and APT at 1080p, everything on ultra, no DLSS, at around 70-80 fps in APT with rare stutters,
and 60-80 fps in TLOU with an occasional stutter (high frametime) when I first enter new areas, but no frame drops.
I might make a video of the gameplay, but what I want to say is I'm nowhere close to the stutters and frame drops shown in this review.
 

HWOC

Reputable
Jan 9, 2020
For all *practical* purposes, 8GB VRAM will be enough for the next couple of years for the vast majority of games. I wouldn't buy an 8GB card now and expect it to last for very long at max details. But then again, I don't think it makes sense in any case to buy a GPU and expect it to last well for 3+ years. It's better to buy a cheaper card every 2-3 years than to buy top range every 5 years.