[SOLVED] RTX 3080 versus RTX 3070 Ti for 1440p gaming?

Anti illuminati

Distinguished
Jun 26, 2015
Hello guys. I'm torn between the 3070 Ti and the 3080 for gaming. I watched YouTube videos comparing these two at 1440p and the difference was between 10 and 20 fps. The 3080 is rare and expensive, while the 3070 Ti is much cheaper and more available. On the other hand, the 8 GB of VRAM on the 3070 Ti makes me wonder whether it will become a problem for 1440p gaming in the future, whereas the 3080 has 10 GB and shouldn't have that issue.

What's your opinion? Should I go for the 3070 Ti, or pay more and buy the 3080? I should add that I play everything at ultra settings and want to get more than 60 fps.
 
I have a 3080 and game predominantly at 1440p but occasionally at 4K. I bought my 3080 at launch, so there was no 3070 Ti at the time. As you have seen, the performance difference in reviews at 1440p is not enough to make any difference to the gaming experience. You could probably change one or two settings and get the same FPS.

As for the amount of VRAM, I don't see it as a problem for 1440p. I don't have any problems with 10 GB running 4K, so 8 GB for 1440p seems OK to me. Remember, when looking at reviews that include VRAM usage, that a game will usually use more if it is available; that does not mean it needs that much. It's hard to determine what is actually required.
 

Karadjgne

Titan
Ambassador
FPS is all CPU. And it's a home PC, with all the stuff that comes on home PCs like antivirus, bloatware, background apps, Discord, or YouTube running on a second monitor, etc. That's going to make a sizable difference to FPS counters based purely on who is running the game on what.

Those videos are done on clean machines, fresh installs of Windows that realistically run nothing more than the game. So you will not get the same FPS. Those vids show what is possible, not what is probable.

That said, with a slightly less powerful CPU, the FPS will be lower, so the onus is not on the GPU as much. Meaning the 3070 Ti in a home PC can very well get almost the same FPS as a 3080 at 1440p, since the 3080 can have extra headroom that's not being maxed out by the CPU.

If a 3080 is capable of 250 FPS at ultra, and the 3070 Ti is capable of only 200 FPS at the exact same settings, but the CPU can only supply 150 FPS, both cards will hit 150 FPS. No difference. It's only when the CPU is capable of more than the 3070 Ti's 200 FPS limit that you'd see any difference. And you'd have to be doing a side-by-side comparison to even care.
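If it helps to see that bottleneck logic spelled out as code, here's a toy sketch in Python; the FPS figures are just the hypothetical numbers above, not benchmarks:

```python
# Toy model of the CPU/GPU bottleneck described above.
# All numbers are hypothetical examples, not measurements.

def effective_fps(cpu_limit: float, gpu_limit: float) -> float:
    """The slower of the two limits decides the frame rate you actually see."""
    return min(cpu_limit, gpu_limit)

cpu_limit = 150  # what the CPU can feed the GPU at these settings
gpu_limits = {"RTX 3080": 250, "RTX 3070 Ti": 200}  # what each card could render

for card, gpu_limit in gpu_limits.items():
    print(f"{card}: {effective_fps(cpu_limit, gpu_limit)} fps")

# Both cards land on 150 fps; they only separate once the CPU can
# push past the 3070 Ti's 200 fps ceiling.
```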

Get what you can get; even at 1440p, the difference is honestly negligible.
 
Solution

Anti illuminati

Distinguished
Jun 26, 2015
FPS is all CPU. And it's a home PC, with all the stuff that comes on home PCs like antivirus, bloatware, background apps, Discord, or YouTube running on a second monitor, etc. That's going to make a sizable difference to FPS counters based purely on who is running the game on what.

Those videos are done on clean machines, fresh installs of Windows that realistically run nothing more than the game. So you will not get the same FPS. Those vids show what is possible, not what is probable.

That said, with a slightly less powerful CPU, the FPS will be lower, so the onus is not on the GPU as much. Meaning the 3070 Ti in a home PC can very well get almost the same FPS as a 3080 at 1440p, since the 3080 can have extra headroom that's not being maxed out by the CPU.

If a 3080 is capable of 250 FPS at ultra, and the 3070 Ti is capable of only 200 FPS at the exact same settings, but the CPU can only supply 150 FPS, both cards will hit 150 FPS. No difference. It's only when the CPU is capable of more than the 3070 Ti's 200 FPS limit that you'd see any difference. And you'd have to be doing a side-by-side comparison to even care.

Get what you can get; even at 1440p, the difference is honestly negligible.
I should mention that my CPU is an 11700K.
 

Anti illuminati

Distinguished
Jun 26, 2015
I have a 3080 and game predominantly at 1440p but occasionally at 4K. I bought my 3080 at launch, so there was no 3070 Ti at the time. As you have seen, the performance difference in reviews at 1440p is not enough to make any difference to the gaming experience. You could probably change one or two settings and get the same FPS.

As for the amount of VRAM, I don't see it as a problem for 1440p. I don't have any problems with 10 GB running 4K, so 8 GB for 1440p seems OK to me. Remember, when looking at reviews that include VRAM usage, that a game will usually use more if it is available; that does not mean it needs that much. It's hard to determine what is actually required.
I mean, will the 8 GB be a problem in the future?
 
I mean, will the 8 GB be a problem in the future?
There's no way to know when it will be a problem. At some point in the future, any card you buy today will have inadequate VRAM. You can turn down game settings if it does become a problem.

Another way of looking at it: if in two years 8 GB is a problem, then sell the card, add the money you saved by not buying a 3080, and you will probably have enough to get a much better GPU.
 

DSzymborski

Curmudgeon Pursuivant
Moderator
And really, VRAM quantity is a bit overrated, thanks to games' penchant for reporting VRAM that is allocated rather than VRAM that is actually used.

Just because I put my name in on seven Newegg Product Shuffles in a given day doesn't mean I'm actually going to end up buying all seven of those products.
 

logainofhades

Titan
Moderator
AMD does have such features now. RT, granted, isn't as good, but FSR is a pretty good alternative to DLSS. The drivers are fine; I have had zero issues with my RX 6800 as far as drivers go. The claim that there's no good game support isn't accurate either.

https://www.amd.com/en/gaming/featured-games

I understand wanting RTX for its feature set, but I want to make you aware of what team red has too. My other rig has an RTX 2060 that I was using before I got my RX 6800 at MSRP.
I really only play WoW, and Shadowlands was optimized for AMD graphics, so it worked in my favor. I refuse to pay scalper prices, so I was after whatever I could get at or near MSRP, be it an RTX 3070 or an RX 6800.
 
And really, VRAM quantity is a bit overrated, thanks to games' penchant for reporting VRAM that is allocated rather than VRAM that is actually used.
There are a couple of tools/ways available to see not just allocated but actually used VRAM.
Several AAA titles today use more than 10 GB of VRAM with all settings maxed out at 1440p and up. I've used the Special K DirectX framework to see this directly for myself.
Some of the talk about GDDR6X being so fast that "you don't need as much VRAM" is complete bollocks. VRAM speed is no substitute for VRAM capacity.
Here are some real-world results from hitting the 'VRAM cliff':

https://www.reddit.com/r/nvidia/comments/itx0pm/doom_eternal_confirmed_vram_limitation_with_8gb/



Now, can someone just dial back texture settings to keep VRAM usage under 10 GB when using the RTX 3080? Absolutely. But VRAM quantity is definitely not overrated. The GPU needs to have the right amount of VRAM for its rasterization horsepower. The RTX 3080 is a perfect example of NVIDIA giving a card too little VRAM, ensuring a shorter-than-it-should-be lifespan. This card will definitely run out of VRAM before its rasterization horsepower is used up. The RTX 3060 (with 12 GB of VRAM) is the exact opposite: its rasterization horsepower will be used up well before its 12 GB of VRAM is fully utilized.
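If anyone wants to watch VRAM while they play, here's a minimal sketch using NVIDIA's NVML Python bindings (assumes the nvidia-ml-py package is installed). Keep in mind NVML reports memory allocated on the GPU, so it shows the same "allocated, not necessarily needed" figure being debated above; per-game budget vs. actual usage needs something like Special K or an in-game overlay.

```python
# Minimal VRAM watcher via NVIDIA's NVML bindings (pip install nvidia-ml-py).
# Note: NVML reports memory *allocated* on the GPU, not what a game truly needs.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM allocated: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB")
        time.sleep(5)  # poll every few seconds while the game runs
except KeyboardInterrupt:
    pass  # stop with Ctrl+C
finally:
    pynvml.nvmlShutdown()
```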
 
I vote for the 3070 Ti. Even if there is a problem at maxed-out settings, what do max settings actually achieve?
I'm merely stating the facts regarding VRAM usage in games today. Whether the OP thinks that Ultra settings are a waste of GPU rasterization horsepower and VRAM is for them to research, test out, and decide.

If you want to recommend a video card that will run today's AAA games at High settings and tomorrow's (2 years from now) games at Medium, that's fine.
I would rather recommend a video card that will run today's AAA games at Ultra settings and tomorrow's (2 years from now) games at High.
 
I'm merely stating the facts regarding VRAM usage in games today. Whether the OP thinks that Ultra settings are a waste of GPU rasterization horsepower and VRAM is for them to research, test out, and decide.

If you want to recommend a video card that will run today's AAA games at High settings and tomorrow's (2 years from now) games at Medium, that's fine.
I would rather recommend a video card that will run today's AAA games at Ultra settings and tomorrow's (2 years from now) games at High.

Your facts are based on GPUs of a different generation, which can differ in how efficiently they use VRAM. Where's the comparison between a 3070 Ti and a 3080? Here is a benchmark I found of the same game you quoted, at Ultra Nightmare settings at 1440p. The difference between the 3070 Ti and the 3080 is small enough that you would never notice it if you didn't have an FPS counter.


https://www.techpowerup.com/review/gigabyte-geforce-rtx-3070-ti-gaming-oc/15.html
[Benchmark chart: doom-eternal-2560-1440.png]


As I said earlier, pocket the savings from going with a 3070 Ti, then use them to upgrade in two years' time to something that outperforms both the 3070 Ti and the 3080.
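For anyone who wants to put rough numbers on that, here's a back-of-the-envelope sketch; the prices and FPS figures are placeholders, not real benchmarks or street prices.

```python
# Back-of-the-envelope "pocket the savings" math.
# Every figure below is a placeholder -- substitute current street prices
# and your own benchmark results before drawing conclusions.
price = {"RTX 3070 Ti": 600, "RTX 3080": 800}          # hypothetical prices (USD)
avg_fps_1440p = {"RTX 3070 Ti": 140, "RTX 3080": 160}  # hypothetical averages

for card in price:
    print(f"{card}: ${price[card] / avg_fps_1440p[card]:.2f} per fps")

savings = price["RTX 3080"] - price["RTX 3070 Ti"]
print(f"Cash banked toward a future upgrade: ${savings}")
```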

That's my vote anyway, but the OP is free to take all the opinions and make their own choice. Ultimately, none of us has a working crystal ball, so we are guessing at the better option.
 