[SOLVED] Is the 1070ti okay in 2021?

SupersonicSaint

Commendable
Nov 19, 2020
6
0
1,510
I have the opportunity of getting an iChill GTX 1070 Ti X3 V2 card for just £150 from a friend. Is the 1070 Ti, possibly overclocked, okay for 2020 through 2021? I will either get this and wait for what comes out next, or spend a lot more on a 3080 now.

Games I want to play at 75fps+
  • Cyberpunk 2077
  • COD Warzone
  • Forza Horizon 4
  • Watch Dogs Legion
  • Horizon Zero Dawn
  • Final Fantasy 16
  • Rocket League
I don't really care about ray tracing at this point given I'm content with how my PS4 Slim looks. (Although Cyberpunk 2077 ray tracing is really tempting).

What would you do in my situation? I currently have a 3080 on pre-order for £750, which includes COD Cold War; I can cancel it and save £600.

My current specifications are:
  • Ryzen 5 5600X
  • Asus ROG Strix B550 Gaming Wi-Fi
  • 32GB DDR4-3600 CL16 (2x16GB)
  • 750W power supply
  • BenQ 32" 1440p 144Hz monitor
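As a quick sanity check on that 75fps target: a frame-rate goal is just a per-frame time budget. A minimal Python sketch (the targets are simply the ones discussed in this thread):

```python
# Convert an fps target into the per-frame time budget in milliseconds.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for target in (60, 75, 144):
    print(f"{target} fps -> {frame_budget_ms(target):.2f} ms per frame")
```

At 75fps every frame has to finish in about 13.3 ms; at 144Hz the budget shrinks to under 7 ms, which is why the jump from "playable" to "high refresh" is so much harder on the hardware.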
 
Phaaze88

Titan
Ambassador
I may as well go for the 3080?

If it were a 1080p monitor instead, the 1070Ti would've been more attractive, and I wouldn't have had an issue waiting for the next gen of gpus.
A 3080 is a real waste at 1080p, but folks are gonna do it anyway...
 

SupersonicSaint

Commendable
Nov 19, 2020
6
0
1,510
For 1440p 144Hz I’d go 3070/3080; actually I did go 3080 and it's a great combination. If it had been 1080p 144Hz, these cards would be overkill. However, the 1070 Ti is a reasonable card for 1080p 144Hz if you're not turning settings to the max.
As long as it's over 75fps at 1440p, I don't mind. I only got 144Hz for Rocket League and Genshin Impact, which are easy to run.
 
I have the opportunity of getting an iChill GTX 1070 Ti X3 V2 card for just £150 from a friend. Is the 1070 Ti, possibly overclocked, okay for 2020 through 2021? I will either get this and wait for what comes out next, or spend a lot more on a 3080 now.

Games I want to play at 75fps+
  • Cyberpunk 2077
  • COD Warzone
  • Forza Horizon 4
  • Watch Dogs Legion
  • Horizon Zero Dawn
  • Final Fantasy 16
  • Rocket League
I don't really care about ray tracing at this point given I'm content with how my PS4 Slim looks. (Although Cyberpunk 2077 ray tracing is really tempting).

What would you do in my situation? I currently have a 3080 on pre-order for £750, which includes COD Cold War; I can cancel it and save £600.

My current specifications are:
  • Ryzen 5 5600X
  • Asus ROG Strix B550 Gaming Wi-Fi
  • 32GB DDR4-3600 CL16 (2x16GB)
  • 750W power supply
  • BenQ 32" 1440p 144Hz monitor

I think you'll have a hard time hitting 75 fps + for several of those games at 1440p with a 1070Ti unless you turn settings down to medium/high. I'd personally go with the 3080 100%. I also think ray tracing will be a killer feature with Cyberpunk 2077 and wouldn't want to miss out on that.
 

Jogibearson

Prominent
Dec 28, 2019
55
7
545
Considering ray tracing is still in its infancy and will definitely not run as well as people think it does in Cyberpunk etc., the 1070 Ti would be a great choice. You can just keep the £600 and wait for next year's "ground-breaking" new RTX line-up, or just until the market is saturated and affordable again.

(Keep in mind that most games on first-gen RTX cards run NVIDIA ray tracing at under 60 fps. My 1660 Super runs Modern Warfare with DirectX Raytracing at roughly 50 FPS.)
 
Considering ray tracing is still in its infancy and will definitely not run as well as people think it does in Cyberpunk etc., the 1070 Ti would be a great choice. You can just keep the £600 and wait for next year's "ground-breaking" new RTX line-up, or just until the market is saturated and affordable again.

(Keep in mind that most games on first-gen RTX cards run NVIDIA ray tracing at under 60 fps. My 1660 Super runs Modern Warfare with DirectX Raytracing at roughly 50 FPS.)

Uh, the 1660 Super is a GTX card, not an RTX card; it does not have any hardware accelerated RT cores to speak of. Exactly how are you playing Modern Warfare with ray tracing using that GPU?
 
Through DirectX Raytracing. I can just switch it on in the settings. Sure, it cuts the FPS by a hefty amount, but it works.

Ok, but just to be clear, you said: "Keep in mind that most games on first-gen RTX cards run nvidia Raytracing at under 60 fps. My 1660 Super..." Even though it's true that most first-gen RTX GPUs struggle with ray tracing, your GPU is a GTX card, and you were using it as an example for first-gen RTX cards, which the GTX 1660 Super is not.
 

Jogibearson

Prominent
Dec 28, 2019
55
7
545
Ok, but just to be clear, you said: "Keep in mind that most games on first-gen RTX cards run nvidia Raytracing at under 60 fps. My 1660 Super..." Even though it's true that most first-gen RTX GPUs struggle with ray tracing, your GPU is a GTX card, and you were using it as an example for first-gen RTX cards, which the GTX 1660 Super is not.

I was simply saying that DirectX Raytracing usually works well, even on lower-end cards. Of course games run better with RT cores, but that doesn't mean other cards can't run it.

And what is your point in repeating that GTX isn't RTX? It's clear that RTX is better in every way, but for gaming alone, GTX is well sufficient to do just that.
 
I was simply saying that DirectX Raytracing usually works well, even on lower-end cards. Of course games run better with RT cores, but that doesn't mean other cards can't run it.

And what is your point in repeating that GTX isn't RTX? It's clear that RTX is better in every way, but for gaming alone, GTX is well sufficient to do just that.

I repeated that GTX isn't RTX because you made a statement about first-gen RTX GPUs and then immediately used your GTX 1660 Super as an example of that statement. I realized that this might confuse the OP, causing him to think that the GTX 1660 Super was in fact an RTX GPU, so I pointed out the apparent discrepancy in order to clarify.
 

animekenji

Distinguished
Dec 31, 2010
196
33
18,690
I may as well go for the 3080?

If it were a 1080p monitor instead, the 1070Ti would've been more attractive, and I wouldn't have had an issue waiting for the next gen of gpus.
A 3080 is a real waste at 1080p, but folks are gonna do it anyway...

If you're going to be getting one of the new 360hz refresh monitors, you're going to need a much more powerful card than if you were running at 60hz, or even 144hz. A 3080 would not be inappropriate at that resolution and refresh.
 

Phaaze88

Titan
Ambassador
If you're going to be getting one of the new 360hz refresh monitors, you're going to need a much more powerful card than if you were running at 60hz. A 3080 would not be inappropriate at that resolution and refresh.
What you, and quite a few other folks out there have failed to realize/observe when trying to drive that many frames consistently, is that one needs:
-the fastest cpu available that they can afford
-the best combination of ram frequency and timings that they can afford

The gpu contributes so little to this - after all, its purpose is graphical fidelity, is it not?
The cpu + ram have a greater impact on input and motion clarity.

People can sustain 360fps or more in eSports titles - the purpose of such a high refresh screen - with a bloody RTX 2060.
Why the heck do they need a 3080, when a good number of them are going to turn down the resolution, turn down the eye candy, and make whatever other competitive tweaks, to maximize their fps minimums? The bulk of that falls on the cpu + ram.
A 3080 will improve average and max fps, sure, but will do very little on the minimum end, making it a big ol' waste of money because it did so little for the performance metric that those specific players want the most.

If they already have a top tier cpu and ram, then they can go ahead and get a 3080 if they want for craps-n-giggles, I guess.
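For context, the "minimums" being argued about here are usually reported by reviewers as 1% lows: the average fps over the slowest 1% of frames. A hypothetical Python sketch of that calculation from a frame-time log (the sample numbers are made up for illustration):

```python
# Average fps and 1% low fps from a list of per-frame times (milliseconds).
def fps_stats(frame_times_ms):
    times = sorted(frame_times_ms, reverse=True)   # slowest frames first
    avg_fps = 1000.0 * len(times) / sum(times)
    worst = times[:max(1, len(times) // 100)]      # slowest 1% of frames
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Mostly ~7 ms frames (~144 fps) with a few 20 ms stutters:
sample = [7.0] * 990 + [20.0] * 10
avg, low = fps_stats(sample)  # avg ≈ 140 fps, 1% low = 50 fps
```

A handful of stutters barely moves the average but craters the 1% low, which is the point being made above: the worst frames tend to be set by the CPU and RAM more than by the GPU.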
 
Solution
Get the 1070 Ti, that's a good deal. Upgrade to something better later. I got my son a 1070 Ti for Xmas. It does exactly what it is supposed to. Are you going to get 75fps in all newer games? No, but close enough. Turn shadows down to medium and you'll likely get there.
 
Get the 1070 Ti, that's a good deal. Upgrade to something better later. I got my son a 1070 Ti for Xmas. It does exactly what it is supposed to. Are you going to get 75fps in all newer games? No, but close enough. Turn shadows down to medium and you'll likely get there.
I used to go through and tweak all my settings manually, but lately I've found that just letting the GeForce Experience app auto-optimize my new titles (two in particular, Valhalla and Cold War) gives better results than my own manual settings or the defaults they initially ran at.
This may not be news to anyone else, but I was really surprised that GeForce optimization worked better than my own discretion!
The 1070 Ti is a great card, and if it weren't for 4K becoming more prominent I'd be happy sitting at 1440p, but FOMO, and just being able to... afford the blow to the wallet, makes me want the upgrade.
 

animekenji

Distinguished
Dec 31, 2010
196
33
18,690
What you, and quite a few other folks out there have failed to realize/observe when trying to drive that many frames consistently, is that one needs:
-the fastest cpu available that they can afford
-the best combination of ram frequency and timings that they can afford

The gpu contributes so little to this - after all, its purpose is graphical fidelity, is it not?
The cpu + ram have a greater impact on input and motion clarity.

People can sustain 360fps or more in eSports titles - the purpose of such a high refresh screen - with a bloody RTX 2060.
Why the heck do they need a 3080, when a good number of them are going to turn down the resolution, turn down the eye candy, and make whatever other competitive tweaks, to maximize their fps minimums? The bulk of that falls on the cpu + ram.
A 3080 will improve average and max fps, sure, but will do very little on the minimum end, making it a big ol' waste of money because it did so little for the performance metric that those specific players want the most.

If they already have a top tier cpu and ram, then they can go ahead and get a 3080 if they want for craps-n-giggles, I guess.


Esports titles don't count because they can be run on a potato. I'm talking about running AAA titles with the highest frames possible, and you're not going to get that with a "bloody RTX 2060" with all the settings except the screen resolution maxed out.

And it is assumed that you have already maxed out your CPU and memory to get to that level and the GPU is the only thing holding you back.
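The disagreement here can be summed up with a simple (and admittedly crude) bottleneck model: the frame rate you actually see is capped by whichever of the CPU and GPU is slower on its own. A Python sketch with made-up numbers:

```python
# Crude bottleneck model: delivered frame rate is limited by the slower
# of the CPU's and GPU's standalone frame rates.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# A CPU that can prepare ~160 frames/s is the limit even if the GPU
# could render 300 frames/s -- a faster GPU changes nothing here.
print(delivered_fps(160, 300))  # 160
```

Which side of that `min` you sit on depends on the game, the resolution, and the settings, which is why both posters can be right for different workloads.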
 

Phaaze88

Titan
Ambassador
Esports titles don't count because they can be run on a potato. I'm talking about running AAA titles with the highest frames possible, and you're not going to get that with a "bloody RTX 2060" with all the settings except the screen resolution maxed out.
Still can't sustain 360fps in AAA even with a 3080, therefore, why does someone need a 360hz monitor for AAA?

And it is assumed that you have already maxed out your CPU and memory to get to that level and the GPU is the only thing holding you back.
Shouldn't make assumptions like that, because there are already people out there who can't, or haven't, set up such a cpu and ram config for one reason or another.


You know what? Go back and reread the OP.
They were asking for opinions on what others would do in their situation. I gave them mine.
You came at me from left field about 360hz monitors - which the OP doesn't have, and so on.
Tell the OP your thoughts, not me.
 