[SOLVED] Upgrade my 2070 super with a second card or go to 3080?

Sep 3, 2020
Hey, this is my first post here. I recently bought a 2070 Super, and while it's been wonderful overall, I'm worried about the next generation of games like Cyberpunk. I play at 1440p, and my full system specs are below. I'm curious how two 2070 Supers in SLI compare in performance to the new 3080 card that just released. I'd like to run games mostly at max settings at my resolution, and I'm willing to play at 30 fps in more demanding games. Should I just sell my 2070 Super and buy a 3080, or go SLI with two Supers? Any advice would be great! Thank you.

Corey

EVGA GEFORCE RTX 2070 Super
Intel Core i7-6700k @ 4.00GHz
32.0 GB of DDR4 RAM
Corsair 800 watt power supply
Windows 10
 
Solution
SLI provides little to no benefit in most games. Some games get a small boost, and a few that are well optimized for SLI get a big one, but a single 3080 will be the better option.

That said, I don't see a reason to worry yet. Once you run into games where you can't get the framerates you want, that's the time to get a new card, not beforehand. You can also likely knock the settings down a little to hit your target framerate; the settings in question usually make little visual difference unless you scrutinize them side by side.
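To put rough numbers on the SLI question, here's a quick back-of-envelope sketch in Python. All of the figures in it are illustrative assumptions, not benchmarks: a second card adds anywhere from roughly 0% to ~70% depending on whether the game has a good SLI profile, while a 3080 is commonly described as roughly 60-70% faster than a 2070 Super at 1440p.

```python
# Back-of-envelope comparison: 2070 Super SLI vs. a single RTX 3080.
# All numbers below are illustrative assumptions, not measured benchmarks.

BASE_FPS = 60.0  # hypothetical 2070 Super framerate in some game

# Assumed SLI scaling: many games get ~0%, a few well-optimized ones ~70%.
sli_scaling = {"no SLI profile": 0.0, "typical": 0.25, "well optimized": 0.70}

# Assumed single-card uplift of a 3080 over a 2070 Super at 1440p
# (~65% is a ballpark, treat it as an assumption).
rtx3080_uplift = 0.65

print(f"Single 3080 (assumed): {BASE_FPS * (1 + rtx3080_uplift):.0f} fps")
for case, scale in sli_scaling.items():
    print(f"2070 Super SLI, {case:>15}: {BASE_FPS * (1 + scale):.0f} fps")
```

Even in the best SLI case you only roughly match the single card, and you take on the heat, driver, and frame-pacing headaches along with it.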
 
Do you think it's time for me to replace my CPU then? I think Cyberpunk did recommend the newest CPUs, but how much better are they?

Depends on the FPS you want. At 4 GHz you still have a fair bit of overclocking headroom, and it's still a pretty good CPU. Except in the most CPU-intensive, heavily multi-threaded games, you probably won't notice much difference between it and a new one, especially if you bring it up to 4.4-4.5 GHz and play at 1440p. Those aren't crazy overclock numbers.
 
Sep 3, 2020
Thanks, bud. I future-proofed my system for the most part and didn't expect Nvidia's next GPU to be such a huge upgrade... Also, the pricing seems more consumer-friendly, so I might just sell my 2070 Super and take the hit.
 

gonesy

Distinguished
Aug 29, 2007
I added another RTX 2070 Super to my computer, and all I got was heat! Even though there is a gap between the cards, the difference in temperature between the two was noticeable, and, as has already been said, the FPS improvement wasn't that great.

If you've got a 2070 Super, I'd hold onto it until, as has also been said, you notice a drop in performance you're not happy with, but I think that's going to be a while yet.
 

groo

Distinguished
Feb 3, 2008
Thanks, bud. I future-proofed my system for the most part and didn't expect Nvidia's next GPU to be such a huge upgrade... Also, the pricing seems more consumer-friendly, so I might just sell my 2070 Super and take the hit.
The concept of "future-proofing" sounds good until you run the numbers. You are far better off buying cheaper stuff more often than blowing a huge wad and sitting on it for a while. You end up with a better machine on average and spend less money doing it.
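Here's a toy model of that argument; the prices, the yearly improvement rate, and the diminishing-returns exponent are all made-up assumptions purely to show the shape of it, not real data.

```python
# Toy model: one flagship card kept 6 years vs. a mid-range card replaced
# every 3 years. All constants below are assumptions for illustration only.

YEARLY_GAIN = 0.25      # assume ~25%/year improvement at a given price point
PRICE_EXPONENT = 0.6    # assume diminishing returns: perf grows slower than price

def relative_perf(price, years_from_now):
    """Rough relative performance score for a card bought at `price`,
    `years_from_now` in the future."""
    return (price ** PRICE_EXPONENT) * (1 + YEARLY_GAIN) ** years_from_now

flagship_now = relative_perf(1200, 0)   # $1200 once, kept 6 years
midrange_now = relative_perf(500, 0)    # $500 now...
midrange_later = relative_perf(500, 3)  # ...and $500 again in 3 years

print(f"Years 0-3: flagship {flagship_now:.0f} vs mid-range {midrange_now:.0f}")
print(f"Years 3-6: flagship {flagship_now:.0f} vs newer mid-range {midrange_later:.0f}")
print("Total spend: $1200 vs $1000")
```

With those hypothetical numbers you spend less in total and still end up with the faster card for the back half of the window, which is the gist of upgrading only when a game actually demands it.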
My history has been to sit on my system until a new game I want comes out that requires an upgrade, then upgrade to a bit above the game's recommended specs. You're already a few steps above CD Projekt Red's recommended video card for 2077.
I've never bothered with as small an upgrade as going from a 2070 to a 3080. By the time the 2070 becomes unacceptable, the 3080 will also be garbage; no DX15 or whatever, after all.
 

specterk

Distinguished
Dec 1, 2011
If you are only interested in playing at 1440p at sane framerates, there is absolutely no need to upgrade beyond a 20-series (and there won't be for a long, long time). If you can maintain your target framerate and resolution without any form of V-Sync/G-Sync active, there's nothing to gain - and if you can't, you're still more likely to benefit from a display upgrade, an overclock, or some well-thought-out driver-level tweaking. That's the short version; if you're interested in some GPU food for thought, you can continue reading as you please.

ENGINEERED OBSOLESCENCE:
There are no obsolete premium (i.e. non-mobile, non-budget series) graphics cards still for sale within the current premium market. It's important to understand that NVIDIA has gone the way of Apple lately, and particularly with its RTX line; they are only producing new product lines with the intention of retiring old (still perfectly viable) lines MUCH sooner than the market would naturally push them out, thus keeping the median prices (and their profit margins) for their GPUs as high as possible. This is a terrible anti-consumer trend that, if not checked, is eventually going to run most of us out of the affordable gaming business as other component manufacturers and vendors follow suit. Don't encourage them by buying a card you don't need!

IT STILL WORKS!:
The current (and next) generation of PC games can be maxed out on properly installed, moderately overclocked 10-series (and even 9-series!) GTX GPUs with frame buffers of at least 4 GB, at 1080p to 1440p, at 60 Hz and higher, with no issues, by a competent user. That last part about competence is pretty important, though; a significant amount of "free" performance will cost a modicum of time to figure out how to attain. Most GTX cards since at least the Fermis, for instance, have been capable of at least a 20-30% overclock (OVER their preconfigured vendor overclocks!) out of the box, without even touching voltage. Better-binned cards can almost always do a 40-50% overclock while still remaining well below voltage and thermal ceilings for added peace of mind, and the really good ones can go higher than that if you know what you're doing (yes, even on air). Before you even consider any GPU upgrade, first see how much free value you can get out of yours!
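If you want to sanity-check an overclock yourself, here's a minimal Python sketch (assuming the nvidia-ml-py / pynvml bindings are installed) that just polls clocks, temperature and power while you run your usual stress test. It doesn't set anything; the point is to watch whether the card holds its boost clocks or starts throttling.

```python
# Minimal GPU monitor using the pynvml bindings (pip install nvidia-ml-py).
# Run your stress test or benchmark in another window and watch the numbers:
# if clocks sag while temperature or power plateaus, you're hitting a limit.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        power = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        print(f"core {core} MHz | mem {mem} MHz | {temp} C | {power:.0f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

The actual clock offsets you'd still set through your vendor tool of choice (MSI Afterburner, EVGA Precision X1, and so on); this just makes it easy to see what the card is really doing under load.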

OVERKILL AND BETTER ALTERNATIVES:
Core and memory speed (i.e. render throughput) have both been ahead of the computational render curve in games for quite some time now. It is only when you combine drastic increases in pixel density (resolution), egregious levels of supersampling styles of AA, unnecessary resource-intensive render synchronization schemes and massively-overkill framerates that your framebuffer requirements begin to skyrocket, and your (considered) personal version of that formula is what should drive your GPU upgrade strategy. Instead of V/Gsync, use scanline synchronization - same or better result as a sync scheme, with none of the performance cost. Instead of AA, get a better display that performs scaling operations locally and with better accuracy - same or better result as higher levels of AA, with none of the performance cost. Instead of hiking up your framerate, take the time to learn how to optimize your drivers (and, for online FPS games, your network) for low latency - same or better result as higher framerates, with none of the performance cost. Most of the "features" you think you need a new GPU in order to attain/maintain are literally just lazy-layperson's options; if you can just easily change your behavior and save a ton of money, why not do it?

FRAMEBUFFER = BOTTOM LINE:
When your preference for higher resolution (biggest impact), higher anti-aliasing (almost equal impact), higher framerate (next biggest) and max detail levels (less impact), combined, causes your GPU to fully utilize its RAM and churn the framebuffer (and assuming you've done a decent overclock already), that's when it makes sense to upgrade. Until then, your CPU is always going to be your (render) bottleneck - and that will increasingly be true as time goes on, given that most of the "shiny" advancements in render technology are, and will continue to be, in the realm of real-time post-processing - which is going to be heavily tied to your CPU for at least as long as games primarily utilize only one or several cores (which, in turn, is going to be for a long time).
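Rather than guessing at that, you can check it directly. A few lines of Python with the same pynvml bindings (an assumption on my part; any VRAM readout, such as the one built into MSI Afterburner, works just as well) will show how close you are to filling the framebuffer while a game is running:

```python
# Check how much of the card's VRAM is actually in use while a game runs.
# Uses the pynvml bindings (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .total / .used / .free in bytes
used_gb = mem.used / 1024**3
total_gb = mem.total / 1024**3
print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GB ({100 * mem.used / mem.total:.0f}% used)")

if mem.used / mem.total > 0.95:
    print("Framebuffer is essentially full; lower settings or an upgrade may be warranted.")

pynvml.nvmlShutdown()
```

Keep in mind that many games allocate more VRAM than they strictly need, so sustained near-100% usage combined with stutter is a better signal than the raw number alone.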

For reference, I'm personally still running a GTX 970 ME (technically only a 3.5 GB card) with a very moderate overclock (1500 core, 4500 mem), and can easily run any high-fidelity current AAA title at 1440p, capped at 60 Hz, with everything but AA maxed (I usually run 2x, and even that is overkill for most titles at 1440p on my monitor). Because I don't need 2160p, and because I prefer to remove latency from my games by correctly configuring GPU driver profiles for those specific games rather than by simply throwing 120/144 Hz at the problem until I can't feel the difference, the only upgrade worth considering in my situation will eventually be an 8 GB card - but it's just not necessary yet. There ARE a few games that can max my framebuffer on ultra at 1440p... but as long as I'm still overshooting my target framerate, that doesn't matter. When I do upgrade, I'm just going to try to snag a cheap used 1070 or 1080, because I know they'll both do 4K at 60 Hz with no issues well into the next several generations of games (and I don't care about ray-tracing gimmicks, which I won't get into).

Just some food for thought. People have been spending way, WAY more money on GPUs than they need to for a long time now.
 

groo

Distinguished
Feb 3, 2008
ENGINEERED OBSOLESCENCE:
There are no obsolete premium (i.e. non-mobile, non-budget series) graphics cards still for sale within the current premium market. It's important to understand that NVIDIA has gone the way of Apple lately, and particularly with its RTX line; they are only producing new product lines with the intention of retiring old (still perfectly viable) lines MUCH sooner than the market would naturally push them out, thus keeping the median prices (and their profit margins) for their GPUs as high as possible. [...]
The engineered obsolescence has to do with chip wafer yields and whatnot. If you are reliably producing 7 nm chips, and are tooled up for them, why waste money producing 22 nm chips, especially if you would have to sell them at a discount?