Cyberpunk 2077 PC Benchmarks, Settings, and Performance Analysis

saunupe1911

5900X and EVGA FTW3 3080 Ultra in my rig. Getting over 70 FPS at 1440p Ultra with full ray tracing and DLSS set to Auto. I think Auto finds a really nice balance between fast- and slow-paced moments. The game looks great!
 

MorganPike

The bugs are real. Like Bethesda-level. Nothing game-breaking so far, but it's just my first couple of hours. I'll give them some time to smooth it out. Runs great at 1440p on my 2070, though.
 
Will you be putting up charts for DLSS without RT (it does sound like you tried it with the 2070 Super)? This seems like the only way many of the lower-end cards will manage good framerates above 1920x1080 without significantly lowering the graphics settings.
I haven't been doing those tests on the initial pass, but I can look into it if there's enough interest. The general rule is that DLSS Quality adds ~50% if you're GPU limited, DLSS Balanced adds ~80%, and DLSS Performance can more than double performance. I find the upscaling artifacts of DLSS Performance quite noticeable at 1080p and even 1440p, though, which is why I've tested at 1080p Quality, 1440p Balanced, and 4K Performance.
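Here's a quick back-of-the-envelope way to apply those rules of thumb: a sketch only, using the rough multipliers above (with 2.1x standing in as an assumption for "more than double"), and valid only when you're GPU limited:

```python
# Rough DLSS fps estimator using the rule-of-thumb multipliers above.
# Only meaningful when GPU limited; the 2.1x for Performance mode is an
# assumed stand-in for "more than double". Render scales are per axis.
DLSS_UPLIFT = {
    "Off": 1.0,
    "Quality": 1.5,       # ~50% faster (renders at ~66.7% of output res)
    "Balanced": 1.8,      # ~80% faster (renders at ~58%)
    "Performance": 2.1,   # "more than double" (renders at 50%)
}

def estimate_fps(native_fps: float, mode: str) -> float:
    """Ballpark fps for a DLSS mode, given native (DLSS off) fps."""
    return native_fps * DLSS_UPLIFT[mode]

# Example: a card managing 40 fps at native 1440p ultra
for mode in DLSS_UPLIFT:
    print(f"{mode:>12}: ~{estimate_fps(40.0, mode):.0f} fps")
```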
 
Dec 10, 2020
I haven't been doing those tests on the initial pass, but I can look into it if there's enough interest. ...

I second his request. :) Also, when you do test DLSS without RT, can you use a mix of lower/higher cards? Throw in a 1080 Ti if possible.
 
Dec 10, 2020
Can you test the DLSS Auto setting as well? Does it generally lean towards better graphics or performance? Would you recommend it over manually selecting the DLSS setting that best fits your resolution/performance needs (e.g., Balanced at 1440p)?
 

ssj3rd

So the most interesting setting for me isn’t here yet, sigh:

1440p - RT Ultra - DLSS Quality
 
Dec 10, 2020
Why is the GTX 1070 struggling so much compared to the 1660 Super? In most other titles they're pretty close, with an edge to the GTX 1070 at 1440p (averaging a 4% lead across 11 titles). In Cyberpunk 2077 that flips into a 21% lead for the GTX 1660 Super over the GTX 1070 at 1440p medium. Downloading the game now, but a little scared to see how my GTX 1080 performs. 👀

It seems the 10-series cards are underperforming, relative to other titles. Even the 1080 Ti goes from normally being 12.5% quicker than a 5700 XT in that same 11-title average to nearly 13% slower. It goes from practically matching the 2070 Super to being nearly 28% slower in Cyberpunk. This is raw rasterization power that isn't being translated, and it makes no sense. The delta between the 10-series and newer cards seems much larger than the norm, and I'm curious as to why.

I get that it's a two-generation-old card, but cards like the GTX 1080 Ti, GTX 1080, and GTX 1070 are still extremely relevant and in wide use, and it's disappointing to see the poor optimization for "older" hardware. Especially when they have 10-series cards like the GTX 1060 on their recommended specs list, you'd assume optimization for "older" tech would be better. Also curious to see how some other older cards run at the lower settings in further benchmarking.
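One footnote on the percentage comparisons above: relative leads are asymmetric, so a 12.5% lead one way doesn't invert to a 12.5% deficit the other way. A quick illustration with made-up fps numbers, not the actual benchmark data:

```python
# Percent leads are asymmetric: if card A is 12.5% faster than card B,
# then card B is only ~11.1% slower than card A. Fps values here are
# hypothetical, purely to show the arithmetic.
def lead(a: float, b: float) -> float:
    """Percent lead of card A over card B."""
    return (a / b - 1) * 100

fps_a, fps_b = 90.0, 80.0
print(f"A over B: {lead(fps_a, fps_b):+.1f}%")  # +12.5%
print(f"B over A: {lead(fps_b, fps_a):+.1f}%")  # -11.1%
```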
 
Why is the GTX 1070 struggling so much compared to the 1660 Super? ...
Different GPU architectures. Pascal was released in 2016, and it doesn't have all of the updates present in Turing. Concurrent FP + INT may be part of it, and Turing is probably just better with DX12 code as well. It might also just be optimizations targeting Turing and Ampere more than Pascal, but there are other cases where the GTX 1660 Super does better relative to the GTX 1070. It's not the usual case, but it's also not impossible, particularly with a game that's using a lot of high-end visual techniques.
The 1080 Ti doesn't have Tensor cores, so you can't test it for DLSS.
Correct! I could test it with FidelityFX CAS, though I need to do some image quality comparisons first before deciding if CAS is worth using. Probably, but don't expect miracles. Also, the 1080 Ti, 1080, 1070, and 1060 6GB have been added to the charts, so I'm basically done with Pascal for a bit. I need to do some more 20-series and RX 5000/500 testing next, I think. We'll see where the voting stands when I wake up. https://s.surveyplanet.com/TnEtEkm3e
 

VforV

I cannot believe the GTX 1660 Super is equal to the GTX 1080 in this game... or how badly the GTX 1080 runs, to put it the other way around.

None of the older games have this scenario, and none of the newer games have it either.
AC Valhalla, WD Legion, RDR2, Death Stranding, Borderlands 3, Gears 5, etc. - in all of them, the GTX 1080 is always comfortably ahead of the GTX 1660 Super/Ti.

The GTX 1070 & GTX 1080 Ti are also down from where they are supposed to be...

They completely **** on the Pascal generation. Way to go, Nvidia (with CDPR support), on "forcing" Pascal users to upgrade to Ampere!
 

ssj3rd

Still no 1440p - RT Ultra - DLSS Quality testing? <Mod Edit> me
 

dpeter45

Why is the GTX 1070 struggling so much compared to the 1660 Super? ...

Nvidia and CDPR colluded to slow down older but still good cards to sell newer cards that people don't need. NV does shady stuff like this all the time. They probably rolled some crap into the drivers to nerf the 10-series.

In general, I would say this game is horribly optimized. Something is very wrong when a $1,500 state-of-the-art card is struggling to max out the game. I wish companies that don't know how to program engines would just license engines from companies that do (id, Valve, Epic). Doom Eternal is a perfect example of how a game should look and run. But id actually knows how to do engines.

Basically, if you make a game and it doesn't run flawlessly on the top 10 cards in the Steam hardware survey, you have failed.
 

magbarn

I cannot believe the GTX 1660 Super is equal to the GTX 1080 in this game... or how badly the GTX 1080 runs, to put it the other way around. ...
Nvidia's sandbagging has begun with Pascal...
Yeah, the performance delta between the 2080 Ti and 1080 Ti is much, much bigger than before...
 

dpeter45

Nvidia's sandbagging has begun with Pascal...
Yeah, the performance delta between the 2080 Ti and 1080 Ti is much, much bigger than before...

And they think pulling stunts like this will get people to buy new video cards. If anything, this and their ridiculous pricing on the 20- and 30-series might be enough to make me switch to consoles. You get some powerful hardware for your money with the PS5 and Series X.
 
I've got an R5 2600X, 16GB RAM @ 3200, and a hybrid 1080 Ti at 4K. I'm overclocking the 1080 Ti by +70 on the core (2,050MHz) and +500 on the memory (6,003MHz), and I'm getting 32-34 fps on medium at 4K, which is very playable and smooth enough. Since it's an RPG and not an FPS, I'll wait until next-gen cards come out.

Edit: HAGS ON
 

MorganPike

Nvidia and CDPR colluded to slow down older but still good cards to sell newer cards that people don't need. NV does shady stuff like this all the time. They probably rolled some crap into the drivers to nerf the 10-series.

Easy enough to test: run some older games on it that have performance videos on YouTube. I don't doubt it's happening, but as I said, it's not that hard to show if it is.

I wish companies that don't know how to program engines would just license engines from companies that do (id, Valve, Epic).

Would have been nice to see this in Unreal. As it is, it feels like I'm playing Deus Ex again, and I hated that engine, whatever it was.
 

seymoorebutts

Any chance of running some tests at 21:9 resolutions? I'm currently getting ready to procure parts for my first build since 2013, as a GTX 770 isn't cutting it these days. I'm hoping Nvidia will release a 3080 Ti in January, and I think it might be the sweet spot for this game at 3440x1440.
 
Any chance of running some tests at 21:9 resolutions? ...
I may have to carry a PC downstairs -- or carry a monitor upstairs -- to do ultrawide testing. I've got one, but not in my office. Usually, ultrawide is about 15% lower performance than the equivalent widescreen resolution (give or take 5%).
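That lines up with the pixel counts: 3440x1440 pushes roughly a third more pixels than 2560x1440, while the observed hit is only ~15%, so scaling is sub-linear. A quick sanity check on the arithmetic (my numbers, not additional benchmark data):

```python
# Pixel-count comparison: 21:9 ultrawide vs. its 16:9 counterpart.
# Performance typically drops less than the pixel count grows.
uw = 3440 * 1440   # 4,953,600 pixels
ws = 2560 * 1440   # 3,686,400 pixels
print(f"Ultrawide renders {uw / ws - 1:.1%} more pixels")  # ~34.4%
# Observed hit is ~15% (give or take 5%), well under the 34% pixel increase.
```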
Nvidia and CDPR colluded to slow down older but still good cards to sell newer cards that people don't need. ...
The idea that a game must do 60 fps at 1080p medium or high or whatever on whatever hardware people arbitrarily define is ludicrous. As is the idea that Nvidia is intentionally trying to reduce performance on older GPUs. I've looked into that before, and never found a clear instance where a game's performance inexplicably dropped on older generation hardware (and was never fixed with an updated driver) -- a major game overhaul that makes it more taxing obviously doesn't count. That doesn't mean new games won't perform worse relative to old games, but that's because the technology marches onward.

Turing has concurrent FP and INT support, which can be a big deal. In fact, it is a big deal in a lot of games. Older games might only improve by 10-15 percent thanks to concurrent FP + INT, but some games run up to 35 percent faster. The more complex shader code becomes, the more likely the FP + INT support will help. Now add in the fact that this is a DX12 game, and Pascal was never quite as good at generic DX12 code as AMD's GPUs or Turing. Turing has architectural updates that help it do better in DX12 mode, in part because DirectX Raytracing requires DX12! (Except for the oddball Crysis Remastered.)

Those two aspects are more than enough reason for Turing-generation GPUs to perform better than Pascal-generation GPUs. Could further code optimizations improve performance on Pascal? Absolutely. But each GPU architecture entails a different sort of code to reach maximum performance, and at some point developers have to draw the line. Could Nvidia have tried to help CDPR get Pascal GPUs to work better? Yes. Maybe they even did, but the mix of shaders and graphics effects is just too much for the older GPUs. It happened with the GTX 900 series, and the GTX 700 series, and so on.
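A toy model shows why concurrent FP + INT matters. This is my own simplification with an illustrative instruction mix (Nvidia has cited roughly 36 INT instructions per 100 FP as a typical shader mix), so treat the speedup as an idealized upper bound:

```python
# Toy issue-time model for concurrent FP + INT execution.
# Pascal runs FP32 and INT32 through the same cores, so the two
# workloads serialize; Turing has separate INT32 units, so INT work
# can overlap FP work. Instruction mix below is illustrative.
fp_ops, int_ops = 100, 36

pascal_time = fp_ops + int_ops        # serial issue: 136 time units
turing_time = max(fp_ops, int_ops)    # concurrent issue: 100 time units

speedup = pascal_time / turing_time - 1
print(f"Idealized speedup for this mix: {speedup:.0%}")  # ~36%
```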
 
And they think pulling stunts like this will get people to buy new video cards. ... You get some powerful hardware for your money with the PS5 and Series X.

The PS5 and Series X have performance somewhere between the RTX 2060 Super and the RTX 2070 Super. Unless your current GPU is slower than that, buying a new console is going to leave you sorely disappointed.
 
Dec 10, 2020
I may have to carry a PC downstairs -- or carry a monitor upstairs -- to do ultrawide testing. ... Turing has concurrent FP and INT support, which can be a big deal. ... Those two aspects are more than enough reason for Turing-generation GPUs to perform better than Pascal-generation GPUs. ...


It would be cool to see a guide to which settings affect Pascal more and result in a relatively bigger performance hit due to its limitations. Also, thanks for throwing in the GTX 1080 and then the GTX 970 per my older-hardware request! Appreciate that!

Following that theme, would you mind testing older AMD hardware like an R9 290X or 390X for comparison? You mentioned how AMD hardware tends to perform better in DX12, so it would be interesting to see if that remains true with older AMD hardware as well. Considering older AMD tech still fares relatively well in low-level APIs, it would be interesting to see if the "fine wine" aging continues in this title, as we see in, for example, Doom 2016 and Eternal.

Thanks again! Obviously still really disappointed in my GTX 1080; I was hoping it would make it through one last major title release and at least keep up with the 2060, but it is what it is, I guess. I've had this thing for over 4 years and am willing to replace it and upgrade, I just wish there were 30-series cards actually available to replace it with. 😔🙃
 
Despite everyone's whining about …
  1. Cyberpunk pre-release performance
  2. Your article being "clickbait" because the original performance comparisons were based on pre-release code
  3. Cyberpunk post-release performance
  4. CDPR making an awful game because it doesn't run well on my (insert GPU here)
  5. Nvidia making horrible GPUs because Cyberpunk doesn't run well on my (insert GPU here)
… I just want to say thanks for all the work you're putting in, answering everyone's criticisms and updating the article with useful, relevant information. Thank you for trying to include hardware combinations that are still relevant to people, and even moving stuff around to get it done.