Cyberpunk 2077 PC Benchmarks, Settings, and Performance Analysis

Hey CodgerFace!

So, RT Ultra is its own preset. It's everything from Ultra, plus RT Shadows, RT Reflections, and RT Lighting at Ultra. (RT Medium is RT Shadows and RT Lighting at Medium, but no RT Reflections.) I'm working on a settings analysis piece, but I can say this right now:
Turn off Chromatic Aberration. It sucks. It makes the whole image way too blurry! Also, the two Volumetric settings (mostly Fog, though) don't seem to do much visually and can be turned down for a 10-15 percent performance boost. And RT Shadows aren't really needed, nor is RT Lighting at Ultra, so RT Lighting at Medium with regular shadows is a decent compromise.

OK, understood. Thanks for the reply!

From your eyes-on experience, would you play at 1440p with DLSS Balanced, or is it worth the bump to DLSS Quality? And if DLSS Quality is 'recommended', did you experience any VRAM issues with the 3070 at that setting at 1440p (similar to how the lows for the 3070 dropped below those of the 2080 Ti at 4K)?

Just asking because of the "awkward" situation the 3070 seems to be in with the ability to deliver decent fps with RT on, but possibly being on the knife-edge with regard to VRAM capacity depending on the DLSS setting used in conjunction with RT. And you 'know' me, I'm not a 'bro, you nEeD 20GB VRAM or ur casual' person. Just curious for the sake of giving good buying advice to friends and randoms trying to build PCs for CP2077 (and "next gen" games). Cheers
 
Yeah, I didn't see any VRAM issues with the 3070, but I didn't specifically try to create VRAM issues. AFAIK, DLSS Quality vs. DLSS Balanced shouldn't really affect VRAM use much. Balanced renders at a lower resolution and thus runs faster, which is as expected. Is the quality trade-off worth it? That's harder to say. There are definitely scenes where DLSS doesn't handle the textures well, but they're the exception rather than the rule. I do have one area of my benchmark sequence that has those textures, though, and now I notice the odd rendering EVERY TIME when I test with DLSS. LOL
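For a rough sense of the resolution gap, here's a quick back-of-the-envelope sketch using the commonly cited per-axis DLSS 2.0 scale factors; treat the exact numbers as approximate, since they can vary slightly by title.

  // Approximate internal render resolution per DLSS mode at a 2560x1440 output.
  // The per-axis scale factors are the commonly cited DLSS 2.0 values, so this
  // is an estimate only, not something pulled from the game itself.
  #include <cstdio>

  int main() {
      const int outW = 2560, outH = 1440;
      struct Mode { const char* name; double scale; };
      const Mode modes[] = {
          {"Quality",     0.667},
          {"Balanced",    0.580},
          {"Performance", 0.500},
      };
      for (const Mode& m : modes) {
          std::printf("%-12s -> %4.0f x %4.0f internal\n",
                      m.name, outW * m.scale, outH * m.scale);
      }
      return 0;
  }

That works out to roughly 1707x960 internally for Quality versus about 1485x835 for Balanced at 1440p, which is where the speed difference comes from.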

Realistically, people should plan on a few things:
  1. No 60 fps with ray tracing maxed out at 4K. (3090 can get there with a few tweaks, but everything else needs more effort.)
  2. DLSS Auto is a reasonable compromise if you enable RT.
  3. I think the best suite of settings for quality (with an RTX card) is something like DLSS Auto, Volumetrics at Medium, Chromatic Aberration off, Anisotropy at 4, Screen Space Reflections off (leave it to RT, or at most go with Low), RT Reflections on, RT Lighting at Medium, and RT Shadows off.

That should give you around 50-ish fps on a 3080 at 1440p (with a sufficiently fast CPU), and over 30 fps on any other RTX card (maybe the RTX 2060 will fall short).
 

Got it. Appreciate the feedback. Have a good one.
 
I know this is an enthusiast site, but I would specifically like to see this game's performance using the GTX 1060 6GB (arguably the most popular GPU on the planet, and also in the official system reqs) under different configurations: with a 5600X, with a stronger CPU, with 3200 CL16 32GB single-rank kits, with 32GB dual-rank kits, with 3600 CL18 kits, with mitigations disabled and enabled. I mean, go crazy. Thanks!
 
First, bugs: understand that despite this game officially getting its first trailer ~7 years ago, CDPR was known for The Witcher series. Do recall The Witcher 3 did not have a "pretty" launch from the start. In fact, it was worse; the game would actually hard crash, and it took weeks of patching until it became what would eventually be the 'crown jewel'. Take that TW3 reference and apply it here, because it's not simply an "excuse", it's quite literally an explanation: although some may have encountered odd things during their sessions, it has in NO WAY been as bad as TW3.
I don't think that's entirely accurate. The general consensus seems to be that Cyberpunk is very buggy, and performance, visuals and stability on the base consoles make it nearly unplayable on those platforms, while I didn't see feedback that bad around The Witcher 3's launch. And while The Witcher 3 may have been quite demanding on hardware at the time of its release, low frame rates tend to be much less of an issue in games played from a third-person perspective. Ultimately, The Witcher 3 got a 90+ Metascore on all platforms, whereas Cyberpunk is currently seeing Metascores in the low 50s on consoles.

Now sure, those same consoles are quite a bit older now, but it might have been best if CD Projekt had just dropped those older platforms once it became clear that the game likely wouldn't run well on them, and instead focused their resources on fixing the game's numerous bugs and getting the "next gen" improvements ready for launch. The only reason they didn't is that they knew they wouldn't be getting a majority of those 8 million pre-orders.

As for giving the game "weeks of patching", it was already supposedly feature complete over a year ago, and they've been working on fixing and optimizing it ever since. I imagine things will improve to some extent in the coming weeks, but it's still likely to be a buggy, janky mess for months to come.

In reality, taking what a small studio has had a track record with and now truly jumping in a new direction, with new everything... to expect zero issues is quite unrealistic.
They apparently had around 500 people working on the game. That's not exactly what I would call a "small studio". And even The Witcher 3 had around half that.

Probably more like 3-4 years. They had to get TW3 and its expansions done first.
Definitely over 4 years. They apparently had a small number of people working on initial plans for the game following its announcement in 2012, but started devoting full resources to developing it after the Witcher 3 expansions released, which would have been over 4.5 years ago.

I know this is an enthusiast site, but I would specifically like to see this game's performance using the GTX 1060 6GB (arguably the most popular GPU on the planet, and also in the official system reqs) under different configurations: with a 5600X, with a stronger CPU, with 3200 CL16 32GB single-rank kits, with 32GB dual-rank kits, with 3600 CL18 kits, with mitigations disabled and enabled. I mean, go crazy. Thanks!
With a 1060, performance is going to be graphics limited practically the entire time. So it shouldn't matter much what CPU and RAM one is using, as long as they are reasonably modern and have a sufficient number of threads to go around; the results should be nearly identical to what's shown for that card in the article. So most likely framerates in the 30s for 1080p at medium settings, or in the 60s for 720p at low settings, or somewhere in between, depending on how much one values frame rates over visuals.
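If it helps to see the reasoning, here's a crude bottleneck model; the frame-time numbers below are invented purely for illustration, not measured Cyberpunk 2077 results. The point is only that whichever of the CPU or GPU takes longer per frame sets the frame rate.

  // Crude bottleneck model: frame rate is limited by the slower of the CPU and
  // GPU per-frame times. The millisecond figures are hypothetical, chosen only
  // to illustrate a GPU-limited case like a GTX 1060 in this game.
  #include <algorithm>
  #include <cstdio>

  int main() {
      const double gpu_ms      = 30.0;  // hypothetical 1060 frame time at 1080p medium
      const double cpu_ms_slow = 12.0;  // hypothetical mid-range CPU + RAM
      const double cpu_ms_fast =  8.0;  // hypothetical faster CPU + faster RAM

      const double fps_slow = 1000.0 / std::max(gpu_ms, cpu_ms_slow);
      const double fps_fast = 1000.0 / std::max(gpu_ms, cpu_ms_fast);

      // Both come out to ~33 fps: while the GPU is the long pole,
      // a CPU or RAM upgrade barely moves the result.
      std::printf("slower CPU: %.1f fps, faster CPU: %.1f fps\n", fps_slow, fps_fast);
      return 0;
  }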
 
Exactly. This was more than expected.
It's history repeating itself. With each major architecture change or new console launch, devs like CDPR push it to the limits. They did it with The Witcher games and are doing it again.

Now we have the result: only with this year's highest-end hardware can we enjoy next-gen features like ray tracing at their fullest.

Well, actually you need next-gen highest-end hardware to enjoy this-gen features like ray tracing at their fullest!
The 3090 is not fast enough for it. But it is a good example of what low-end to mid-range ray tracing will look like at low resolution...
In 2023 we will get the first GPUs that can run this game as it "should be played!"
 
Right now I think no RT and DLSS Quality is the most interesting option to me. Fps tanks pretty hard with RT on; not that it doesn't look good, but the performance hit isn't worth it. Maybe 2022 GPUs will handle it, but for now RT is a niche, not the norm.
 

No RT and DLSS Quality is running great for me at high settings (not ultra) at 1440p on an RTX 2070. Looks great, runs great. RT isn't worth the hit. I'm sure it will be some day.
 
Cyberpunk 2077 is the most anticipated game of 2020, and it supports ray tracing and DLSS. The release version also adds support for FidelityFX CAS, in case you don't have an RTX card.

I understand 99.9999999% will NOT do this, but I wonder how the game handles dual GPUs. I know the RTX 3090 will SLI, but I'm not sure if the 5900 XT will. It would just be a nice thing to see how the game does or does not handle SLI and what it does to performance.
 
It doesn't utilize multi-GPU at all. Because it's a DX12 engine, any multi-GPU support needs to be explicitly handled by the game developers. Obviously, CD Projekt Red has much bigger things to worry about right now. Any difference seen in multi-GPU testing elsewhere would be due to the vagaries of Windows and having multiple GPUs available, plus margin of error.
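For anyone wondering what 'explicitly handled' means in practice, here's a bare-bones sketch of DX12 explicit multi-adapter; this is illustrative only, not code from CDPR's engine or any shipping game. The application has to enumerate the adapters itself, create a separate device for each GPU it wants to use, and then divide and synchronize the rendering work between them on its own.

  // Bare-bones DX12 explicit multi-adapter sketch (Windows/MSVC only): enumerate
  // the adapters and create a D3D12 device per GPU. Splitting the actual rendering
  // work across those devices is entirely up to the application, which is why
  // implicit SLI-style scaling doesn't just happen under DX12.
  #include <dxgi1_4.h>
  #include <d3d12.h>
  #include <wrl/client.h>
  #include <vector>
  #include <cwchar>
  #pragma comment(lib, "dxgi.lib")
  #pragma comment(lib, "d3d12.lib")

  using Microsoft::WRL::ComPtr;

  int main() {
      ComPtr<IDXGIFactory4> factory;
      if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

      std::vector<ComPtr<ID3D12Device>> devices;
      ComPtr<IDXGIAdapter1> adapter;
      for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
          DXGI_ADAPTER_DESC1 desc = {};
          adapter->GetDesc1(&desc);
          std::wprintf(L"Adapter %u: %ls\n", i, desc.Description);

          ComPtr<ID3D12Device> device;
          if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                          IID_PPV_ARGS(&device)))) {
              // From here on, the game itself would have to split frames or passes
              // across these devices and copy the results between them.
              devices.push_back(device);
          }
      }
      std::wprintf(L"Usable D3D12 devices: %u\n", static_cast<unsigned>(devices.size()));
      return 0;
  }

That last part, splitting and synchronizing the work, is the hard bit, and it's why very few DX12 titles bother with multi-GPU at all.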
 
I'd suggest doing a test with a single CPU and GPU and various memory speeds and channel configurations.
When I first played Cyberpunk 2077 I had dual-channel DDR4-3200, but now I'm using single-channel memory only, and the experience is awfully bad: fps drops, stutters, and textures loading in on screen (like on the PS4).
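For what it's worth, the back-of-the-envelope math lines up with that experience: dropping from two channels to one halves the theoretical peak bandwidth. These are peak figures only; sustained bandwidth in practice will be lower.

  // Theoretical peak DDR4 bandwidth: transfers per second x 8 bytes per 64-bit
  // channel x number of channels. Real-world sustained bandwidth is lower.
  #include <cstdio>

  int main() {
      const double transfers_per_sec = 3200e6;  // DDR4-3200
      const double bytes_per_transfer = 8.0;    // one 64-bit channel moves 8 bytes per transfer
      for (int channels = 1; channels <= 2; ++channels) {
          const double gbps = transfers_per_sec * bytes_per_transfer * channels / 1e9;
          std::printf("%d-channel DDR4-3200: %.1f GB/s theoretical peak\n", channels, gbps);
      }
      return 0;
  }

So roughly 51.2 GB/s dual-channel versus 25.6 GB/s single-channel, which matches the stuttering and slow texture loading described above.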