News: Cyberpunk 2077 PC Benchmarks, Settings, and Performance Analysis


seymoorebutts

Distinguished
Nov 6, 2013
Someone was asking about ultrawide testing. I can't remember if they wanted it tested on RTX 3080 or some other GPU, but I tested it on a 2060 Super (with an i7-9700K, because that's what was hooked up to the 21:9 display). Here are the results:

[Attached chart: 2060 Super ultrawide vs. widescreen results at Ultra and RT Ultra]

Perhaps not the best way to show performance differences, but making a 'better' chart would take more time and I decided not to bother. Basically, at Ultra (less GPU demanding), UW runs ~17% slower than WS. At RT Ultra (heavier GPU load), it's ~20% slower at 1080p and ~24% slower at 1440p. That last one is almost perfect scaling with the number of pixels to draw, which isn't really surprising as with ray tracing every pixel matters. "No pixel left behind" or something. LOL
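For anyone who wants to sanity-check that "scaling with the number of pixels" bit, here's a rough Python sketch (purely illustrative, not from the post) of the slowdown you'd expect from pixel counts alone, assuming the load is entirely GPU-bound:

def pixels(width, height):
    return width * height

def expected_slowdown(widescreen, ultrawide):
    """Fraction of fps lost moving from the 16:9 resolution to the 21:9 one,
    assuming frame time scales linearly with the number of pixels rendered."""
    ratio = pixels(*ultrawide) / pixels(*widescreen)
    return 1.0 - 1.0 / ratio

# Standard 16:9 vs. 21:9 pairs at the same vertical resolution:
print(f"{expected_slowdown((2560, 1440), (3440, 1440)):.1%}")  # ~25.6% slower
print(f"{expected_slowdown((1920, 1080), (2560, 1080)):.1%}")  # ~25.0% slower

The ~24% measured at RT Ultra 1440p lands close to that ideal ~25.6%, while the lighter Ultra preset (and 1080p) falls short of it, presumably because the GPU isn't the only limit there.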


In general I would say this game is horribly optimized. Something is very wrong when a $1,500 state-of-the-art card is struggling to max out the game. I wish companies that don't know how to program engines would just license engines from companies that do (id Software, Valve, Epic). Doom Eternal is a perfect example of how a game should look and run. But id actually knows how to do engines.
Doom Eternal is a fast-paced, mostly linear FPS, though, whereas this is a slower-paced open-world game set in a densely packed city, so they aren't particularly comparable in what they're trying to accomplish, or in what their engines need to be capable of. Not many open-world games have used id Tech. There was RAGE the better part of a decade ago, but even though id owns the RAGE franchise, the sequel wasn't developed on their own engine, instead using Avalanche Studios' proprietary Apex engine. And even id Tech derivatives can perform poorly: Dishonored 2 was built on the Void Engine (based on id Tech 5) and was widely considered "poorly optimized", and while it may have been less linear than something like DOOM, even it wasn't an open-world game.

Ray-Tracing isn't a "next gen" feature. It's been out for 2 years and we are on the second generation of supporting hardware.
It's just that the hardware was/is overpriced and developer support for ray tracing has been... basically nonexistent. We are up to what now, 20ish total PC games that support it? I would rather get what I pay for with a $400 PS5 than get way less than what I pay for with a $700+ RTX card (and I even already have the $1200+ in other PC hardware needed to fully use that card).
The existing consoles didn't have RT hardware, so it's a "next-gen" feature as far as consoles are concerned. Between the new consoles, AMD's new cards, and a wider range of Nvidia cards offering at least some level of RT acceleration, it should make more sense for developers to look into supporting it. And the graphics hardware in a $400 PS5 is weaker than what's in a $400 3060 Ti, so you certainly don't need a more expensive card to exceed that level of graphics performance.

It makes sense if NV is pulling some <Mod Edit>. A card that is normally on par with a 1070 is suddenly faster than a 1080 or 1080Ti. NV is deliberately gimping perfectly good 10 series cards. And during a pandemic when people are broke and stuck inside on lockdown with not much to do besides play video games. What a great company. This needs to be investigated. And I hope someone files a class action lawsuit like they did with the 970 VRAM thing.
This is likely just down to the compute capabilities of Pascal cards not being as strong, at least compared to Nvidia's newer cards or AMD's competing cards of the time. When those cards were new it didn't make as much of a difference, as most games were still targeting DX11, but even then there were some games where AMD's cards managed to perform nearly a tier above their similarly priced competition. Look at the Battlefield 1 (DX12) results in Tom's Hardware's RX 580 review, for example. While the RX 580 and GTX 1060 6GB performed fairly similarly in most titles at the time, in that game, at those settings, the RX 580 performed significantly better, with the RX 570 4GB performing on par with the 1060 6GB.

That's a lot like what we are seeing here. Vega 11 integrated graphics are competing with the 1050, the RX 570 4GB is competing with the 1060 6GB, the RX 580 is competing with the 1070, and Vega 64 is nearly at the level of a 1080 Ti. That's not necessarily because AMD's GCN architecture was "better", as it was certainly much less efficient, but in order to compete with Nvidia's cards at the time, they had to provide more compute performance at a given price level. And with the consoles also using AMD graphics hardware, we might even be seeing the effects of the game engine being optimized to make the most of that hardware.

Look at the RTX scores of the 2080 Super vs. the 3060 Ti at 1080p; it's hilarious how the maximum fps for the 2080 Super is the minimum for the 3060 Ti.
What you are looking at there is likely the slightly better RT and DLSS performance making a difference, rather than Ampere-specific optimizations. Note that with RT and DLSS disabled, the two cards perform nearly identically, within a couple percent of one another. Only with those enabled does the 3060 Ti pull ahead of the 2080 SUPER by around 15%, allowing it to come within 5% of the 2080 Ti. I think it's a similar situation in Control, with the 3060 Ti getting nearly 15% more performance than the 2080 SUPER with RT cranked up and DLSS enabled. That's probably a good sign for those getting 30-series cards, in terms of what relative performance might look like in demanding ray-traced games that need DLSS to reach reasonable framerates. The performance hit from enabling RT is still huge, but not quite as bad with the new cards.

Twas I! Ideally, yes, I would have liked to see 3080 numbers, as I plan on building with an anticipated 3080 Ti, but thank you for taking the time to test UW in the first place! With other benchmarks published online and the scaling shown in your numbers, it definitely helps me guesstimate what an RTX 3080/3080 Ti might pull at 3440x1440.

I know that "4K" has become the target resolution due to consoles and the majority of consumed media being displayed in 16 : 9, but I think that UltraWide Resolutions are a vastly underappreciated advantage for PC players! At monitor display sizes, I would gladly grab some more cinematic real estate than, in my opinion, less meaningfully try to push a few more pixels per inch. The other upside is ultrawide displays being easier to drive than 4K widescreens! (At least at 21 : 9) For a lot of games, that might mean averaging 120 fps vs 90 or less. Looking at your own benchmarking, this game ain't locking 60 fps with everything cranked on an RTX 3080 at 4K, but in UW it will be damn close.
As a general rule of thumb for estimating ultrawide performance, it's usually roughly in-between the nearest higher and lower widescreen resolutions. So, 1440 ultrawide tends to perform in-between 1440p and 4K, and 1080 ultrawide tends to perform in-between 1080p and 1440p. Average those two results, and you should get pretty close, so long as the lower resolution isn't significantly CPU-limited. Based on that, I would expect a 3080 at 1440 ultrawide with ultra settings (no RT, no DLSS) to deliver nearly 60fps on average, with minimums dropping to around 50fps, at least within the area of the game they tested. For raytracing, the current charts use different DLSS settings for different resolutions, making that way of estimating a little less straightforward. Without using DLSS to recover some performance, we would see framerates roughly cut in half, though DLSS balanced mode would recover most of that performance loss, likely bringing average frame rates back into the mid-50s, with minimums probably in the mid-40s.
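If it helps, that rule of thumb boils down to a throwaway Python helper (my own illustrative sketch, with made-up numbers):

def estimate_ultrawide_fps(fps_lower_res, fps_higher_res):
    """Estimate 21:9 performance from the benchmarked 16:9 results at the
    nearest lower and higher resolutions (e.g. 1440p and 4K for 3440x1440).
    Only meaningful if the lower resolution isn't significantly CPU-limited."""
    return (fps_lower_res + fps_higher_res) / 2

# Hypothetical numbers purely to show the arithmetic:
print(estimate_ultrawide_fps(75.0, 45.0))  # -> 60.0 fps estimate for 3440x1440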

As for the 3080 Ti, I wouldn't expect much more performance out of it. The 3090 is only around 10% faster than the 3080 here with RT and DLSS active, and the 3080 Ti will almost certainly be at least a bit weaker than that, and will most likely cost at least a couple hundred dollars more than a 3080. Even with that card, performance at 1440 ultrawide with RT and DLSS balanced mode enabled likely still won't average over 60 fps, and will still dip to around 50fps. Maybe DLSS performance mode could result in a 60+ fps experience most of the time, but it will probably look a little blurrier too.
 

Sleepy_Hollowed

Distinguished
Jan 1, 2017
I did some tests at home with the 1660 Super and the GTX 1080, and the difference is staggering. The article is accurate, and it's perhaps something to keep in mind with Nvidia in the future, since they still seem to rely heavily on per-game driver optimizations, which seem to dry up after about 3-4 years.

I'm jumping ship to AMD's GPUs, since ray tracing isn't ready for games without considerable effort, nor will it be for at least five more years.
 

Deleted member 14196

Guest
Yeah, well, the AMD 6800 XT can't even run it very well; just check out the thread that was just created on this forum.
 

Deleted member 14196

Guest
And they think pulling stunts like this will get people to buy new video cards. If anything this and their ridiculous pricing on the 20 and 30 series might be enough to make me switch to consoles. You get some powerful hardware for your money with the PS5 and Series X.
This is another major reason I decided to game only on consoles. I'm not paying those prices for cards that you can't even buy anymore; they were ridiculous to begin with, and now they're super ridiculous. I will never pay that price, not for any game.
 

Bamda

Honorable
Apr 17, 2017
Just for giggles I dropped down to 1280x720 at the Low preset to see how bad the game looks. IMHO, it does not look that bad at all. Obviously you want to enjoy the game at a higher resolution if possible, but if you have an older PC, the game is still enjoyable. It looks decent at that resolution, unlike what I have seen on consoles.
 

mathesar

Distinguished
Jul 13, 2005
As a 2070 (non-Super) owner, I can confirm the benchmark result is spot on at 1080p RT Ultra / DLSS Quality. I set the framerate cap at 40 in-game and actually got used to it; it feels a lot better than capping at 30 fps.
 

seymoorebutts

Distinguished
Nov 6, 2013
As a general rule of thumb for estimating ultrawide performance, it's usually roughly in-between the nearest higher and lower widescreen resolutions. So, 1440 ultrawide tends to perform in-between 1440p and 4K, and 1080 ultrawide tends to perform in-between 1080p and 1440p. Average those two results, and you should get pretty close, so long as the lower resolution isn't significantly CPU-limited. Based on that, I would expect a 3080 at 1440 ultrawide with ultra settings (no RT, no DLSS) to deliver nearly 60fps on average, with minimums dropping to around 50fps, at least within the area of the game they tested. For raytracing, the current charts use different DLSS settings for different resolutions, making that way of estimating a little less straightforward. Without using DLSS to recover some performance, we would see framerates roughly cut in half, though DLSS balanced mode would recover most of that performance loss, likely bringing average frame rates back into the mid-50s, with minimums probably in the mid-40s.

As for the 3080 Ti, I wouldn't expect much more performance out of it. The 3090 is only around 10% faster than the 3080 here with RT and DLSS active, and the 3080 Ti will almost certainly be at least a bit weaker than that, and will most likely cost at least a couple hundred dollars more than a 3080. Even with that card, performance at 1440 ultrawide with RT and DLSS balanced mode enabled likely still won't average over 60 fps, and will still dip to around 50fps. Maybe DLSS performance mode could result in a 60+ fps experience most of the time, but it will probably look a little blurrier too.

By the time I'll be playing the game, it will most likely be with "optimized" PC settings à la Digital Foundry or the like, and after (hopefully) several patches for performance and stability. With DLSS, I think at or near 60 fps is pretty reasonable. And I'm not purchasing a 3080 Ti specifically for this game; I'm looking for some extra VRAM to future-proof a bit. I think 10GB on a 3080 is currently pretty sufficient, but we are starting to see some games approach that limit.
 

SlickDizzy

Distinguished
Sep 13, 2006
Is there any chance we will see some CPU tests sooner rather than later? It's emerging that the game has major performance deficits on AMD CPUs, apparently because it's coded to use only physical cores on AMD, while it can use all logical cores on Intel.

I have an RTX 3080 and a Ryzen 5 3600, and my performance has been disappointing in areas, so I'm very curious to see what CPU scaling looks like, especially after learning about the Ryzen issues...

https://www.reddit.com/r/Amd/comments/kbp0np/cyberpunk_2077_seems_to_ignore_smt_and_mostly/
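For anyone who wants to eyeball this on their own machine, here's a rough psutil-based sketch (my own diagnostic idea, not an official tool or anything from the linked thread) that samples per-core load while the game is running. If only about the physical-core count of logical cores shows meaningful load on a Ryzen system, that would be consistent with the scheduling behavior described above:

import psutil

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)
print(f"{physical} physical cores, {logical} logical cores")

# Average per-core utilization over a few seconds while the game is running.
loads = psutil.cpu_percent(interval=5, percpu=True)
busy = sum(1 for pct in loads if pct > 50)
print(f"{busy} of {logical} logical cores above 50% load")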
 

Deleted member 14196

Guest
So basically it's not worth playing, and these guys really don't have a clue as to what they're doing.
 

Zerk2012

Titan
Ambassador
I can confirm that your original listings were about spot on.

GTX 1070, 1440p medium settings, getting 31 to 36 fps.
10600K at 4.8GHz on all cores, game loaded on an SSD, 32GB of memory. I have not checked the memory usage yet, but will update.

EDIT: Even with the low fps I'm having no stuttering problems; it's really smooth, which is a surprise.

It's bad that you really can't buy a video card right now. I'm waiting on a 3070, 3070 Ti, or 3080 to come in stock, or for my brother to get one so I get his 2080 Super as a hand-me-down.
 

Demorthus

Distinguished
Mar 2, 2014
Hi, I'm mostly going to write this just to add some context as to the situation & address a few obvious things just in general related to Cyberpunk 2077.

First, bugs: understand that despite this game having officially seen its trailer ~7 years ago, CDPR was known for The Witcher series. Recall that The Witcher 3 did not have a "pretty" launch either. In fact, it was worse: the game would hard crash, and it took weeks of patching until it became what would eventually be the 'crown jewel'. Take that TW3 reference and apply it here, because it's not simply an "excuse", it's an explanation: although some may have encountered odd things during their sessions, it has in no way been as bad as TW3 was.

Another point of contention is the following: TW3 and Cyberpunk share the REDengine; however, you have to understand that the two worlds are entirely different. One is a third-person open-world RPG... The second? An open-world RPG with vehicles (many, btw), melee weapons, ranged firearms, etc. In reality, it's taking what a small studio had a track record with and jumping in a truly new direction. New everything. To expect zero issues is quite unrealistic.

Here's why. Though the game obviously does need further optimization and patches (I won't deny that), it didn't come from a studio with the size or scope of a Rockstar or a Blizzard. Yet here you are in 2020, where the hardware landscape has changed enormously since work originally began, or even since the last Witcher DLC was finished. Fast forward to the middle of development and you're in a reality where Intel is no longer the "king" of gaming CPUs (and, more importantly, quad-core CPUs are no longer the limit for your player base), and you have Nvidia grasping at your throat to add ray tracing, a technology not meant for games in the first place. Test it at max settings versus off, and don't call it game-changing when the only real difference you see is your framerate dropping like a rock in a pond. Are there exceptions? Sure, Minecraft is perhaps one (maybe the only one, even) where RT is visually noticeable. In this game, however, I've seen 3D artists run live polls, just for fun, to see if people can simply identify which picture has RTX on and which has it off. The result: for CP2077, most cannot tell the difference whatsoever, and that's comparing still frames, let alone when you're actively walking, running, driving, etc.

I made reference to quad cores and ray tracing for a reason. Take every configuration of every machine belonging to the people reading this thread, or who have posted their specs in previous posts, and try to predict the variables of what can go wrong at any given second. Seems impossible, doesn't it? Likewise, that's why you must be patient and not just cry out in anger, when #1, this is a studio that has already issued several patches (you don't get that from others like Blizzard, who'll wait a week or whenever they feel like it to fix something); CDPR actually cares about you all. And #2, you own a game that will receive free DLC (insert another title/publisher that will never do such a thing).

There are so many things that people willfully decide to be ignorant about because it's easier. It's easier to be mad when something doesn't go the way you want than to pause, analyze the possible issues, and submit feedback, or even just comprehend the base from which it all stems. And in this case, there are many.
 

Demorthus

Distinguished
Mar 2, 2014
Despite everyone's whining about …
  1. Cyberpunk pre-release performance
  2. Your article being "clickbait" because the original performance comparisons were based on pre-release code
  3. Cyberpunk post-release performance
  4. CDPR making an awful game because it doesn't run well on my (insert GPU here)
  5. Nvidia making horrible GPUs because Cyberpunk doesn't run well on my (insert GPU here)
… I just want to say thanks for all the work you are putting in answering everyone's criticisms and updating the article with useful relevant information. Thank you for trying to include hardware combinations that are still relevant to people and even moving stuff around to get it done.

Indeed. It just seems like the most convenient thing to do these days. Yet it makes the perfect smokescreen for the failures of titles like Anthem, or the flaws in Watch Dogs Legion, lol.
 

MorganPike

Prominent
Jan 8, 2020
There are so many things that people willfully decide to be ignorant about because it's easier. It's easier to be mad when something doesn't go the way you want than to pause, analyze the possible issues, and submit feedback, or even just comprehend the base from which it all stems.

Quite a post there dude! LOL

I guess I may have missed it but... who's angry? All the posts that I've read are just discussing the issues. I've heard maybe one person get a little out of control and that typically happens when someone has let their expectations get the better of them. In those cases just give them a day or two and they typically calm down.

I'm pretty sure the handful of complainers aren't hurting Cyberpunk sales.
 
That's the team usually responsible for the screw ups in the first place! XD
Normally it's the publisher that's to blame ... but CDPR is its own publisher so that doesn't work. Simply put: It wanted to ship in 2020 (reference to the Cyberpunk 2020 RPG), and hit the Christmas season. It's out now, for better or worse, and now the next steps involve polishing and fixing things. And really, I don't think trying to say "CDPR is small, give them a break" really holds water these days. How many people work for the studio? How much money did it make off The Witcher 3? Cyberpunk 2077 is a huge and ambitious triple-A title, not some indie pet project. It's nobody's fault other than CDPR that there were issues.
 
@JarredWaltonGPU
Is not PR part of the publisher side of things?
Sure, but if you want to look at it that way, so is the software developer side of things and the artists and HR, etc.

What I’m saying is that usually it’s the accounting side of the publishers that push for releasing buggy games early. They run the numbers and say, “If we ship a buggy game in November or December, even if it scores lower with critics and the public, we’ll sell 50 percent more copies.”

I think that’s what happened here. Accountants and executives looked at where the game was, looked at company financials, and determined it was better to ship now than to spend more time polishing the game.

Because CDPR is both the developers and publishers of CP77, it’s more difficult to say who exactly should be blamed. I’m certain some people at the company wanted to delay and spend more time fixing things. They did not win the argument.
 

CodgerFace

Prominent
Dec 14, 2020
Hey Jarred!

(Your pal, CodgerFace, from the Max PC and PCG disqus comments)

Just hopping in to say thanks for the huge effort on this analysis of CP2077. That's a lot of data 👀

Oh, also: what was the overall graphics setting used during the 'RT Ultra' testing? In other words, you had 'Medium' and 'Ultra' above that without RT, so I was just curious which overall preset was used with RT Ultra at each resolution.

Thanks again! Hope you're doing well. Seems like it 👍
 
@JarredWaltonGPU
I see... Thanks.

Wow. 7-8 years in development... and stocks also fell.
It was either launch now, or the company loses even more money in marketing and development costs. The execs probably aren't as 'well established' as certain ones over at Electronic Arts or Activision-Blizzard either.
Hey CodgerFace!

So, RT Ultra is its own preset. It's everything from Ultra, plus RT Shadows, RT Reflections, and RT Lighting at Ultra. (RT Medium is RT Shadows and RT Lighting at Medium, but no RT Reflections.) I'm working on a settings analysis piece, but I can say this right now:
Turn off Chromatic Aberration. It sucks; it makes the whole image way too blurry! Also, the two Volumetric settings (mostly Fog, though) don't seem to do much visually and can be turned down for a 10-15 percent performance boost. And RT Shadows isn't really needed, nor is RT Lighting at Ultra, so RT Lighting at Medium with regular shadows is a decent compromise.
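Boiled down to a quick summary (a hypothetical Python structure of my own, not the game's actual settings file or exact option names), that advice looks like this:

suggested_tweaks = {
    "Chromatic Aberration": "Off",       # makes the whole image too blurry
    "Volumetric Fog": "turn down",       # ~10-15% faster, little visual difference
    "Volumetric Clouds": "turn down",    # the other volumetric setting
    "RT Shadows": "Off",                 # regular shadows are fine
    "RT Lighting": "Medium",             # decent compromise vs. Ultra
}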
 
