News Cyberpunk 2077 System Requirements: What Hardware Do You Need?

Page 2 - Tom's Hardware community discussion
Yeah, for AMD gaming right now I wouldn't recommend anything less than a 3700X, and for high-end gaming a 3900X would be preferable. I'm sure the 3950X would be the best, but not everyone wants or needs a $700 16-core CPU.
The additional cores and extra chiplet actually have a negative impact on latency and gaming performance in a lot of games. Ryzen 9 3900X is almost universally faster than the Ryzen 9 3950X in the gaming tests I did back at launch, and the few times I've looked since then it still usually loses -- not by a lot, but often the order of performance goes:

Ryzen 7 3800X is slightly faster than...
Ryzen 9 3900X is basically tied with (wins some, loses some)...
Ryzen 7 3700X is a few percent faster than...
Ryzen 9 3950X is barely faster than...
Ryzen 5 3600
 

Wow, so all you really need is a Ryzen 3600? I would have thought much higher.
 
This is data from last year (when I was at PC Gamer), but here are the overall average fps at 1080p ultra with RTX 2080 Ti FE:

CPU            | Avg FPS (9 games) | 99th Percentile
Ryzen 9 3950X  | 119.8             | 82.4
Ryzen 9 3900X  | 120.7             | 82.2
Ryzen 7 3700X  | 120.9             | 82.9
Ryzen 5 3600X  | 119.5             | 81.1
Ryzen 5 3600   | 118.4             | 79.7
Ryzen 7 2700X  | 111.4             | 75.3

So I guess technically the 3900X and 3700X are tied (with a very slight lead for the 3700X), and the 3950X is basically tied with the 3600X. Of course, different games and test settings would show some variation, but that was with nine different games:

The Division 2 (DX12 1080p Ultra FPS)
Far Cry 5 (DX11 1080p Ultra FPS)
Hitman 2 (DX12 1080p Max FPS)
Metro Exodus (DX12 1080p Ultra FPS)
Assassin's Creed Odyssey (DX11 1080p Ultra FPS)
Strange Brigade (DX12 1080p Ultra FPS)
Middle-Earth Shadow of War (DX11 1080p Ultra FPS)
Shadow of the Tomb Raider (DX12 1080p Ultra FPS)
Total War Warhammer 2 (DX11 1080p Ultra FPS)
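For reference, the "Avg FPS (9 games)" row is just the mean of the nine per-game averages, and a 99th percentile fps figure describes the slow tail of the frame times (the fps that 99% of frames exceed). A minimal sketch of both calculations, using made-up numbers rather than the original test data:

```python
# Illustrative only -- the per-game and frame-time numbers below are
# hypothetical, not the figures from the original benchmark runs.

def average_fps(per_game_fps):
    """Overall average across a game suite: a plain mean."""
    return sum(per_game_fps) / len(per_game_fps)

def percentile_fps(frame_times_ms, pct=99):
    """Sort frame times, take the pct-th percentile frame time,
    and invert it into an fps figure for the slow tail."""
    times = sorted(frame_times_ms)
    idx = min(len(times) - 1, int(len(times) * pct / 100))
    return 1000.0 / times[idx]

# Nine hypothetical per-game averages for one CPU:
games = [142.0, 118.5, 131.2, 96.4, 104.8, 155.0, 121.7, 109.3, 99.9]
print(round(average_fps(games), 1))

# 99 fast frames (8.3 ms) and one slow frame (16.7 ms):
print(round(percentile_fps([8.3] * 99 + [16.7]), 1))
```

The second figure shows why percentile numbers sit well below the averages: a handful of slow frames dominates the tail even when nearly every frame is fast.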
 

So if I were to clock my 3900X at 4.5GHz with all cores operating it would perform the same as a 3600X?
 
No. Those are all stock performance figures, and I think the Ryzen 5 3600 by default runs at around 4.1-4.2GHz in gaming workloads. The Ryzen 9 3900X also runs at around 4.1-4.2GHz, maybe a bit faster. Basically, with more cores it doesn't normally hit its maximum 4.6GHz boost clock, while the 3600 comes much closer to its 4.2GHz maximum. That's my recollection from watching clock speeds during gaming, but I don't have detailed logs of everything at this point.

Anyway, if you clock both the 3900X and 3600 at the same speed, for most games the performance will be very similar. It's mostly the same with Intel -- if you run 10900K, 10700K, 10600K, 9900K, 9700K, 9600K all at 4.7GHz, the difference in gaming performance is going to be very small. And it will only get smaller if you start including 1440p or 4K gaming results.

There are always exceptions, of course -- Ashes of the Singularity would probably prefer the higher core and thread counts for example. However, there are also exceptions that go the other way. Some games will actually prefer no Hyper-Threading (or SMT), and some also prefer slightly fewer cores/threads than the 3900X/3950X offer. That will probably be less of a factor over time, but right now many games simply don't know what to do when there's more than about 20 CPU threads available.

I think it was Far Cry 5 where performance peaked on Ryzen 9 3900X, but then the 3950X was slower, Threadripper 3960X was slower still, and Threadripper 3970X was about 20-30% off the expected pace. But if you disabled half the cores, performance shot back up.
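The "disable half the cores" experiment described above can be approximated on Linux with CPU affinity instead of BIOS changes. A rough sketch, assuming Python's `os.sched_setaffinity` is available (the `half_core_mask` helper is mine, and none of this is tied to the original test setup):

```python
import os

def half_core_mask(logical_cpus):
    """CPU indices for the first half of the logical CPUs -- what you'd
    pin a game process to in order to emulate halving the core count."""
    return set(range(max(1, logical_cpus // 2)))

mask = half_core_mask(os.cpu_count() or 1)

# Applying the mask is Linux-only; intersect it with the CPUs this process
# is actually allowed to use so the sketch stays safe on restricted systems.
# pid 0 = the current process.
if hasattr(os, "sched_setaffinity"):
    allowed = os.sched_getaffinity(0)
    os.sched_setaffinity(0, (mask & allowed) or allowed)
```

Pinning a running game to half its cores this way is a quick test of whether a title is choking on high thread counts, without touching SMT settings in firmware.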
 

Honestly, all that chart tells me is that for the most part these CPUs all perform about the same -- who would really notice the difference between 111 and 121 fps? Hell, all of the Ryzen 3rd-gen parts are almost within margin of error of each other. They could very well swap positions all over the place depending on the motherboard, game, RAM, power fluctuations, etc. I know this is an average, so it will definitely vary depending on what you're doing, but I can see why people are on the "just get the Ryzen 5 3600 for gaming" CPU train -- at least prior to the R3 3100 and 3300X, which are crazy good contenders for the price in games that only use up to 8 threads.
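The gap looks even smaller expressed as frame times, since frame time is the reciprocal of fps and differences shrink as frame rates rise. A quick check with the 2700X and 3700X averages from the chart above:

```python
# Convert the chart's best and worst averages into per-frame terms.
def frame_time_ms(fps):
    return 1000.0 / fps

slow, fast = 111.4, 120.9  # Ryzen 7 2700X vs Ryzen 7 3700X, avg fps

delta_ms = frame_time_ms(slow) - frame_time_ms(fast)  # extra time per frame
pct_gap = (fast - slow) / slow * 100                  # fps gap in percent

print(round(delta_ms, 2), round(pct_gap, 1))
```

So the worst-to-best spread in the whole table is under a millisecond per frame, which backs up the margin-of-error reading.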
 
I remember reading a similar article about Doom Eternal's requirements and thought my system might fall short. I am running the game at 1440p in Ultra Nightmare at 60 frames per second (thanks in large part to my Titan X cards' 12GB of VRAM). Here's hoping they can handle Cyberpunk 2077 (even without all the ray-traced glory).
 
I don't understand the article. Most predictions place the minimum CPU as an AMD FX-series chip or an Intel i3.
I'm not saying to put one of them in a new build, but for most people there would be no need for a new build -- certainly not at the level recommended here.
 
Most likely none of these predictions will be accurate. Only the developers are likely to have a reasonable idea about what type of hardware the game will require, and even that could change depending on how optimizations go in the 4+ months before the game launches. It's difficult to predict things like CPU demand in an open-world game like this. And of course, some people might be fine running the game at 30fps on low settings, but for those seeking ultra settings at 60+fps, better hardware will likely be required.
 
It's not that difficult to predict. The Witcher 3 was made by the same people, and while the graphics engine behind Cyberpunk has evolved, I doubt the CPU demands will be much higher -- especially since it still needs to run on Xbox One and PlayStation 4. So if you have a CPU that was powerful enough to run The Witcher 3, it will certainly be able to run Cyberpunk 2077, though obviously not at ultra-high fps, and it will still need a good GPU.

As for the question about Core i3 or an FX-series CPU, yeah, both can probably run the game. But the FX series had a lot of issues, and it might struggle to maintain 30 fps in crowded city environments -- especially on something like the FX-6300. 60 fps in Night City will almost certainly require a 6-core Core i5 or Ryzen CPU, or 4-core/8-thread (i7-6700K) at a minimum. If you have a 4-core/4-thread CPU, don't be surprised when it causes some stuttering. Because that was the case with 4-core/4-thread CPUs in places like Novigrad in The Witcher 3, and Cyberpunk 2077 definitely isn't going to be easier on hardware than TW3.

Basically, running the game won't be that hard. Running it well is another matter entirely.
 
The Witcher 3 is arguably a rather different game though, even beyond the fact that it's over 5 years old at this point, and the developers will probably be targeting newer hardware.

The Witcher tends to be set in more open, wilderness areas with long view distances and vegetation, whereas the setting of Cyberpunk will mainly be far more urban. That might help Cyberpunk's performance in some ways, with shorter view distances and more angular surfaces potentially keeping graphics demands in check despite increased details, but there may also be more NPCs around at any given time, which could have an adverse effect on CPU performance. And it's really difficult to say what sort of changes they might have made to things like AI, and animations and such in the last half-decade.

And of course, being played from a first-person perspective, dips in performance are likely to be a lot more noticeable than when your viewpoint is pivoting around your character, especially when playing with responsive controls, like mouse input. The existing consoles might be able to get away with 30fps when using slower gamepad input, but that tends to not be great on the PC, and I suspect the PC requirements may target 60fps. And of course, the "recommended" specs on PC are likely to be for high settings, which will likely feature improved effects compared to what were put into games at the time of The Witcher 3's release. The fact that the game is coming out on existing consoles will likely keep performance in check to some degree, but it will still almost certainly be more demanding than the Witcher 3.
 
If the engine is as well optimized as the id Tech engine was in 2016, it will probably run on far-from-recommended CPUs, GPUs, and RAM amounts.

I remember that some sites listed the minimum requirements for DOOM 2016 as a Core i5, 8GB of RAM, and a GTX 980.

I thought, "I will never be able to play Doom with my old Core 2 Quad, my GT 740, and only 4GB of RAM!"

Wrong! I was able to play it at decent quality (mostly medium settings) at no less than 45-55 fps, with peaks above 60, at 4:3 stretched to widescreen rather than real 16:9.

Just by adding a GPU far less powerful than required (a GTX 1050), I was able to play at a constant 60+ fps at mostly high settings in 16:9. We all have hope.

What I think is that most powerful older systems, armed with a GTX 1080, will play it at 60 fps at decent quality with no problems -- BUT (I repeat, BUT) only if the engine is well optimized. I'm also fairly sure that at minimum settings and 720p it could be played even on a Ryzen 3 3200U laptop. Of course I'm being too speculative, and I'm probably just a wishful thinker (I made a simple rule of three: if DOOM 2016 runs better on my Ryzen laptop than on my Core 2 Quad with a dedicated GT 740, and I can play CP2077 on my Core 2 Quad system...).

If not, be prepared for CP2077 to be some kind of "gaming performance test," as Crysis was 13 years ago.
 
For Australians, those parts are almost $3700. Just to play a game? That's insane.
I can't find the Adata XPG Gammix drive, but 1TB seems to be around A$250-A$350 depending on brand.
1TB is not much storage unless the only thing you plan to run is this game and a couple of others.
Maybe you should add A$550-A$600 for a 10TB HDD?
What I found interesting is that the parts list didn't include a monitor, mouse, or keyboard. You'd probably want to allow at least another A$600-A$1200 depending on monitor size and resolution.
Hmmm... do you really want to throw that much at one game?
I think I'll pass.

I think you guys are good candidates for GeForce Now. Prices in Australia are downright ridiculous, so as long as latency isn't atrocious, that might be the best way to go.