News Cyberpunk 2077 System Requirements: Ray Tracing Recs Revealed

Admin

Administrator
Staff member
Nov 21, 2018
19,857
858
2,860
The Cyberpunk 2077 system requirements are quite tame, but the baseline hardware won't support ray tracing, DLSS, or other high-tech visuals. Here's the hardware you'll really want to make the game look its best when it launches on November 19.

Cyberpunk 2077 System Requirements: Read more
 
Last edited by a moderator:

King_V

Illustrious
Ambassador
What sort of experience will this deck get you? CDPR doesn't say, so it might be 720p minimum quality at 30 fps, or it might be 1080p low quality at 60 fps. If we were to hazard a guess, it's closer to the former than the latter.

and

The required/recommended hardware listings are kind of pointless anyway, since they don't tell me what I can expect.

For all I know the required hardware is to run the game at low settings at 1024x768 to get 30FPS.

These are the things that drive me nuts about minimum and recommended requirements - at what resolution and frame rate? It may be pretty egregious here, but it seems like most games don't specify.

Absolutely maddening!
 
Ah yes, nothing like trusting your brand new shiny Ryzen 3600, B550 board, and RTX 2060 to an Apevia-branded, Andyson-built unit with no verification on quality. What can go wrong, lol.
There's a certain level of quality that has to be reached to hit 80 Plus Gold. Plus, the PSU I originally selected is no longer available without spending $100. The reality is that the 'minimum for ray tracing' build will pull about 300W peak from the PSU. For that level of power, any 80 Plus Bronze or higher PSU will suffice. If you want to try loading up 800W on the Apevia? Yeah, that's likely asking for trouble. But basic efficiency and power requirements for a modest PC mean anything should work.

Let me look around a bit more and see if I can find a better alternative. The PSU price spikes are painful, though! I just want a 550W or higher 80 Plus Bronze PSU for $50. They were readily available a year ago, but not now.

Ah, here's a Thermaltake 600W Gold for $70. I don't know if it's really better than the Apevia (might even be the same ODM), but the 600W rating is much more believable. https://www.newegg.com/p/N82E16817153395
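The headroom reasoning above can be sketched as a quick back-of-the-envelope check. The 300W peak figure and the ~85% efficiency value are rough assumptions from this discussion (80 Plus Bronze at roughly half load), not measurements:

```python
# Rough PSU headroom check for the 'minimum for ray tracing' build.
# Figures below are assumptions from the thread, not measured values.
system_peak_draw_w = 300        # estimated DC load at peak
psu_rating_w = 550              # the Bronze unit being shopped for
bronze_efficiency = 0.85        # ~85% around 50% load for 80 Plus Bronze

headroom_w = psu_rating_w - system_peak_draw_w
wall_draw_w = system_peak_draw_w / bronze_efficiency  # AC pulled from the outlet

print(f"Headroom: {headroom_w} W ({headroom_w / psu_rating_w:.0%} of rating)")
print(f"Approx. wall draw at peak: {wall_draw_w:.0f} W")
```

With roughly 45% of the rating left as headroom, any reputable Bronze-or-better unit in this wattage class should be comfortable for that build.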
 

Chung Leong

Reputable
Dec 6, 2019
493
193
4,860
The obvious question is how well The Witcher 3 performed under its minimum specs. The minimum GPU is a GeForce GTX 660 or Radeon HD 7870. Here are some numbers from a Polish site. The cards are all at or below minimum as far as I can tell.

w3m_low_1920.png


At lower resolution:

w3m_llow_1366.png
 

NP

Distinguished
Jan 8, 2015
74
15
18,535
Worst article I have read in ages from TH.

  • Suggesting you need an Nvidia GPU for the game to look good.
  • Saying we have no idea about the fps these setups would yield, and then pulling literally from thin air ideas about what fps these setups would yield.
  • Trying to sell new hardware, even with prices listed.
  • No actual new information about absolutely anything.
  • Don't even get me started with "processors from 2012"... It's been about a decade since we lived in an era where 8 years of processor technology actually meant something in GPU-intensive games (especially ones played at high resolutions).

Just a load of crap that may only make sense if you care about tech as such. If you focused on how much gaming performance your tech investment brings you, you'd realize that you constantly get less and less for the same money. A decade ago a top-notch GPU was at most €600; now it's €1,000 or more (and no, it's not about inflation; inflation would make that €600 card from 2010 the equivalent of roughly €690 in 2020).
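As a quick sanity check on that inflation claim, the implied annual rate can be backed out of the €600 (2010) to €690 (2020) figures. The prices are the poster's numbers, not official CPI data:

```python
# Back out the annual inflation rate implied by 600 EUR (2010) -> 690 EUR (2020).
price_2010 = 600.0
price_2020 = 690.0
years = 10

annual_rate = (price_2020 / price_2010) ** (1 / years) - 1
print(f"Implied annual inflation: {annual_rate:.2%}")

# Compare with a 1000 EUR flagship to show the gap isn't inflation alone.
flagship_2020 = 1000.0
print(f"Increase beyond inflation: {flagship_2020 / price_2020:.2f}x")
```

That works out to roughly 1.4% per year, so a €1,000 flagship is about a 45% real increase over the inflation-adjusted €690.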

So: stop saying some recommended specs are wrong if you have nothing concrete to back your claims up with. Stop suggesting you need an Nvidia RTX card to enjoy games. Stop blowing hot air about new tech, when purchasing new tech gives you increasingly diminishing returns with every passing year. Thanks.
 
I've been testing and reviewing games for a long time. I am confident that a lot of people are excited to play Cyberpunk 2077 on PC, and many people are even excited to have a chance to put their RTX cards to good use for a change. Having a "recommended" PC that can't enable ray tracing effects in a game where they'll actually matter? That's pretty bad.

The minimum spec hardware listed for Cyberpunk 2077 would have issues at times maintaining 60 fps with The Witcher 3 (especially in large cities) -- mostly because of the older CPUs listed. 30-45 fps, though, sure -- no problem! But there's no way Cyberpunk 2077 is less demanding than The Witcher 3, so no, I don't trust the specs from CD Projekt Red at all.

I've seen good system requirements from other companies. Doom Eternal gave specs, settings, and performance targets for both min and recommended hardware. The Division 2 gave low, medium, high, 1440p, and 4K recommendations in its system requirements, including fps targets as well (30 for low, 60 for the others). So, when CDPR gives no details other than min and recommended, and the recommended specs look appropriate for medium quality? Yup, I call bunk. Check back in two months and we'll have benchmarks, and you'll discover exactly the level of performance you get from a variety of GPUs and CPUs.
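The kind of spec listing described above -- every tier stating its settings, resolution, and fps target -- could be expressed as simple structured data. The tiers, GPU names, and numbers below are illustrative placeholders, not actual Division 2 or Cyberpunk 2077 figures:

```python
# Illustrative shape of a useful system-requirements listing: each tier
# explicitly promises a settings level, resolution, and frame rate target.
spec_tiers = [
    {"tier": "Minimum",     "gpu": "Example GPU A", "settings": "Low",
     "resolution": "1920x1080", "target_fps": 30},
    {"tier": "Recommended", "gpu": "Example GPU B", "settings": "High",
     "resolution": "1920x1080", "target_fps": 60},
    {"tier": "4K Ultra",    "gpu": "Example GPU C", "settings": "Ultra",
     "resolution": "3840x2160", "target_fps": 60},
]

for t in spec_tiers:
    print(f'{t["tier"]}: {t["gpu"]} -> {t["settings"]} '
          f'@ {t["resolution"]}, {t["target_fps"]} fps')
```

Min-and-recommended lists with no resolution or fps column leave out exactly the fields that make the table actionable.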

As far as Nvidia, they're the only ray tracing game in town right now. I mention AMD's upcoming Big Navi multiple times and suggest waiting to see how it performs. But it won't support DLSS -- that's Nvidia exclusive tech, and as much as AMD likes to pretend it doesn't really matter, the fact is that it actually does matter quite a bit for anyone wanting smooth framerates at 4K on modest hardware.

FWIW, the original title was "Cyberpunk 2077 System Requirements: I Don't Trust CD Projekt Red" -- but that didn't fit the Google headline length requirements so it got tweaked.
 

Hi Jarred. The article didn't read like trying to sell me hardware, but it did come off a bit strong. I guess you probably knew the words "don't trust CDPR" would cause some recoil.

I got the point that the recommended hardware still doesn't include some interesting features that people may care about, but it seems at least strange to call the recommendation bunk and subsequently acknowledge that we can't possibly know what their vision for the game is. It's kinda their call here.

Thanks for the thorough analysis on the features that we might be missing on the game, though, that was a good read and maybe something CDPR could've mentioned in their requirements. Not that we needed any more reason to want a new GPU after what NVidia has thrown at us, but it might just tip the scale on a guilty purchase.
 

NP

Distinguished
Jan 8, 2015
74
15
18,535
Having a "recommended" PC that can't enable ray tracing effects in a game where they'll actually matter? That's pretty bad.

I don't think that is pretty bad. It's just CDPR recommending a gaming experience that also works on hardware without ray tracing capabilities. I mean, that sounds reasonable, right?

And yes, it's also a sensible piece of information for CDPR to divulge. Their primary concern in this communication is to create demand for their game; helping Nvidia sell its hardware is a secondary consideration.

If those priorities were the other way around, now that would be "pretty bad". Not the allegedly "too low" recommended specs.

It is only in some quite specific circles where it would even make sense to say: "Yeah, we cannot recommend this otherwise perfectly capable hardware for playing this ray tracing title; it's not an Nvidia RTX."

That said, ray tracing looks lovely, and I may be able to enjoy it soon. I'll have to see that Big Navi card turn up before deciding.
 
  • Like
Reactions: Roland Of Gilead
If you look at CDPR's history of recommended hardware, as well as changes to the graphics engines to hit certain targets (see: "Witcher 3 graphics downgrade"), and then you look at the screenshots and videos being shown for Cyberpunk 2077 along with the recommended system requirements, there's definitely precedent for me to say "don't trust CDPR." Or really, "Don't trust CDPR's system requirements, because they're bunk," but that was too long a headline.

I think it will be a great game, and it will probably still look okay even at medium quality and 1080p. I also think the recommended hardware will suffice for >30 fps at 1080p medium, but not 60 fps. CDPR should have known this, and should have known better, but instead of giving several clear expectations for performance, it went with the super nebulous route and listed older hardware with no context as to what it should manage.

So many recent games can't do 1080p high at 60 fps on something like a GTX 1060, and they don't even look as good or as detailed as Cyberpunk 2077. Large open world games with state-of-the-art graphics, though? On a PS4 or Xbox One console, they'll have to dumb things down a lot just for 30-ish fps. And when you get into that console mindset, it's easy to say, "Oh, 30 fps is pretty good!" But it's not, not on PC anyway. Games running at 30 fps suck IMO.
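The "30 fps sucks" point is easy to see in frame times -- a quick illustrative calculation, nothing game-specific:

```python
# Frame time budget at common frame rates: each frame at 30 fps
# stays on screen twice as long as one at 60 fps.
for fps in (30, 45, 60, 144):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```

At 30 fps every frame lingers for over 33 ms, which is why the difference is so visible in motion on a PC monitor.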
 

NP

Distinguished
Jan 8, 2015
74
15
18,535
I guess the author clearly didn't see this blog post in which CDPR clarifies what the minimum and recommended specs are supposed to get you: https://support.cdprojektred.com/en...issue/1556/cyberpunk-2077-system-requirements

Shh... maybe that is just an evil trick to make us not buy Ampere? Maybe that is why they say "1080p", but actually mean "720p HD-ready"?

The cleverest of us cannot be fooled by this ridiculous trickery, allowing us to draw the logical conclusion: "Buy now Ampere from your local graphics card scalper (when available)."
 
I did actually see that before, but forgot about it due to RTX 30-series launch crunch. It says 1080p low and 1080p high -- but critically, no fps! That last part really matters a lot to many PC gamers. I'll tweak the article, but fundamentally I still disagree with the lack of detail, or even mentioning ray tracing. Which makes me wonder if the game will actually ship with RT and DLSS enabled, or will that get cut as well, to be added in at a future date?
 

NP

Distinguished
Jan 8, 2015
74
15
18,535

So, which recent games would have to be played at below 60fps on 1060? Or are you talking about 1080p scenarios with everything ultra dynamically reflecting and shadow-shading shadows ultra? Or could it be something like mostly "high, some shadow-thingies medium"?

Quite similar to how I think many Cyberpunk players will set their in-game graphics details, either to enjoy even higher frame rates, or then just higher resolutions, DLSS2.0 notwithstanding.
 

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890
Bleh, what is this mess of an article? This had nothing to do with Projekt Red knowing or not knowing its game's minimum or recommended requirements. I figured you guys had some sort of inside information on running the game and had tested it with several existing cards to get a real-world gauge of performance. My rig is based on a Ryzen 3900 and Nvidia 2080 Ti, so I'm not worried, but I was hoping for some real-world insight, not a bunch of bull.
 
  • Like
Reactions: Soaptrail and NP

Giroro

Splendid
I won't/can't watch some random youtube video to find out what the minimum specs are.

Edit: My cosmetic filter to block the horrible unrelated autoplaying videos seems to also be blocking bulleted lists now - but I still refuse to disable that.
 
  • Like
Reactions: Roland Of Gilead

King_V

Illustrious
Ambassador
CDPR simply said "high".

So, not ultra, but whatever "high" is. 60 fps on a 1060 at 1920x1080? They didn't say 60 fps. I would like to think they'd mean 60 fps, but they didn't say.

And, I have to agree that it is REALLY strange that the 1060 6GB meets the recommended requirements, but somehow, an RX 580 8GB does not, and an R9 Fury level card is required? That seems weird.
 
  • Like
Reactions: JarredWaltonGPU
To name a few:

Borderlands 3
The Division 2
Horizon Zero Dawn (comes close in the benchmark, averages 45-50 fps in the actual game)
Metro Exodus
Microsoft Flight Simulator
Red Dead Redemption 2

Some of those don't just come up a bit short, but far short of 60 fps. Red Dead Redemption 2 is an open world game with demanding graphics that still don't look as complex as what we're seeing in the Cyberpunk trailers, and it averaged just 28 fps at 1080p ultra, and ~45 fps at 1080p high.
 

alan.campbell99

Honorable
Sep 11, 2017
32
3
10,545
Hmm. I have a Ryzen 3700X on an Aorus Elite X570 with 32GB and an RTX 2080 Super [that I upgraded to relatively recently from a 1080]. I play wherever possible at 1440p. I'm not all that keen on dropping for an Ampere card right now [if I can even get hold of one?] so I wonder how well this game will run?
 
With RT and DLSS, my guess is performance will be somewhat similar to Control with RT and DLSS enabled. So 1440p with DLSS on a 2080 Super should be fine.
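For context on why DLSS helps so much here: DLSS 2.0 renders at a reduced internal resolution and upscales to the output. A quick sketch of the internal resolutions at 1440p, using the commonly cited DLSS 2.0 scale factors (per axis):

```python
# Internal render resolution for DLSS 2.0 modes at a 2560x1440 output.
# Per-axis scale factors as commonly cited for DLSS 2.0.
output_w, output_h = 2560, 1440
dlss_modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for mode, scale in dlss_modes.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{mode}: renders at {w}x{h}, upscaled to {output_w}x{output_h}")
```

Quality mode at 1440p renders roughly a 960p image internally, which is why a 2080 Super can keep frame rates up with ray tracing on.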