Cyberpunk 2077 System Requirements: Ray Tracing Recs Revealed

You missed the "1080p Ultra," "1440p Ultra," and "4K Ray Tracing" headings throughout the article, huh?
They wrote that two months ago when the article was first posted, at which point only minimum and recommended requirements were listed. The article was updated yesterday to include new details, but what they said still holds true, since the requirements continue to say nothing about what sort of framerates to expect.

People say this is AMD's first generation RT - but IMO that's irrelevant. Ray tracing has been available on RTX cards for a couple of years, so it's not like there wasn't a target to aim for - much like AMD delivered ~2x RDNA1 in one stroke. I doubt AMD would have aimed just for 2080Ti performance - so I'd imagine drivers or other optimizations will draw quite a bit more performance out of the hardware.
The plans for a new or significantly-updated architecture will typically be laid out years in advance. So, it's likely that AMD designed this raytracing hardware a couple years back, perhaps around the release of the RTX 20 series, if not earlier, and it's possible that they were only targeting 2080 Ti-like raytracing performance for these cards.

Of course, it's also possible that the hardware has more to deliver, but existing games featuring RT-effects are optimized for Nvidia's RT architecture, since that's all game developers had access to until now. So, just as there were no games featuring raytracing at the launch of the RTX 20 series, it could be months before we have games specifically optimized with AMD's RT hardware in mind.
 
The weak RT performance of Navi 21 is pretty clear at this point. Any game or test that makes remotely significant use of RT has a substantial deficit (relative to the same game without RT). There are multiple games where RX 6800 XT goes from tying or even leading RTX 3080 performance to being 15-25% slower with RT. For example:

Shadow of the Tomb Raider -- A 16 point swing:
Without RT: 6800 XT is 3-4% faster at 1080p/1440p, 3% slower at 4K
With RT: 6800 XT is 13-14% slower at 1080p/1440p, 16% slower at 4K

Metro Exodus -- A 19 point swing:
Without RT: 6800 XT is 1-6% slower at 1080p/1440p/4K
With RT: 6800 XT is 20-25% slower at 1080p/1440p/4K
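The "point swing" figures above are just the difference between the 6800 XT's relative standing (vs. the RTX 3080) without RT and with RT. A quick sketch of the arithmetic, using the approximate percentages quoted above (positive values mean the 6800 XT leads):

```python
def point_swing(rel_without_rt: float, rel_with_rt: float) -> float:
    """Difference, in percentage points, between the 6800 XT's standing
    vs. the RTX 3080 without RT and with RT (positive = 6800 XT ahead)."""
    return rel_without_rt - rel_with_rt

# Shadow of the Tomb Raider at 1440p: ~+3% without RT, ~-13% with RT
print(point_swing(3.0, -13.0))   # -> 16.0 points

# Metro Exodus: ~-3% without RT, ~-22% with RT
print(point_swing(-3.0, -22.0))  # -> 19.0 points
```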

The only games with RT that I tested that didn't show AMD trailing by a significant amount are Dirt 5 with a DXR beta patch (so definitely not final, and AMD is quite a bit faster without RT -- plus the RT effects are very limited) and Watch Dogs Legion, where AMD's RT isn't doing a bunch of the calculations for some reason (patch coming in December to fix this, apparently).

Anyway, Nvidia is also being extremely vague about exactly how fast its RT cores are in a variety of situations. We know it's doing ray/triangle intersections for ray tracing, but we're not totally sure how fast it is on those intersections. We also don't have any information on how many ray/bounding box intersections it can do relative to ray/triangle -- probably a lot more. And there are other elements Nvidia hasn't fully discussed about the RT cores. Basically (paraphrasing what I've heard from Nvidia), it doesn't want to explicitly give AMD or Intel any data on how Nvidia has balanced its RT core performance for various workloads. Yes, AMD/Intel can analyze the hardware and work backward from there, but that takes more time.

I think the best-case result for AMD right now is that one AMD Ray Accelerator is able to match one Turing RT core. However, the worst-case result is going to be more like one Turing RT core being perhaps 25% faster than AMD's current Ray Accelerators. And an Ampere RT core is potentially 80-90% faster than a Ray Accelerator. Actually, it's even worse if we include clock speeds, because AMD is hitting 2.2-2.3GHz compared to about 1.8-1.9GHz on Nvidia. So, even with 5% more Ray Accelerators (72 vs 68) and 20% higher clocks, the RTX 2080 Ti still beats the 6800 XT in ray tracing performance. And with 57% more Ray Accelerators (72 vs 46) and 20% higher clocks, the RX 6800 XT can't keep up with the RTX 3070.
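Those unit counts and clocks imply a minimum per-core speed advantage for Nvidia. A back-of-the-envelope sketch, assuming RT throughput scales roughly with units × clock (counts and approximate clocks are from the comparison above; this ignores memory, cache, and shader interactions):

```python
def implied_per_unit_ratio(units_a: int, clock_a: float,
                           units_b: int, clock_b: float) -> float:
    """If GPU A has the raw units*clock advantage yet still trails GPU B,
    B's per-unit RT rate must exceed this ratio (A's raw advantage)."""
    return (units_a * clock_a) / (units_b * clock_b)

# 6800 XT (72 Ray Accelerators @ ~2.25GHz) vs 2080 Ti (68 RT cores @ ~1.85GHz)
print(implied_per_unit_ratio(72, 2.25, 68, 1.85))  # ~1.29

# 6800 XT vs 3070 (46 RT cores @ ~1.85GHz)
print(implied_per_unit_ratio(72, 2.25, 46, 1.85))  # ~1.90
```

Under these assumptions, a Turing RT core would need to be at least ~29% faster per unit than a Ray Accelerator, and an Ampere core at least ~90% faster, which lines up with the 80-90% guess above.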

Bottom line is, I'm quite confident that the more RT effects a game uses, the worse Navi 21 will do relative to Ampere and even Turing. Reading between the lines of AMD's materials, I think AMD's FidelityFX ray tracing efforts are going to focus on finding ways to cast fewer rays to lessen the impact. Conversely, Nvidia will try to get devs to cast more rays to increase the impact. The trick for AMD will be improving image quality in games while casting fewer rays.
 

NP

"and then pulling literally from thin air ideas about what fps these setups would yield. "

Can you edit this, please? You misused the word "literally." We all know the ideas didn't come from thin air. Thanks in advance.

No, I did not misuse it at all. I really mean what I say here. Certain ideas do emerge from air, literally. Of course, you are welcome to try proving otherwise with arguments, not assertions.

So can you edit your criticism? Some may think that no ideas emerge from air, but they are wrong. And more importantly, the way you use "we all" is wrong. You have no idea who "we all" are, let alone what we all do or don't know.
 

Mibo_49

I am reading all kinds of articles, forums, watching videos, and getting a frickin headache. I can't get responses directly from developers because I am a mere mortal.

Bottom line: I want to play AC Valhalla and Cyberpunk 2077 on my PC as opposed to console. I think my specs are good to game at 1440p, BUT I am not sure about storage. They make it seem like if you don't have an SSD you may not be able to play. Well, my drive is both, meaning my hard drive is a hybrid.

My specs:
  • AMD Ryzen 7 2700x
  • RTX 2070 (not Super)
  • 16 GB RAM
  • 2TB Seagate FireCuda
  • Windows 10
  • Using a 34" ultrawide display
Can someone with experience and knowledge of these things help a casual gamer out, please? If I need to upgrade I will but I don't want to if it's not necessary. Thanks.
 
AC Valhalla is more optimized for AMD GPUs right now (at least 6800 series), and for 1440p you're probably looking at ~high settings to get a decent experience. For Cyberpunk 2077, I suspect medium ray tracing with DLSS enabled will give you a decent result.

As for your storage, the FireCuda is better than a pure HDD, but not by that much, particularly for modern games. Valhalla and Cyberpunk are both in the 80GB range, and your SSHD has an 8GB NAND 'cache' to try to help out with random reads/writes. It can't possibly store everything that the game will use in that small of a cache. But how much will it matter? Tough to say, but if you're not looking for the fastest possible load times you should be okay.
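To put rough numbers on that (the 8GB NAND cache figure is from the FireCuda SSHD spec; ~80GB installs as noted above), the cache can only ever hold a small slice of the game at once:

```python
cache_gb = 8    # FireCuda SSHD NAND cache size
game_gb = 80    # approximate install size of Valhalla / Cyberpunk 2077

coverage = cache_gb / game_gb
print(f"NAND cache covers at most {coverage:.0%} of the install")  # -> 10%
```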

Basically, don't try maxing out settings at 1440p and expect 60+ fps.
 

xravenxdota

There's no reason to go Intel at the moment. It's more expensive than AMD's 5xxx series, and the 5xxx CPUs are still in stock mainly because of price. The 5600X is double the price of my 2600, so I may look to snag a 3xxx CPU if the price is right.