News Cyberpunk 2077 RT Overdrive Path Tracing: Full Path Tracing, Fully Unnecessary

It would be worth having a look at Digital Foundry's coverage:
https://www.youtube.com/watch?v=I-ORt8313Og

Once you know what to look for - particularly global bounce lighting and reflections of lit objects - the differences between the rasterised GI in RT Psycho and the ray-traced GI in RT Overdrive become much clearer. For example, in the split-screen at 06:40, the raster GI solution leaves the main scene looking fairly OK (though with no bounce lighting on any surface not directly lit by the scene light), but its reflection is lit in completely the wrong colour. That's a general theme with raster vs. RT GI: if the raster GI probe differs from the as-lit scene (e.g. a high-temperature global light to match a sunny sky, while the local environment is lit by a low-temperature point light), RT GI will produce a correct result, but raster GI will end up completely wrong.
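To make that probe-mismatch failure concrete, here's a toy one-bounce sketch (all surface and light values are invented, and no real engine shades whole surfaces as single RGB tuples like this). The baked probe carries whatever light existed at bake time, while the traced bounce samples the light that's actually on:

```python
# Toy 1-bounce GI comparison. Colours are (r, g, b) tuples in [0, 1].
# Everything here is illustrative, not taken from any real engine.

SUNNY_PROBE = (1.0, 0.9, 0.7)   # warm ambient baked under a sunny sky
POINT_LIGHT = (0.4, 0.5, 1.0)   # cool local light actually lighting the scene
ALBEDO      = (0.8, 0.8, 0.8)   # the surface we're shading

def raster_gi(albedo, probe):
    # Probe lookup: indirect light is whatever was baked, right or wrong.
    return tuple(a * p for a, p in zip(albedo, probe))

def rt_gi(albedo, light, bounce_albedo=(0.8, 0.8, 0.8)):
    # Traced bounce: the live light reflects off a nearby surface, then
    # hits ours, so the result tracks whatever light is on right now.
    bounced = tuple(l * b for l, b in zip(light, bounce_albedo))
    return tuple(a * b for a, b in zip(albedo, bounced))

print("raster GI:", raster_gi(ALBEDO, SUNNY_PROBE))
print("RT GI:    ", rt_gi(ALBEDO, POINT_LIGHT))
```

With a warm probe baked for sunlight, the raster result comes out red-dominant even though the only active light is blue-dominant; the traced bounce follows the live light instead.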
 
most gamers would likely prefer saving money on the cost of a new graphics card rather than helping a corporation save on game development costs....
Despite Jarred's subtle anti-capitalist slam, the fact remains that consumers ultimately bear the costs of game development, not corporations. I also question the logic of saying ray tracing isn't necessary because of its impact on "muh framez". The average Hollywood CGI film is vastly more realistic than any game, despite running at a mere 30fps. Why? Better physics; better ray tracing. If you can already generate more than 60-75 fps, the additional horsepower is better spent on improved rendering, not more frames (I exclude the potential but unlikely case of an advantage for professional eSports gamers).
 
  • Like
Reactions: KyaraM
Competition will not let ray tracing remain just a development cost saving. Time spent on pre-baked lighting scales with scene complexity. Freed from worrying about lighting not looking right, developers will be driven by competition to reallocate those resources into adding detail to the game. Perhaps we could finally see a big leap in physics: I'm guessing realistic destructible environments will be a lot easier to add if ray tracing can take care of lighting all that flying and falling debris instead of artists. More variety in weather will finally look good without adding a lot of cost, too. And as we add more vertices to characters, ray tracing is almost a must for realistic faces to look good.
 
  • Like
Reactions: geogan and KyaraM
The built-in benchmark does a poor job of showing the difference between hybrid rendering and path tracing. You can easily notice the difference at the beginning of the game, especially when you get Dexter's quest (i.e., get in his car and talk to him about the bot quest). It's a night-and-day difference.

In addition, Cyberpunk's Overdrive mode utilises four of Nvidia's algorithms, which is why it runs better on their cards. Your stance on taking this game's performance on other cards with a dose of scepticism is correct.
 
  • Like
Reactions: KyaraM and prtskg
This feels like a jumble of misinformation, put together through data manipulation and selective data to guide the reader towards the conclusion the author wants, all while completely missing the point of WHY this mode is even available in an already pretty demanding game. Guess anything that is not pure raster is not allowed to exist for some people.
 
Ray Tracing continues to be a solution looking for a problem. It's a waste of silicon that is not at all worth the relentless RTX-era price hikes.


Doesn't Nvidia own Digital Foundry?
If not outright, then through their massive ad-buys and exclusive access that built DF and are needed to keep DF afloat? Maybe something has changed lately, but they definitely used to be at the bottom of the classic YouTube trap: Shilling hard to try and win ad money, affiliate commissions, and "exclusive access".

Don't get me wrong, they are important pioneers and mouthpieces for the direct-to-customer influencer-marketing industry. It's very powerful to be able to make an ad that people are more likely to believe than a NASA scientist... but I trust DF's opinion about Nvidia in the same way I would trust the Home Shopping Network, or George Foreman's opinions about tabletop electric grills.

.... the fact remains that consumers ultimately bear the costs of game development, not corporations.

Consumer is a dirty word. It's dehumanizing. I'm tired of being treated like a disposable consumer.
I would much rather be a customer in a mutually beneficial relationship with a provider of goods and services.
 
Guess anything that is not pure raster is not to exist for some people.
Then how else can a 4090 owner lord himself over the lowly console gamer? Because the console gamer is already getting dangerously close (120 FPS)! Can you imagine the DREAD of having to be on par with other human beings?!
High FPS is the right of only a few and it belongs to them alone.
 
Multiple people here are missing the point. I'm primarily focused on performance here, and I discussed that at length. You're all getting hung up on "RT Overdrive doesn't radically alter the way the game looks." Yes, it absolutely changes the lighting, it's more complex, etc. But it doesn't make the game feel completely different, other than the fact that it can bring most GPUs to their knees.

Digital Foundry spent way more effort on hyping up the image quality enhancements. Kudos to them. But for every part where they show RT Ultra or even RT Psycho (which generally doesn't look that different from RT Ultra) versus RT Overdrive, there are ten comparisons between Max Rasterization and RT Overdrive. That's because those differences are far more noticeable.

There will be rooms lit up by a light source that look far more "correct" with Overdrive than with RT Ultra or rasterization. It's still the same game underneath, however, so we're putting a bit of lipstick on a pig. Unless you love Cyberpunk, in which case maybe it's a fox. Whatever. If I hadn't already finished the game, sure, I'd turn on Overdrive and play it with DLSS upscaling and frame generation on a 4090. But I don't have a compelling need to go back and replay the game.

The guns still feel the same. The randomly generated people and cars that go nowhere are all the same. The quests are the same. But the lighting and shadows are different! Assuming you have a GPU capable of playing the game.
 
It might be unnecessary, but it's the best kind of unnecessary - a fully playable tech demo inside a full-fledged AAA game. And better yet, you can even reach a decent performance level with a last-gen GPU. Personally, I'm quite excited that this sort of new tech is no longer relegated to canned benchmarks for the better part of a decade before we actually get to play games using it. I remember running a sub-surface scattering benchmark on a 7800 GT!

Doesn't Nvidia own Digital Foundry?
If not outright, then through their massive ad-buys and exclusive access that built DF and are needed to keep DF afloat? Maybe something has changed lately, but they definitely used to be at the bottom of the classic YouTube trap: Shilling hard to try and win ad money, affiliate commissions, and "exclusive access".
Digital Foundry exists to analyze game graphics, especially cutting-edge graphics technology. Just because nVidia has been the company pushing graphics technology the hardest doesn't mean that DF is "shilling" or "owned" by nVidia; it's just the company providing the most relevant content for their channel.
 
  • Like
Reactions: geogan and KyaraM
Consumer is a dirty word. It's dehumanizing. I'm tired of being treated like a disposable consumer.
As the Wizards of the Coast OGL whistleblower put it: "company leadership views customers as obstacles between them and their money."

Sums up most large businesses since the day they officially spiked prices during COVID.

But it doesn't make the game feel completely different, other than the fact that it can bring most GPUs to their knees.
There is no point in complaining about the differences when most of them are so minor and inconsequential to gameplay that most people probably couldn't tell which is which without knowing beforehand what to look for and where, unless they stopped playing to do some pixel-peeping. (And with "lipstick on a pig" implementations, many people prefer the raster hacks over RT anyway.)

That puts RT comfortably over the diminishing return cliff for me. If I had a GPU with RT, I'd be in the "I turn it on for pixel peeping every now and then, off the rest of the time" crowd.
 
  • Like
Reactions: pclaughton
But it doesn't make the game feel completely different, other than the fact that it can bring most GPUs to their knees.
That's true for most things relating to graphics, though. Cyberpunk 2077 wouldn't feel completely different if it had 2009 era graphics either, and it would run fine on a $110 GTX 650 from 2012.

Is RTX worth spending $1000+ on? Maybe, maybe not. But using "it's still the same game" as an argument is questionable at best.
 
  • Like
Reactions: KyaraM
The guns still feel the same. The randomly generated people and cars that go nowhere are all the same. The quests are the same.
Isn't that true, whether the game runs at 60fps or 600?

As the Wizards of the Coast OGL whistleblower put it: "company leadership views customers as obstacles between them and their money."

Sums up most large businesses since the day they officially spiked prices during COVID.
Actually, it sums up nearly all individuals since the dawn of mankind. Capitalism harnesses that human psychology, and puts it to the benefit of society.
 
Consumer is a dirty word. It's dehumanizing. I'm tired of being treated like a disposable consumer. I would much rather be a customer in a mutually beneficial relationship with a provider...
I apologize for invading anyone's safe space with a dose of reality, but I doubt if any game studio regards you as any more or less disposable than you do them. And in economics, the term 'consumer' exists for real and valid reasons. A brewery, say, may be a customer of a grain supplier, or an aluminum can maker, but it doesn't consume these resources: it transforms them. We thus define a special class of customer: the one who actually consumes the end product.
 
Isn't that true, whether the game runs at 60fps or 600?

Actually, it sums up nearly all individuals since the dawn of mankind. Capitalism harnesses that human psychology, and puts it to the benefit of society.

I apologize for invading anyone's safe space with a dose of reality, but I doubt if any game studio regards you as any more or less disposable than you do them. And in economics, the term 'consumer' exists for real and valid reasons. A brewery, say, may be a customer of a grain supplier, or an aluminum can maker, but it doesn't consume these resources: it transforms them. We thus define a special class of customer: the one who actually consumes the end product.
"Capitalism"
"I apologize ... BUT"
"disposable"
"economics"
"supplier"
"resources"
"special class of customer"
"end product"
Are you literally a corporate Overlord? Does the word "dehumanizing" mean anything to you?
 
Just ran the tech demo and it looks way better, but AMD RX 6000 cards need not apply. Even the AMD RX 7900 XTX has poor performance; top-end Ampere or Ada series needed. This is basically Portal with RTX in Cyberpunk 2077. RIP AMD cards. Remember, this is a tech demo, which means performance "could" get better.

The lighting and shadows are far better; the image looks a little more like real life (a very small effect). Raster is nowhere close in image quality. AMD are wrong to focus on raster.

4K@60 RTX 3080 Ti.

4K RX 7900 XTX
 
  • Like
Reactions: KyaraM
Just ran the tech demo and it looks way better, but AMD RX 6000 cards need not apply. Even the AMD RX 7900 XTX has poor performance; top-end Ampere or Ada series needed. This is basically Portal with RTX in Cyberpunk 2077. RIP AMD cards. Remember, this is a tech demo, which means performance "could" get better.

The lighting and shadows are far better; the image looks a little more like real life (a very small effect). Raster is nowhere close in image quality. AMD are wrong to focus on raster.

4K@60 RTX 3080 Ti.

4K RX 7900 XTX
Wrong to focus on raster when 90% of the games on the market need it?

I think we still don't have fast enough hardware to bother with this. Even a 4090 needs DLSS and frame generation for it to run well. Maybe in another 2-3 generations of GPUs we can get path tracing at native resolution without having to use upscaling and fake frames to save performance.
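On the "needs DLSS" point, it helps to see how much work upscaling actually removes. A rough sketch of the internal render resolutions behind each DLSS mode at a 4K output, using the commonly cited per-axis scale factors (treat them as approximate, since games can override them):

```python
# Approximate internal render resolutions for DLSS modes at 4K output.
# Per-axis scale factors are the commonly cited defaults, not guarantees.
MODES = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, scale):
    # Internal resolution = output resolution scaled down per axis.
    return round(out_w * scale), round(out_h * scale)

for mode, s in MODES.items():
    w, h = internal_res(3840, 2160, s)
    print(f"{mode:>17}: {w}x{h} (~{s * s:.0%} of the output pixels)")
```

Performance mode, for instance, path-traces only a 1920x1080 image (a quarter of the pixels) and upscales the rest, which is why the mode makes such a large framerate difference.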

I agree with the author's conclusion.
 
I did my own captures to compare: https://imgur.com/a/K6UKkpW


The choice of mostly night/dark scenes was deliberate, because when the sun's out, it tends to overpower any other light source.

I think the biggest difference in all of them, though it can be subtle as heck, is the inclusion of direct illumination. For example, in the scene under Megatower 10 at night, the lights don't really light up the scene in a realistic manner in RT Ultra (those are bright lights, yet the surrounding walls are still dark).

I also see some light leakage in the Cherry Blossom Market image (look to the right of the neon sign; the corner is lit up even though there's no obvious indication of where the light is coming from).

The apartment however I feel provided the best results: the dinnerware casts shadows now and there's an improvement in indirect lighting and global illumination.
 
  • Like
Reactions: dalauder
Maybe this technology would be better for huge servers to deal with.

Like all the online game services where you can play games without needing to install them on your machine or phone.

I have a feeling people would pay a lot more per month for rendering like that.

Something like:
1080p = $10
2k = $15
4k = $20
Path Tracing = $30+ per month.
 
Wrong to focus on raster when 90% of the games on the market need it?

I think we still don't have fast enough hardware to bother with this. Even a 4090 needs DLSS and frame generation for it to run well. Maybe in another 2-3 generations of GPUs we can get path tracing at native resolution without having to use upscaling and fake frames to save performance.

I agree with the author's conclusion.
The 4080 and 4090 get good enough performance at 4K now; the next generation from nVidia is not required. There will never be native path tracing. AI will completely take over in the future; it will do everything. AI will speed up path tracing next, and in the end AI will render the whole image.

Native as you knew it is dead and won't be coming back.

This is 4 years ago.


This is one year ago.

 
  • Like
Reactions: KyaraM and Makaveli
The 4080 and 4090 get good enough performance at 4K now; the next generation from nVidia is not required. There will never be native path tracing. AI will completely take over in the future; it will do everything. AI will speed up path tracing next, and in the end AI will render the whole image.

Native as you knew it is dead and won't be coming back.

This is 4 years ago.


This is one year ago.

I saw that one-year-old video before, but not the older one. I guess we will see.

As for the next generation not being required: you already know NV has a 5090 in the lab and a 6090 in development.
 
"Capitalism" ... "disposable" ... "economics" ... "supplier" ... "resources" ... "special class of customer" ..."end product"
Are you literally a corporate Overlord?
I prefer the title "Eternal Supreme Overlord". And the laws of economics still apply, whether or not you choose to recognize them. You consume products and services; that makes you a consumer. Your abject desire to replace the term with a euphemism brings to mind a quote of Lincoln's:

"How many legs does a dog have if you call the tail a leg? Four; calling a tail a leg doesn't make it a leg."
 
I saw that 1 year old video before not the older one and I guess we will see.

As for next generation not required you already know NV has a 5090 in the lab already and 6090 in development.
nVidia could be using AI to speed up the path tracing in their next GPU; it reduces the number of rays that need to be computed. Native rendering as people remember it is never returning. GPUs will take shortcuts from now on.
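A toy sketch of the "fewer rays" idea (a naive box blur standing in for a real denoiser, which is far smarter; all numbers here are made up): a path-traced pixel is a Monte Carlo average of random ray samples, so its noise shrinks with ray count, and sharing information across neighbouring pixels buys back quality at a fraction of the ray budget.

```python
import random

# A path-traced pixel is an average of random single-ray estimates.
# More rays per pixel = less noise, but also more compute. A denoiser
# trades ray count for spatial filtering across neighbouring pixels.
random.seed(0)
TRUE_RADIANCE = 0.5  # "ground truth" brightness of a flat surface

def noisy_pixel(rays):
    """Average of `rays` random single-ray estimates of the same pixel."""
    return sum(random.uniform(0, 2 * TRUE_RADIANCE) for _ in range(rays)) / rays

expensive = noisy_pixel(64)                 # 64 rays per pixel, no denoise
cheap = [noisy_pixel(4) for _ in range(9)]  # 3x3 patch at 4 rays per pixel
denoised = sum(cheap) / len(cheap)          # "denoise" by blurring the patch

print(f"64 rays, no denoise : {expensive:.3f}")
print(f" 4 rays, no denoise : {cheap[0]:.3f}")
print(f" 4 rays + denoise   : {denoised:.3f}")
```

The blurred 4-ray patch lands near the true value using roughly half the ray budget of the 64-ray pixel; real denoisers (and ray-count reducers generally) exploit far more structure than a box blur, but the trade-off is the same.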
 
nVidia could be using AI to speed up the path tracing in their next GPU; it reduces the number of rays that need to be computed. Native rendering as people remember it is never returning. GPUs will take shortcuts from now on.
I'm still waiting to see what AMD has planned for the AI Block in RDNA 3 which seems to be unused currently.