News Cyberpunk 2077 RT Overdrive Path Tracing: Full Path Tracing, Fully Unnecessary

Page 3 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
That's not a fair comparison between PRE-RENDERED scenes and on-the-fly rendering.
You've missed the point. What are those rendering farms doing with all that extra time? They're not generating more frames per second at higher resolutions -- they're using it to ray trace and better simulate physics. This is the path forward to more realistic graphics.

So until that day comes, the RTX cards released so far (and those still to come) are like a Kickstarter or Early Access, and folks are paying the associated fees by buying these cards?
There's a reason they call it the bleeding edge. Early adopters have long paid the way for those who follow.

"It is not from the benevolence of the butcher, the brewer or the baker, that we expect our dinner, but from their regard to their own interest. We address ourselves, not to their humanity but to their self-love, and never talk to them of our own necessities, but of their advantages." - Adam Smith

And the consumer only indirectly "bears the cost" of game development, and only if the corporation can convince them that the game is worth more to them than its price.
A Wealth of Nations quote! Bless you, my son. However, if the production costs affect the entire industry, and ignoring product substitution and other effects, the costs are invariably passed on. In fact, depending on the marginal-cost curve, the consumer might bear more than 100% of the increase. The converse situation is easier to understand: a small reduction in production costs increases demand, allowing economies of scale and resulting in a much larger reduction in price. We see this often in consumer electronics: at some point, declining costs open the product to a far larger market, and production costs plummet.
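The over-shifting claim (consumers bearing more than 100% of a cost increase) can be made concrete with a standard textbook case: a monopolist facing constant-elasticity demand prices at marginal cost times ε/(ε-1), so every dollar of extra cost raises the price by more than a dollar. A minimal sketch -- the cost and elasticity numbers below are made up purely for illustration:

```python
# Toy illustration: with constant-elasticity demand Q = A * p**(-eps),
# a monopolist prices at p = c * eps / (eps - 1), so a $1 rise in
# marginal cost c raises the price by eps/(eps-1) > $1 -- the consumer
# "bears more than 100%" of the increase.

def monopoly_price(marginal_cost: float, elasticity: float) -> float:
    """Profit-maximizing price under constant demand elasticity > 1."""
    assert elasticity > 1, "monopoly pricing needs elastic demand"
    return marginal_cost * elasticity / (elasticity - 1)

cost_before, cost_after = 10.0, 11.0   # hypothetical $1 cost increase
eps = 3.0                              # hypothetical demand elasticity

p0 = monopoly_price(cost_before, eps)  # 15.0
p1 = monopoly_price(cost_after, eps)   # 16.5
pass_through = (p1 - p0) / (cost_after - cost_before)
print(f"price rises {p1 - p0:.2f} per 1.00 of cost -> {pass_through:.0%} pass-through")
```

With linear demand and supply the pass-through is below 100%, so the over-shifting result really does hinge on the shape of the curves, as the post says.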

But let's not get tangled in the weeds. It's indisputable that large-scale decreases in production costs will benefit both consumers and producers -- game studios AND gamers will both reap the rewards of path tracing.
 
This was on an AMD PowerPoint slide: using the cloud for RT.

Ignored is the fact that an RX 7900 XTX can't match an RTX 3070 at native 1080p and is close to an RTX 2080 Ti performance-wise.

Conclusion: "we're not convinced these performance results are fully representative of non-Nvidia GPUs." The results are the same in other path-tracing games, and it comes down to hardware performance. This is Nvidia's path-tracing technology preview. We already know AMD GPUs lack the performance. The same happens with path tracing in Metro Exodus, Minecraft RTX, and Portal RTX -- every other path-traced game. Given that Nvidia has released the path-tracing SDK, other games could get support.

Even at RT Ultra, AMD's RX 7900 XTX can't match Ampere's top cards.
 
Conclusion: "we're not convinced these performance results are fully representative of non-Nvidia GPUs." The results are the same in other path-tracing games, and it comes down to hardware performance.
They're specifically NOT the same in other ray tracing games, or even in other path tracing games. I included the 3DMark DXR Feature Test as one clear example that AMD and Intel GPUs are underperforming -- check the Pure RT section of the article. Intel's Arc GPUs are also (still) underperforming in Minecraft, though I haven't even tried them in Portal RTX or Quake II RTX. But a simple look at the Native 1080p results compared to the 3DMark results shows that there's potential for AMD and Intel performance to improve a lot.

Now, 3DMark might also not be representative of "full path tracing" in games, but I'd trust UL a lot more than an Nvidia-promoted game to give other GPU vendors a fair shake. You can also see the RT Ultra results for what a more typical implementation of hybrid ray tracing might look like. Basically, the RX 7900 XT goes from trailing the RTX 3070 to beating the RTX 3080. The Arc A770 goes from trailing the RTX 2060 to potentially matching the RTX 3070. Those are not insignificant changes!

At the very least, the RX 7900 XT and XTX should probably land between the 3070 and 3080, not 20% below the 3070. Likewise, the Arc A770 should probably come close to matching the 3060 rather than trailing it by 55%. (Though I suspect it's far more a case of drivers holding Intel back than it is for AMD, because Arc support for Minecraft RTX is basically broken.)
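For reference, the percentage gaps quoted here are relative fps deficits; the arithmetic is simple, and the fps values in the sketch below are hypothetical, purely to show the formula:

```python
def percent_behind(fps_a: float, fps_b: float) -> float:
    """Fraction by which card A trails card B in average fps."""
    return (fps_b - fps_a) / fps_b

# Hypothetical fps values, just to illustrate the math:
print(f"{percent_behind(24.0, 30.0):.0%}")   # 24 fps vs 30 fps -> trails by 20%
print(f"{percent_behind(13.5, 30.0):.0%}")   # 13.5 fps vs 30 fps -> trails by 55%
```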

[Attached chart: Control, maxed-out settings, 1080p]

Here's Control with maxed out settings, for example. At 1080p, that chart looks very different from the Overdrive results. Or here's Minecraft RTX:

[Attached chart: Minecraft RTX performance]
Again, Intel's performance is messed up, but I'm fairly confident it will eventually get the A700 cards above the 3050 at the very least. I had pinged Intel about its poor results and an apparent memory leak in the Bright Memory Infinite Benchmark, and the most recent driver finally fixed that.

Of course, Minecraft RTX and other games aren't remotely agnostic implementations of "path tracing" either. That's Nvidia's reason for pushing such things so much. It knows it has better RT hardware and drivers. It actually can run Cyberpunk 2077 Overdrive settings at decent levels of performance, especially if you factor in DLSS 3. (Again, for 4K with performance mode upscaling, I'll note that the jump from 65 fps on the 4090 to "103 fps" with Frame Gen really feels more like ~75 fps, not 103 fps. Likewise, the 4080's "74 fps" versus 46 fps feels more like ~53 fps.)
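To unpack that "feels like" claim: with Frame Generation, roughly every other displayed frame is interpolated rather than rendered, so input sampling runs at about half the displayed rate. Splitting the difference between rendered and displayed rates is one crude heuristic for a "felt" figure -- both the halving and the midpoint are my assumptions here, not measurements:

```python
# Rough sketch of the "feels like" estimate. Assumption: Frame Generation
# interpolates every other displayed frame, so the rendered (input-sampling)
# rate is about half the displayed rate; the midpoint of the two is used as
# a crude "felt" figure.

def felt_fps(displayed_fps: float) -> float:
    rendered = displayed_fps / 2           # frames actually rendered per second
    return (rendered + displayed_fps) / 2  # midpoint of latency vs. smoothness

print(f"{felt_fps(103):.0f} fps")  # ~77, near the ~75 quoted above
print(f"{felt_fps(74):.0f} fps")   # ~56, near the ~53 quoted above
```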
 
It's indisputable that large-scale decreases in production costs will benefit both consumers and producers -- game studios AND gamers will both reap the rewards of path tracing.
I would say that the first is true: large-scale decreases in production costs will benefit consumers and producers. But will use of full ray tracing actually result in reduced production costs? That is far from "indisputable" and in fact I'd argue the opposite: For the foreseeable future, virtually every game that uses ray tracing (whether hybrid or full) will end up having to do both ray tracing and rasterization. And game engines like Unreal Engine are making it less painful to do a lot of the rasterization approximations in the first place.

Look at Cyberpunk 2077: Nvidia has been talking about RT Overdrive for over six months, and now it's finally here. If implementing full path tracing was so easy, why did it require far more than six months of work? Even if it's only a limited number of CDPR people doing that work rather than the full team, that clearly indicates path tracing isn't some magical panacea where you flip a switch and "it just works." You get a whole new slate of bugs to try and fix, never mind trying to optimize performance.
 
For the foreseeable future, virtually every game that uses ray tracing (whether hybrid or full) will end up having to do both ray tracing and rasterization.
I believe we're foreseeing different timeframes. Within two or three card generations, even Nvidia's low-end offerings will perform full path tracing, and high-end cards will focus on advanced physics and in-game AI. In 10 years, the idea of a game *not* supporting ray tracing will be absurd, and rasterization will go by the wayside. Remember, the cost savings are more than just developer manpower: the entire reason that lightmaps, shadow maps, bump maps, reflection maps, texture maps, etc. exist is that graphics cards lack the horsepower to calculate these on the fly. Assume enough processing power, and all these shoestring hacks go away.
 
I believe we're foreseeing different timeframes. Within two or three card generations, even Nvidia's low-end offerings will perform full path tracing, and high-end cards will focus on advanced physics and in-game AI. In 10 years, the idea of a game *not* supporting ray tracing will be absurd, and rasterization will go by the wayside. Remember, the cost savings are more than just developer manpower: the entire reason that lightmaps, shadow maps, bump maps, reflection maps, texture maps, etc. exist is that graphics cards lack the horsepower to calculate these on the fly. Assume enough processing power, and all these shoestring hacks go away.
Ten years down the road, yes, ray tracing hardware will be mainstream and perhaps even at acceptable levels of performance for budget GPUs. If those even continue to exist. Two or three GPU generations is potentially four to six years at most. The top of the performance spectrum, especially from Nvidia, has improved in leaps and bounds. The bottom, not so much.

RTX 2060 came out in early 2019. It's still generally faster than RTX 3050, which came out three years later. So far, RTX 40-series has improved on its predecessor at every level, but I'm not sure RTX 4050 will actually do that, and I'm pretty sure RTX 4060 will only have 8GB VRAM. Frame Generation on a 4050 will also be something of a joke, because doubling fps to 60 will still feel like a smoother looking version of 30 fps. I guess we'll hopefully find out what Nvidia is actually going to do with RTX 4050/4060 in the near future.

There's also the question of how many people are still running older hardware. According to Steam, right now only about a quarter of gamers have an RTX 3060 or better (counting RTX 2080 and above as also being "better"). And of those, about half are running RTX 3060 hardware. So over four years after the 20-series launch, maybe a bit more than 10% of gamers have an RTX 2080 or better. And over a quarter of gamers have GPUs that are six or more years old, with another 20% using newer hardware that's often slower than older hardware (but was cheaper to buy).
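The survey math above, spelled out -- the input fractions are the post's rough figures, not exact survey data:

```python
# Back-of-envelope version of the Steam-survey estimate above.
share_3060_or_better = 0.25  # ~a quarter of gamers (RTX 3060 and up, plus RTX 2080 and up)
share_exactly_3060 = 0.50    # ~half of that group runs a plain RTX 3060

# Everyone in the group who isn't on a plain RTX 3060:
share_above_3060 = share_3060_or_better * (1 - share_exactly_3060)
print(f"~{share_above_3060:.1%} of gamers have an RTX 2080 or better")
```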

Six years from now, we'll probably have half of gamers running RTX 2080 or better hardware -- and some of those will be using hardware that's MUCH better. But as a game company executive, are you going to be fully on board with excluding potentially 50% of your target market? If not, we'll keep getting games that have to support both rasterization and ray tracing.
 
This was on an AMD PowerPoint slide: using the cloud for RT.

Ignored is the fact that an RX 7900 XTX can't match an RTX 3070 at native 1080p and is close to an RTX 2080 Ti performance-wise.
....
Even at RT Ultra, AMD's RX 7900 XTX can't match Ampere's top cards.
At some point, AMD needs to go for ray tracing. But raster performance is what the mainstream runs on, and games don't use enough ray tracing for it to matter. Until ray-tracing-capable hardware comes at a mainstream price, things won't shift in that direction either. I would wager it's going to take a PS6, or similar, capable of serious ray tracing before that switches.
 
At some point, AMD needs to go for ray tracing. But raster performance is what the mainstream runs on, and games don't use enough ray tracing for it to matter. Until ray-tracing-capable hardware comes at a mainstream price, things won't shift in that direction either. I would wager it's going to take a PS6, or similar, capable of serious ray tracing before that switches.
Yeah, sadly that's the true reality. When we have consoles offering something roughly on the level of RTX 4080/4090 performance, only then will we start to see mainstream gaming consider full ray tracing as a viable option. It didn't help that both PS5 and Xbox Series X are relatively anemic when it comes to RT hardware (as in, less than RX 6800).
 
Yeah, sadly that's the true reality. When we have consoles offering something roughly on the level of RTX 4080/4090 performance, only then will we start to see mainstream gaming consider full ray tracing as a viable option. It didn't help that both PS5 and Xbox Series X are relatively anemic when it comes to RT hardware (as in, less than RX 6800).
I was going to say that the console entry point will have to be lower than that, but the PS6 is slated for 2027. So maybe it's possible?
 
Yeah, sadly that's the true reality. When we have consoles offering something roughly on the level of RTX 4080/4090 performance, only then will we start to see mainstream gaming consider full ray tracing as a viable option. It didn't help that both PS5 and Xbox Series X are relatively anemic when it comes to RT hardware (as in, less than RX 6800).
That argument means "get a console"; the PC is dead. The whole point of path tracing on PC is to create something consoles don't provide. If PCs are just slightly better consoles, then just get a console and save the money.

Nvidia understands this, and it's why ray tracing is a big PC feature. It's a way to differentiate PCs from consoles and justify the higher asking price for PC hardware.

Saying that RT doesn't matter makes PC gaming just a "console plus" experience, and that's the end of the PC. It's just cheaper to get a console if RT doesn't matter. Why pay extra for a PC? Raster-wise, the console is enough. A little RT, and a console is enough. Why pay thousands for a PC?
 
That argument means "get a console"; the PC is dead. The whole point of path tracing on PC is to create something consoles don't provide. If PCs are just slightly better consoles, then just get a console and save the money.

Nvidia understands this, and it's why ray tracing is a big PC feature. It's a way to differentiate PCs from consoles and justify the higher asking price for PC hardware.

Saying that RT doesn't matter makes PC gaming just a "console plus" experience, and that's the end of the PC. It's just cheaper to get a console if RT doesn't matter. Why pay extra for a PC? Raster-wise, the console is enough. A little RT, and a console is enough. Why pay thousands for a PC?

As I have mentioned to someone else on here when they say something to the effect of "just get a console":
What if there are no games on consoles that someone wants to play, and they already have a computer, but it needs an upgrade? Then what? The argument kind of becomes moot.

Keep in mind that to some people, RT doesn't matter yet. The reason? They don't play any games that support RT, or have any interest in the games that do. In that case, rasterization is more important to them.
 
You've missed the point.
Seems like you may have missed the point as well.

Comparing a movie that uses a render farm to a game that has to render in real time isn't really a fair comparison.

The average Hollywood CGI film is vastly more realistic than any game, despite that it runs at a mere 30fps. Why?
Simple: it's rendering a scene on a render farm, and that scene won't change once the movie is final. With a game, the GPU has to render each scene on the fly. You play the game once, it renders everything; play the same game again from a different viewpoint, and it has to re-render each scene. The first time you go straight, the second time you go left, the third time you go right -- and each time, the view you have is different.
 
As I have mentioned to someone else on here when they say something to the effect of "just get a console":
What if there are no games on consoles that someone wants to play, and they already have a computer, but it needs an upgrade? Then what? The argument kind of becomes moot.

Keep in mind that to some people, RT doesn't matter yet. The reason? They don't play any games that support RT, or have any interest in the games that do. In that case, rasterization is more important to them.
If you just care about gameplay and don't care about RT, then get a console. There's no point in spending $1,000 on an AMD RX 7900 XTX or $1,600+ on a 4090. A PS5 Digital Edition is US$399. That's cheaper than a whole budget PC system.

After all, RT-lite games on PC will just be console ports and run on AMD cards just fine. It's gameplay that matters most, so get a console. That's the argument being made against path tracing and RT in general.

The conclusion you are all pushing is: get a console; better PC graphics don't matter. Gameplay and raster performance matter. "Console plus" is what PC games should be. The issue is that this argument ends with just getting a console. Cheaper, good raster performance, light RT, and the gameplay is the same. It's the gameplay that matters.

Who makes the console GPUs? AMD. So AMD wins. In the PC space Nvidia has won; I wonder why? Could it be that no one wants to pay thousands more for a "console plus" experience?
 
If you just care about gameplay and don't care about RT, then get a console. There's no point in spending $1,000 on an AMD RX 7900 XTX or $1,600+ on a 4090. A PS5 Digital Edition is US$399. That's cheaper than a whole budget PC system.

After all, RT-lite games on PC will just be console ports and run on AMD cards just fine. It's gameplay that matters most, so get a console. That's the argument being made against path tracing and RT in general.

The conclusion you are all pushing is: get a console; better PC graphics don't matter. Gameplay and raster performance matter. "Console plus" is what PC games should be. The issue is that this argument ends with just getting a console. Cheaper, good raster performance, light RT, and the gameplay is the same. It's the gameplay that matters.
Again: what if there are NO games on the consoles that a person wants to play, but they already have games on the computer that they are playing? The "get a console" argument is still moot.
That money would be better spent on the computer; at least they would be able to upgrade something and play it, whereas spending $400-700+ on a console with NO games they play would leave it collecting dust.
 
Again: what if there are NO games on the consoles that a person wants to play, but they already have games on the computer that they are playing? The "get a console" argument is still moot.
That money would be better spent on the computer; at least they would be able to upgrade something and play it, whereas spending $400-700+ on a console with NO games they play would leave it collecting dust.
US$399 for a PS5 Digital Edition; a gaming PC like the Alienware Aurora R13 is $2,399.96. If you don't care about PC graphics, just gameplay, why pay $2,000 more? For $2,000 less you can live without one game. There are lots of great games with good gameplay on console. Most PC games get ported to console as well, like Cyberpunk 2077, and the gameplay is just the same. If you don't care about RT, then why spend thousands more? Raster performance on consoles is fine. RT-lite on console is fine performance-wise. Just look at Unreal Engine 5 on consoles with Lumen: it looks great, and the cost is so much less than a PC. It's gameplay that matters, after all.

If you need path tracing, just do it in the cloud; a console can do that cheaper than a PC. Hell, you could play all the PC games you like that way as well.
 
US$399.00 for a console
That same console here is $599.
You're now talking about buying a NEW computer; I was talking about upgrading an existing computer. Two different things, and moving the goalposts a little.

Also, if the person knows enough, or knows someone who can build them one, you can build a decent computer for half that price. It's also funny that you used one of the pricier prebuilt companies for your example.

If you don't care about PC graphics, just gameplay. Why pay $2000 more. For $2000 less you can live without one game.
Again, I wasn't talking about buying a new computer.

There are lots of great games with good gameplay on console
Maybe, but again, you sure seem to be ignoring this point: what if there are no games on a console that a person wants to play?

Most PC games get ported to console
And if they already have a computer, there is no need to spend money on a console; just upgrade the computer if needed and get the game there instead.

This is the situation I am in: there are NO games on consoles that I want to play. Cyberpunk? No thanks, not interested in it. But I have 15+ games on or for my computer that I have played, would play, or am playing. Therefore, that $600 would be MUCH better spent upgrading my GTX 1060 than on a console that would collect dust more than I would play it.

Just in case you ignore my points again: what if there are no games on a console that a person wants to play? And I am NOT talking about buying a new computer; I am talking about upgrading an existing one.
 
That same console here is $599.

You're now talking about buying a NEW computer; I was talking about upgrading an existing computer. Two different things, and moving the goalposts a little.


Again, I wasn't talking about buying a new computer.


Maybe, but again, you sure seem to be ignoring this point: what if there are no games on a console that a person wants to play?


And if they already have a computer, there is no need to spend money on a console; just upgrade the computer if needed and get the game there instead.

This is the situation I am in: there are NO games on consoles that I want to play. Cyberpunk? No thanks, not interested in it. But I have 15+ games on or for my computer that I have played, would play, or am playing. Therefore, that $600 would be MUCH better spent upgrading my GTX 1060 than on a console that would collect dust more than I would play it.

Just in case you ignore my points again: what if there are no games on a console that a person wants to play? And I am NOT talking about buying a new computer; I am talking about upgrading an existing one.
Remember, you can play games over the cloud if they use path tracing. You can play every PC game that way. Why get a PC? Consoles can stream games for cheaper. Just get a console. This is the conclusion of all the anti-RT comments: accept them, and getting a console is the right choice. Xbox Cloud Gaming is for you.

Using Microsoft’s 54 Azure data centers, Xbox Cloud Gaming allows users to stream games originally made for Xbox One, Xbox Series X/S, and PC on their device of choice.
 
You still ignored or glossed over my point: why spend money on a console if someone has a computer that works just fine (or maybe needs an upgrade) and can also play games over the cloud? The money would be better spent on the computer for an upgrade if needed, or just saved if it doesn't need an upgrade yet.

Make a silly argument, get cornered with silly conclusions you don't like.
Right back at you.
 
You still ignored or glossed over my point: why spend money on a console if someone has a computer that works just fine (or maybe needs an upgrade) and can also play games over the cloud? The money would be better spent on the computer for an upgrade if needed, or just saved if it doesn't need an upgrade yet.


Right back at you.
You can play every game on the cloud on any device. Why pay $2,000 more for a gaming PC? A console can stream those games fine. Xbox Cloud Gaming supports games made for the PC.

You accepted the slippery-slope argument; this is the conclusion. Get a console.
 
You can play every game on the cloud on any device. Why pay $2,000 more for a gaming PC? A console can stream those games fine. Xbox Cloud Gaming supports games made for the PC.

You accepted the slippery-slope argument; this is the conclusion. Get a console.
Where did I say anything about a new computer that costs $2K? I mentioned an existing computer, or upgrading an existing computer. And an existing computer can stream those games just fine as well. You moved the goalposts by talking about a new computer.

There's no slippery-slope argument here. I just presented a view that goes against the "just get a console" argument, and you didn't like it, because it's a valid reason not to get a console. As I said, it's a situation I am in personally: there are no games on a console that I want to play, so buying a console would be a complete waste of money for me, as I have a computer that just needs a new video card.
 
Where did I say anything about a new computer that costs $2K? I mentioned an existing computer, or upgrading an existing computer. And an existing computer can stream those games just fine as well. You moved the goalposts by talking about a new computer.

There's no slippery-slope argument here. I just presented a view that goes against the "just get a console" argument, and you didn't like it, because it's a valid reason not to get a console. As I said, it's a situation I am in personally: there are no games on a console that I want to play, so buying a console would be a complete waste of money for me, as I have a computer that just needs a new video card.
All the points are covered. Just get a console; the gameplay is the same. Any games you want to play can be played via the cloud. No need for a new GPU. Ask yourself why cloud gaming services exist.

Features of Cloud Gaming Services

Cloud Gaming comes with numerous advantages over conventional gaming. Some of the best cloud gaming features include:

No need for high-end hardware

As mentioned earlier, cloud gaming services don't need high-end devices to let you play your favorite games. You can run the latest games on the oldest devices without spending a single penny on hardware upgrades. Opting for a cloud gaming service is the best option if you don't have a high-end device but are passionate about gaming.

No need to download games

It's not unusual for modern AAA titles to take a couple of hundred gigabytes of storage space. Downloading such humongous files needs a lot of bandwidth and storage space on your device, and you will need to upgrade your device's storage to hold a significant library of modern games. With a cloud gaming service, on the other hand, you won't need to download any game files to play. You also get an extensive game library at your fingertips, which is practically impossible with conventional gaming.

Independence to game anywhere

Gaming devices are generally not so portable. Even the most portable gaming laptops won't let you game in the subway or a waiting room! Cloud gaming services make gaming extremely portable and give you the freedom to game anywhere and anytime.

You can game on all devices

Cloud gaming services let you game on Android, iPhone, Windows, Mac, Linux, and more. These services support cross-platform play, and progress syncs across all devices logged in under the same account. You can even continue your game on a smartphone when you don't have access to a bigger screen.

Cloud gaming services are the future of modern gaming due to the fantastic features they bring to the table.

Let us now look at some of the best cloud gaming services, with their perks and caveats.
 
Comparing a movie that uses a render farm to a game that has to render in real time isn't really a fair comparison.
... it's rendering a scene on a render farm that won't change once the movie is final. With a game, the GPU has to render each scene on the fly. You play the game once, it renders it; play the same game again, but from a different viewpoint... each time the view you have is different.
Again, you miss the point. I'll spell it out in painstaking detail. A computer GPU may render a frame in 1/100 of a second, say, whereas a render farm may require 2 minutes. In both cases, once rendered, the viewpoint is fixed.
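The gap those frame times imply is easy to quantify; a quick sanity check using the (illustrative) numbers above:

```python
# Per-frame compute budgets from the example above.
gpu_frame_budget_s = 1 / 100   # real-time GPU: ~1/100 of a second per frame
farm_frame_budget_s = 2 * 60   # render farm: ~2 minutes per frame

ratio = farm_frame_budget_s / gpu_frame_budget_s
print(f"the render farm spends about {ratio:,.0f}x longer on each frame")
```

That roughly 12,000x larger budget is what gets spent on path-traced lighting and physics rather than on resolution or frame rate.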

Now, if your GPU -could- deliver Hollywood-level CGI effects in real time, what would it do differently? Would it render at higher resolution? No. Would it create more frames per second? No again. It would use full path tracing to light higher-detail meshes and animate them with more accurate physics. That is the path forward to more realistic graphics in video games -- not more pixels and frames.
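What "full path tracing" actually does at each surface point is estimate lighting integrals by random sampling. Here's a toy, purely illustrative version: a Monte Carlo estimate of irradiance under a uniform sky, where the exact answer is pi times the sky radiance. A real path tracer evaluates integrals like this recursively, per bounce, per pixel.

```python
import math
import random

def estimate_irradiance(sky_radiance: float, samples: int, seed: int = 0) -> float:
    """Monte Carlo estimate of E = integral of L * cos(theta) over the hemisphere.

    For a uniform sky of radiance L, the exact answer is pi * L.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        # Uniform direction on the hemisphere: cos(theta) is uniform in [0, 1],
        # and the sampling pdf is 1 / (2*pi) per steradian.
        cos_theta = rng.random()
        total += sky_radiance * cos_theta * 2 * math.pi  # integrand / pdf
    return total / samples

est = estimate_irradiance(1.0, 200_000)
print(f"Monte Carlo estimate: {est:.3f}  (exact: {math.pi:.3f})")
```

The noise in that estimate is also why real-time path tracing leans so heavily on denoisers and upscalers: the hardware can only afford a handful of samples per pixel.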
 
That argument means "get a console"; the PC is dead. The whole point of path tracing on PC is to create something consoles don't provide. If PCs are just slightly better consoles, then just get a console and save the money.

Nvidia understands this, and it's why ray tracing is a big PC feature. It's a way to differentiate PCs from consoles and justify the higher asking price for PC hardware.

Saying that RT doesn't matter makes PC gaming just a "console plus" experience, and that's the end of the PC. It's just cheaper to get a console if RT doesn't matter. Why pay extra for a PC? Raster-wise, the console is enough. A little RT, and a console is enough. Why pay thousands for a PC?
While I think typical ray tracing, as it's done today, will become much more widespread, I don't think extreme FULL ray tracing will happen when the Steam hardware survey listed a GTX 1650 as the most common graphics card until LAST MONTH. That means most users didn't even have a minimally ray-tracing-capable card like the nearly five-year-old RTX 2060.

Now, that totally changed in March 2023 -- most gamers now have RT-capable hardware. But that hardware is still completely crippled by full RT, even at 1080p. Nobody's going to design games that need an RTX 4090 to be playable. They'll design for an RTX 3060.

The fact is, there's nothing a GTX 1650 can do that a PS5 can't. But the PS6 isn't coming out until 2027, so for a couple of years, full ray tracing will be a PC exclusive. So long as the RTX 5050 Ti is $350 or less, we'll probably start seeing full ray tracing go mainstream in 2024.
 
I personally haven't cared for PC graphics improvements since Battlefield 3. I only upgraded my GTX 570 because I was worried it would break after 5+ years, and I upgraded my RX 480 for the same reason. And I'll probably ride my RTX 2080 Ti for four more years.

It's story, physics, interactions, and gameplay that matter, much more than pixels or lighting. I have a console: an Xbox 360 with a Kinect -- the newest console that provides something my PC doesn't. I also have an Oculus Quest 2, which also provides something different.

The next step in computer gaming realism isn't ray tracing, just like it wasn't tessellation. It's better interactions. We're not far from being able to speak to an NPC in game, using a microphone, and get a reasonable AI-generated response (text, or maybe voiced?) right back from them. That's going to matter A LOT more than graphics for immersion.

Seriously, use this extra graphics muscle for better physics destruction/collision modeling or automatic hitbox standardization.