News: "Atomic Heart" Arrives on PC Without Ray Tracing After Years of Nvidia Promotion

Maybe a bit, but the game is getting great reviews. RTX doesn't make ANYTHING actually better. Well, it helps your power company a bit.
Guess we've been fed a placebo that introduced hallucinations, then, because if you can't see at the very least the RT reflections clear as day, then maybe Nvidia and AMD should just stop working on RT altogether and just work on DLSS/FSR for their reduced latency and general improvement of game fps, at a small cost in quality and a few graphical issues here and there.
As for me, I can clearly see RT on The Witcher, CP77, Metro Exodus, Spiderman, Bright Memory, Returnal, etc.

Where are you getting this information from?
Raytracing and DLSS are two different technologies.....
Technically RTX means the card is capable of RT and DLSS
 
And this is why Marketing departments are the bane of humanity, right after Patent Lawyers XD

It's definitely embarrassing, as this game was plastered all over nVidia's marketing slides for RT first and foremost, not DLSS. So whether or not nVidia's marketing "implied" (or even explicitly said) that "RTX" encompasses all the AI-shoved-in stuff, it doesn't change the fact that they used this title to promote Ray Tracing, first and foremost, for years.

Meh, I love Ray Tracing as a feature, even if it is hated, but this is still fumbling the ball, no matter what semantics get applied to soften the "swing and miss".

EDIT: Typos.

Regards.
 
Of course it's a brand name, but it still doesn't change the fact that Nvidia originally used the term "RTX" to refer specifically to raytracing. Here's their RTX press release from almost five years ago...

https://nvidianews.nvidia.com/news/...alizes-dream-of-real-time-cinematic-rendering

Nvidia grouping in DLSS-enabled games to count as supporting "RTX" is mostly just them trying to give the impression that there are more games featuring raytracing than there really are. I suppose there's both the RTX technology as they originally defined it, and the RTX branded cards that support features like DLSS, but their original marketing takes precedence, and is what tends to be implied when one refers to "RTX" as a feature.
It's one of the interesting things to observe over the past few years. DLSS has arguably become more important than RT. I actually predicted this at the RTX 20-series launch. I don't know if I specifically wrote it in an article, but I remember thinking and saying something to the effect of, "the Tensor cores could prove more important than Nvidia's RT cores in the long run." All the deep learning and AI stuff that's going on in the data center is testament to their importance.

Anyway, DLSS is something that boosts performance quite a bit with minimal loss in image quality, and it's relatively easy to implement. RT requires a ton of extra work from the devs to get it working properly alongside the regular rendering, unless a game is RT exclusive (there are a few indie games in that category, and Metro Exodus Enhanced). There aren't many (any?) games that have promised DLSS support and then never delivered, as far as I'm aware, but more than a few games have had delayed RT support — or just quietly dropped RT plans altogether.
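To make the "extra work" point a little more concrete: before an RT path can even be offered, the game has to probe the hardware for it. Here's a minimal sketch of that check, assuming Windows and the D3D12 headers; the D3D12_FEATURE_D3D12_OPTIONS5 query is the documented way to ask for DXR support, while the rest is just illustrative scaffolding and not any particular game's code:

```cpp
// Minimal sketch: probe for hardware DXR support before exposing an RT toggle.
// Assumes Windows 10 1809+ with the D3D12 headers available; links d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12")

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter. ID3D12Device5 is the interface
    // that carries the DXR entry points (CreateStateObject and friends).
    ComPtr<ID3D12Device5> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No suitable D3D12 device - raster path only.\n");
        return 0;
    }

    // Documented feature query: does the driver/hardware support DXR at all?
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5)))
        || options5.RaytracingTier < D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("No hardware DXR - ship the raster lighting path only.\n");
        return 0;
    }

    std::printf("Hardware DXR supported (tier 1.%d) - RT options can be offered.\n",
                options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1 ? 1 : 0);
    return 0;
}
```

And that's only the gate; the expensive part is everything behind it, where the traced effects have to coexist with the raster pipeline.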

I still think RT is an important technology, but we really need games that wow you with graphics improvements by enabling RT or else it doesn't matter. DLSS (and FSR 2 and XeSS) are more in line with what a lot of gamers want — higher performance — than RT, which generally drops performance quite a lot. We're probably still years away from the point where devs can just do everything with ray tracing and not have to worry about a bunch of legacy rasterization code to make things run fast enough.
 
I would kind of agree in that it currently makes more of an impact in a larger number of games, but the same could be said of other new upscaling techniques that don't necessarily require specialized hardware. And really, the large price hikes for a given level of hardware arguably counteract the performance gains brought on by upscaling, so I'm not sure much performance has actually been gained by the end-user, over what would have been expected without upscaling at this point.
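Just to put made-up numbers on that (the prices and frame rates below are purely hypothetical placeholders, not benchmarks of any real cards), the frames-per-dollar arithmetic is easy to run yourself:

```cpp
// Hypothetical numbers only, to illustrate the frames-per-dollar point;
// substitute real prices and benchmark results before drawing any conclusion.
#include <cstdio>
#include <initializer_list>

struct Card {
    const char* name;
    double priceUsd;
    double nativeFps;
    double upscaledFps;  // e.g. with a "Quality" upscaling preset enabled
};

int main()
{
    const Card lastGen = {"last-gen midrange (hypothetical)", 500.0, 60.0, 78.0};
    const Card thisGen = {"this-gen midrange (hypothetical)", 800.0, 75.0, 100.0};

    for (const Card& c : {lastGen, thisGen}) {
        std::printf("%-34s native %.3f fps/$   upscaled %.3f fps/$\n",
                    c.name, c.nativeFps / c.priceUsd, c.upscaledFps / c.priceUsd);
    }
    // If the price climbs about as fast as the upscaled frame rate, the
    // end user's fps-per-dollar barely moves, which is the point above.
    return 0;
}
```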

In the long run, RT may become more relevant, though it will probably be some time before many games actually require hardware RT support to run. In the meantime, RT will create an increased workload for developers, since they need to implement and test two different lighting models if they don't want to exclude the majority of their potential customers who either can't run games with RT effects enabled, or can't run them well. Upscaling can only get you so far without looking bad, and the increased input latency and potential for artifacts from frame generation prevent it from being a suitable solution for many games. And by the time RT becomes standard, earlier-generation high-end cards and current-generation mid-range cards might not even be able to run it well. Atomic Heart might have just delayed RT support at launch because they knew their implementation ran badly on most hardware and didn't want the performance issues affecting review scores.
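For anyone who hasn't had to ship a renderer, here's a very rough, purely hypothetical sketch of what "two different lighting models" means in practice. None of these classes come from any real engine; the point is just that every lighting feature ends up with two implementations that have to be written, tuned, and QA'd separately while staying visually in sync:

```cpp
// Hypothetical illustration: once RT is optional, each lighting feature needs
// a ray-traced version and a raster fallback, selected at startup.
#include <cstdio>
#include <memory>

struct Scene {};  // stand-in for real scene data

class LightingPath {
public:
    virtual ~LightingPath() = default;
    virtual void renderShadows(const Scene&) = 0;
    virtual void renderReflections(const Scene&) = 0;
    virtual void renderGlobalIllumination(const Scene&) = 0;
};

// Raster fallback: shadow maps, screen-space reflections, baked/probe GI.
class RasterLighting final : public LightingPath {
    void renderShadows(const Scene&) override            { std::puts("cascaded shadow maps"); }
    void renderReflections(const Scene&) override        { std::puts("SSR + cubemap fallback"); }
    void renderGlobalIllumination(const Scene&) override { std::puts("baked lightmaps / probes"); }
};

// DXR path: traced shadows, reflections, and GI.
class RayTracedLighting final : public LightingPath {
    void renderShadows(const Scene&) override            { std::puts("ray-traced shadows"); }
    void renderReflections(const Scene&) override        { std::puts("ray-traced reflections"); }
    void renderGlobalIllumination(const Scene&) override { std::puts("ray-traced GI"); }
};

// Chosen once at startup, e.g. from a DXR capability check plus a user setting.
std::unique_ptr<LightingPath> makeLightingPath(bool hardwareDxr, bool userEnabledRt)
{
    if (hardwareDxr && userEnabledRt)
        return std::make_unique<RayTracedLighting>();
    return std::make_unique<RasterLighting>();
}

int main()
{
    Scene scene;
    auto lighting = makeLightingPath(/*hardwareDxr=*/false, /*userEnabledRt=*/true);
    lighting->renderShadows(scene);
    lighting->renderReflections(scene);
    lighting->renderGlobalIllumination(scene);
    return 0;
}
```

Twice the code paths means twice the content validation, and that's before anyone touches performance tuning.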
 
Oh, that review-score worry is absolutely why they cut DXR. I'm sure there were some backroom conversations between Mundfish and Nvidia about the whole RTX On business.

Nvidia: "We've helped with development and promotion for years!"
Mundfish: "Ray tracing kills performance on all but the fastest cards, and it will absolutely affect our review scores!"

Back and forth until eventually the compromise is:
Mundfish: "Okay, look, we'll ship with DLSS and Frame Generation, and we'll enable DXR support in a future patch once the initial reviews are done."
 
Maybe someday we will all know the tragic story of the development, where ray tracing died along the way 😅

Ray tracing—don’t you give up on me man! don’t you give up!!!

They should make a miniseries out of that.
 
I don't think so. RT has been a standard for decades.

On the other side...

Honestly, I wonder if someday we will see the RT calculations being accelerated on the GPU itself instead of leaning so badly on the CPU (that's what hits Nvidia cards so hard... GPUs aren't even used at 70%, held back by the RT cores).
 
It won't go away until nVidia stops being cheap and puts a hardware scheduler into the GPU design.

Regards.