News 3DMark Gets a Fully Ray Traced Feature Test

One concern is that hybrid rendering requires all the code for traditional rasterization, plus code for ray tracing, and that means potentially more work for game developers. Then again, with support for ray tracing built into Unreal Engine and Unity, that's less of a problem than you might think. Regardless, we're not likely to leave rasterization behind any time soon.
GPU designers could also allocate more silicon to ray tracing hardware to improve performance there, but that would come at the cost of shaders, assuming the same transistor count. And that creates a problem: if the balance tips too far towards ray tracing, rasterization-only performance will start to regress.
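To make that tradeoff concrete, here's a toy model (all numbers invented for illustration; real GPU design involves far more variables than a simple transistor split):

```python
# Toy model of the die-area tradeoff described above. All numbers are
# made up for illustration; real GPUs are nowhere near this simple.

def frame_time(rt_share, raster_work=1.0, rt_work=0.5):
    """Rough frame time when a fraction rt_share of a fixed transistor
    budget goes to ray tracing units and the rest to shaders."""
    shader_units = 1.0 - rt_share   # rasterization/shading throughput
    rt_units = max(rt_share, 1e-6)  # RT throughput (guard against zero)
    return raster_work / shader_units + rt_work / rt_units

# A hybrid title (raster + RT workload) benefits from some RT silicon...
for share in (0.1, 0.3, 0.5):
    print(f"hybrid game, RT share {share:.0%}: {frame_time(share):.2f}")

# ...but a rasterization-only title just loses shader throughput.
for share in (0.1, 0.3, 0.5):
    print(f"raster-only game, RT share {share:.0%}: "
          f"{frame_time(share, rt_work=0.0):.2f}")
```

In this toy model the hybrid game's frame time improves as the RT share grows, while the raster-only game's frame time only gets worse, which is exactly the regression described above.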
 
Spent many, many hours in the late '80s watching my Amiga of the day (an 020, then upgraded to an 030) grind away ray tracing a single image. The holy grail of gaming was to ray trace a game at a playable frame rate. At those Amiga resolutions (SD PAL), extrapolating from the 1440p benchmarks, we are there today! Obviously, our expectations have shifted to 1440p or 4K with HDR (and ultrawide for me), and there's a need to add a game engine on top of the ray tracing, but it seems that after waiting decades we are within a generation or two of GPUs being able to ray trace in real time, at modern refresh rates (100+ fps), at consumer-level price points.

Bring it on!!!!
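As a rough sanity check on that extrapolation (the 1440p number below is an assumed placeholder, not a quoted benchmark result):

```python
# Back-of-envelope version of the extrapolation above. The 1440p frame
# rate is a hypothetical placeholder, not a measured result.

pal_pixels = 720 * 576    # SD PAL: ~0.41 megapixels
qhd_pixels = 2560 * 1440  # 1440p: ~3.7 megapixels

rt_fps_at_1440p = 12.0    # assumed fully ray traced result (hypothetical)

# Primary-ray cost scales roughly with pixel count, so about 8.9x fewer
# pixels at PAL means roughly 8.9x the frame rate, all else being equal.
scale = qhd_pixels / pal_pixels
print(f"~{rt_fps_at_1440p * scale:.0f} fps at SD PAL resolution")
```

That works out to around 100 fps at PAL resolution under those assumptions, so the "we are there today" claim holds up, at least for primary rays.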
 
We are already there. With my 3080 I can have RT on at 1440p and hit 100+ FPS. Even at 4K it stays over 60 in the games I tested.
 
Sorry, I should have been clearer. I meant pure ray tracing, not the hybrid rendering used by the current crop of games.
 
Right now it's akin to the early days of 3D engines and the early days of VR. Starting requirements are low as hardware catches up, but there's a logarithmic-like curve to the complexity, growing quickly in the beginning.

We used to think a Voodoo 2 would keep us ready for years after Quake. We thought an R9 290 that maxed out the Steam VR test would suffice long term.

Today's cards will become outdated quickly. But if you choose one, go for as much power as you can. I consider the 3080 the minimum for a couple of years (maybe three) of RT use. After that you'll be looking at rasterized gaming. (Generally speaking, across a broad sample of games.)

I don't like saying that, as I prefer AMD, but if I'm going to spend this kind of scratch, I don't want to look at my card in two years and realize I can't do RT at reasonable rates even at low resolutions. The games are already there to push the best systems to their limits: Control, Cyberpunk 2077, and Watch Dogs: Legion.
 
GPU designers could also allocate more silicon to ray tracing hardware to improve performance there, but that would come at the cost of shaders, assuming the same transistor count. And that creates a problem: if the balance tips too far towards ray tracing, rasterization-only performance will start to regress.
Or more likely, rasterized performance would just stall, not improving much from one generation to the next. A bit like Nvidia's 20-series at launch, for example: only minor improvements in standard rendering performance over the 10-series at any given price point, but with some amount of ray tracing hardware added. For that generation, the tradeoff did not feel particularly worthwhile, at least at launch, since there were no games utilizing the feature, and performance was questionable for the future games that eventually would. But if they were to do something similar for the 40- or 50-series, it might be received better, assuming ray tracing becomes a lot more common in games over the next couple of years. If most new big games support RT effects by that time, and improved RT hardware were to boost frame rates by 50% or more with those effects enabled, that might be viewed in a more positive light, even if performance in games not utilizing the effects doesn't improve much.

And while RT will probably become the new standard for "ultra" graphics settings relatively soon, I suspect traditional rendering will still be supported for a number of years to come. Game publishers are not going to want to limit their sales to only those with higher-end RT hardware, and there's bound to be a large portion of systems without it for at least several years. And hybrid RT rendering will probably be the norm through at least this console generation. I suppose it's possible that PC releases could add the option for fully ray traced rendering some years down the line, for newer, higher-end cards that might manage playable performance with the feature enabled, but I have some doubts that it would be worth the performance hit over a hybrid approach, for what would likely be very small improvements to visuals.
 
We used to think a Voodoo 2 would keep us ready for years after Quake. We thought an R9 290 that maxed out the Steam VR test would suffice long term.
I remember telling my brother: "Hey, I need to buy a GeForce 256. It can offload work from the CPU, which means we won't have to upgrade our computers again for a long time to come." lol. What was I even thinking back then :tearsofjoy: Then there was GeForce 2, 3, 4. Once the GeForce 2 and 3 were out and I saw how much they enhanced gaming performance, I realised how wrong I was.