cryoburner
Judicious
I must say, this article was kind of terrible. What's with that benchmark run? A relatively simple environment with zero enemies and the only action consisted of shooting at one set of exploding barrels? I know you're looking for something that's repeatable, but how is that representative of the game at all? This piece seemed to be formulated to make the "RTX ON" frame rates look artificially higher than they really are. In reality, even the $1300+ 2080 Ti can struggle to stay above 60fps at 1080p in this game with RTX enabled.
And why are there no comparison shots showing those effects set to different levels, along with off? And maybe have some non-RTX-enabled cards thrown into the mix, so that people can see how enabling RTX, even on low, brings these 20-series cards down to the performance level of other cards costing a fraction of the amount, with the $800 RTX 2080 performing like a $200 graphics card. Other sites have done these RTX comparisons better, and this article seems to be downplaying the fact that RTX performance is abysmal. It clearly looks to me like the goal of this piece was to specifically search for a scenario where you could give the impression that RTX effects could average above 60fps on all of these cards.
For anyone wondering how RTX effects might perform during actual gameplay, or how the settings compare visually, I thought the Hardware Unboxed video did a relatively decent job comparing them...
https://www.youtube.com/watch?v=SpZmH0_1gWQ
These kinds of effects might be more at home in something like a slow-paced adventure game, where maintaining high performance isn't quite as important. Of course, those kinds of games are probably not system sellers for high-end graphics hardware like this. Perhaps with 7nm cards Nvidia will be able to include enough RT cores at lower price points to make these effects worth considering, but on current hardware they absolutely ruin performance while significantly driving up the prices of these cards.
Giroro:
So what's causing the drop in frame rates when ray tracing is enabled, since all that should be doing is turning on extra dedicated hardware?
It seems that the ray tracing hardware is still weak and out of balance compared to the powerful GPU, but if ray tracing is the bottleneck, then I wouldn't expect a higher resolution to cause as much of a drop, because the number of rays calculated should be constant?
Why would the number of rays be constant? To maintain sharp reflections at higher resolution, you would need more rays; otherwise you would be stretching the same reflections out over a larger area. Higher resolutions appear to show a similar, or perhaps slightly larger, reduction in performance, as DXR Low will still cut frame rates to around half, and DXR High will cut frame rates to around one-third of what they would be without the effect enabled. So, if a particular card were averaging around 60fps at 4K without the effects, you could expect it to average around 30fps with them set to low, and 20fps at high.
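To put rough numbers on the pixel-count argument, here's a quick back-of-the-envelope sketch in Python. It assumes roughly one reflection ray per pixel (an illustrative assumption, not the game's actual ray budget) and uses the approximate 1/2 and 1/3 frame-rate multipliers described above for DXR Low and High:

```python
# Back-of-the-envelope sketch: assume ray count scales with pixel count
# (roughly one reflection ray per pixel, an illustrative assumption,
# not the game's actual ray budget).

RESOLUTIONS = {"1080p": (1920, 1080), "4K": (3840, 2160)}

# Approximate frame-rate multipliers observed with DXR enabled.
DXR_SCALING = {"off": 1.0, "low": 1 / 2, "high": 1 / 3}

def rays_per_frame(width, height, rays_per_pixel=1):
    """Rays traced each frame if every pixel gets the same ray count."""
    return width * height * rays_per_pixel

def projected_fps(baseline_fps, setting):
    """Rough frame-rate estimate from the observed DXR scaling factors."""
    return baseline_fps * DXR_SCALING[setting]

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {rays_per_frame(w, h):,} rays per frame")
# 1080p: 2,073,600 rays per frame
# 4K:    8,294,400 rays per frame (4x the rays, not a constant count)

baseline = 60  # a card averaging 60fps at 4K with DXR off
for setting in ("low", "high"):
    print(f"DXR {setting}: ~{projected_fps(baseline, setting):.0f}fps")
# DXR low: ~30fps, DXR high: ~20fps
```

So 4K needs four times the rays of 1080p just to keep per-pixel reflection detail the same, which is why the performance hit doesn't shrink at higher resolutions.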