Here's How Battlefield V Performs with Ray Tracing

I must say, this article was kind of terrible. What's with that benchmark run? A relatively simple environment with zero enemies, where the only action consists of shooting at one set of exploding barrels? I know you're looking for something that's repeatable, but how is that representative of the game at all? This piece seemed formulated to make the "RTX ON" frame rates look artificially higher than they really are. In reality, even the $1300+ 2080 Ti can struggle to stay above 60fps at 1080p in this game with RTX enabled.

And why are there no comparison shots showing those effects set to different levels, along with off? And maybe have some non RTX-enabled cards thrown into the mix, so that people can see how enabling RTX, even on low, brings these 20-series cards down to the performance level of other cards costing a fraction of the amount, with the $800 RTX 2080 performing like a $200 graphics card. Other sites have done these RTX comparisons better, and this article seems to be downplaying the fact that RTX performance is abysmal. It clearly looks to me like the goal of this piece was to specifically search for a scenario where you could give the impression that RTX effects could average above 60fps on all of these cards.

For anyone wondering how RTX effects might perform during actual gameplay, or how the settings compare visually, I thought the Hardware Unboxed video did a relatively decent job comparing them...

https://www.youtube.com/watch?v=SpZmH0_1gWQ

These kinds of effects might be more at home in something like a slow-paced adventure game, where maintaining high performance isn't quite as important. Of course, those kinds of games are probably not system sellers for high-end graphics hardware like this. Perhaps with 7nm cards they will be able to include enough RT cores at lower price points to make these effects worth considering, but on this current hardware, they absolutely ruin performance, while significantly driving up the prices of these cards.


Why would the number of rays be constant? To maintain sharp reflections at higher resolution, you would need more rays, otherwise it would be stretching the same reflections out over a larger area. Higher resolutions appear to show a similar, or perhaps slightly larger reduction in performance, as DXR Low will still cut frame rates to around half, and DXR High will cut frame rates to around one-third of what they would be without the effect enabled. So, if a particular card were averaging around 60fps at 4K without the effects, you could expect it to average around 30fps with them set to low, and 20fps at high.
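As a rough sanity check of that arithmetic, here's a minimal Python sketch applying those ratios. The multipliers (about 1/2 for DXR Low, about 1/3 for DXR High) are the estimates from this thread, not measured data:

```python
# Rough rule-of-thumb multipliers taken from the estimate above:
# DXR Low ~ half the non-DXR frame rate, DXR High ~ one-third.
DXR_SCALING = {"off": 1.0, "low": 1 / 2, "high": 1 / 3}

def estimated_fps(base_fps: float, dxr_setting: str = "off") -> float:
    """Estimate the average FPS with DXR enabled, given the FPS without it."""
    return base_fps * DXR_SCALING[dxr_setting]

if __name__ == "__main__":
    for setting in ("off", "low", "high"):
        print(f"60 fps baseline, DXR {setting}: ~{estimated_fps(60, setting):.0f} fps")
    # Prints roughly 60, 30, and 20 fps, matching the 4K example above.
```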
 
They're not all using the same test run and settings. So their benches aren't wrong, but they're a slightly less strenuous test case. Even so, they show what we all predicted: RTX reflections BUTCHER performance. In no way do I feel they are casting these cards in a good light with regard to RTX.

Why do people think DLSS is some miracle? It's faster than other AA methods, yes. It's definitely not faster than no AA, and it won't help their RTX performance.
 
The recent statement from AMD indicates that while AMD will make ray tracing available in their drivers, they don't believe the feature will gain popularity until a large enough proportion of GPUs can run it comfortably. I think that would mean mainstream GPUs at around $200 - so maybe 2-3 years in the future?

"Popularity" here means two things:

    ■ adoption by game developers: it needs to be an essential feature for a game to appear current, not just some extra screen candy for the richest 1% of the player base or something to pull good publicity stills from;
    ■ players actually wanting to use it in-game: it doesn't make sense for devs to spend time and resources on a feature that only a tiny minority will use. Lack of polish may also mean some of the early implementations will be buggy.


These both tie into the performance/visual-quality compromise: for ray tracing to become commonplace, you need to be able to run it with little FPS compromise on high-end hardware AND with a reasonable compromise (>50 FPS in single player) on mainstream equipment.
 
The 20 series makes no sense - a huge price premium over the previous gen so you can do what, play at much lower resolution/settings to get a few better reflections you likely won't even notice? I really wish they had made GTX 20xx cards at their standard price/perf points and then offered something like an RTX Titan for folks made of money who want to be the front-runners on a new tech. Bring RTX to everyone else the next gen, when it would be more established and could (I would hope) be implemented without a drastic price increase like we see here. How much of the pricing is due to AMD's utter lack of competition, I don't know, but it's likely a large factor.
 


In general, that's true, but NOT applicable to my example of going from native 4k to 1k. That's because 4k has literally four times the pixel count of 1080p. That means every pixel of a 1080p image perfectly fills a square of four pixels on a native 4k monitor. That's what I mean by "evenly divisible": no interpolation in scaling is required. Again, it's a mathematically perfect 1:4 ratio.

In fact, I'll go so far as to say that running 1080p on a native 4k monitor is better than on a native 1080p monitor! That's because the dot pitch (pixel pitch) is much finer on the 4k monitor.
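To illustrate that 1:4 mapping, here's a small sketch assuming plain integer (nearest-neighbor) scaling; the tiny arrays stand in for 1920x1080 and 3840x2160 frames:

```python
import numpy as np

def integer_upscale_2x(image: np.ndarray) -> np.ndarray:
    """Duplicate each source pixel into a 2x2 block (nearest-neighbor at exactly 2x).

    Because 3840x2160 is exactly twice 1920x1080 in each dimension, every
    1080p pixel maps onto a clean 2x2 square of 4K pixels - no interpolation.
    """
    return np.repeat(np.repeat(image, 2, axis=0), 2, axis=1)

# Toy example: a 2x2 "1080p" image becomes a 4x4 "4K" image,
# with each source value simply appearing four times.
src = np.array([[1, 2],
                [3, 4]])
print(integer_upscale_2x(src))
```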
 
Buying into the RTX series is equivalent to writing a check to Tim Cook or Jeff Bezos for 40% of your graphics card purchase - $200 to $550, depending on how gullible you are. You'll get about the same return on investment.

I picked up a Vega 56 for $330; it does not perform 66% as well as an RTX 2070. It meets my needs for most 4k gaming, undervolted, overclocked, and with reasonable and barely noticeable graphics settings adjustments.

You cannot argue that the Vega series cards are not worth a look due to the above compromises when buying into RTX means you are compromising too: 1) currently all but one game lack support for the features you are paying for, and in the near future maybe 95% still won't have them; 2) Tom's Hardware had to tune down graphics settings to make these features playable; 3) you are compromising pixel density and the actual trends of gaming (2k, 4k) in order to support a feature that is simply not utilized and cannot be supported by current RTX hardware.
 


Nvidia needed to bring RTX to lower performance tiers too; otherwise the market penetration would be so low that the tech would never be adopted by any game devs.



I'm fully aware of the mathematics in 3840x2160 versus 1920x1080. Problem is, real world doesn't work like ideal models.

You're assuming the scaling algorithm just copies the content of the closest pixel (a technique called "nearest neighbor"), which it doesn't. It averages pixels from a slightly wider - probably non-rectangular - area, to make sure that in, say, a 1.5x enlargement, you don't end up with every other pixel doubled and the rest copied as-is. (There's probably some weighting, so source pixels closer to the target pixel's center affect the result more.)

(Your use of "1K" is confusing: "4K" refers to the rounded (not truncated) horizontal resolution (3840 to 4096, so roughly four thousand), so the closest equivalent to "1K" would be 1024x576.)
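For anyone curious what that weighted averaging looks like, here's a minimal bilinear-interpolation sketch. Real monitor/GPU scalers use their own (often fancier) filters, so this is purely illustrative of why non-integer factors need interpolation at all:

```python
import numpy as np

def bilinear_upscale(image: np.ndarray, factor: float) -> np.ndarray:
    """Upscale a grayscale image by `factor` using bilinear interpolation.

    Each output pixel is a weighted average of the four nearest source
    pixels; at a non-integer factor (e.g. 1.5x) most output pixels land
    between source pixels, which is why some interpolation is unavoidable.
    """
    src_h, src_w = image.shape
    dst_h, dst_w = int(src_h * factor), int(src_w * factor)
    out = np.empty((dst_h, dst_w), dtype=float)
    for y in range(dst_h):
        for x in range(dst_w):
            sy, sx = y / factor, x / factor          # position in source coords
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, src_h - 1), min(x0 + 1, src_w - 1)
            wy, wx = sy - y0, sx - x0                # distance-based weights
            out[y, x] = (image[y0, x0] * (1 - wy) * (1 - wx)
                         + image[y0, x1] * (1 - wy) * wx
                         + image[y1, x0] * wy * (1 - wx)
                         + image[y1, x1] * wy * wx)
    return out

# 1.5x enlargement of a tiny gradient: note the new in-between values.
print(bilinear_upscale(np.array([[0.0, 1.0], [2.0, 3.0]]), 1.5))
```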
 
*JONSTEWART860: What is going on with this website? Videos aren't playing smoothly and the whole website is lagging??*


They probably RayTraced the video and your hardware can't keep up.
Just buy it!! :)
 

The problem with your post is that we're talking about high-end cards while you're citing mainstream usage.

I have news for you: 1080p users are not buying any RTX card. And people who can afford to blow >$1k on a GPU are more likely to have larger & higher-res monitors. A lot of them might even be upgrading their GPU specifically to let them run higher resolutions.


I have to agree, here. Running 2560x1440 @ 27" is a good fit for me. I use small fonts, which would be unreadable at 4k on much less than 32". Since the main use case for me going to 4k would be to increase screen real estate, I wouldn't buy anything less than 30" or 32".



Because you're looking in the wrong segment. They don't put these in their professional monitors, just as there are professional features they leave out of their gaming monitors. They do make FreeSync and G-Sync monitors, however.

I think our best hope is to see VESA adaptive sync starting to appear in the professional segment. Hopefully, this is becoming a standard feature of the chipsets around which monitors are built.


Again, you're missing a couple of things. First, there are doubtlessly people who want to know how well games are running at 4k, since they might be contemplating an upgrade. So, the mere fact of them not currently having 4k monitors doesn't mean a lack of interest.

The second is just a hunch that the volume of software purchases by high-end hardware owners might be greater than at the lower end. So, if the idea is to cater to where the spending is happening, you would naturally see a bias towards the higher end.


*sigh*

How does that even make sense?
 

Actually, it is faster than natively rendering the target resolution with no AA, as shown here:

https://www.tomshardware.com/reviews/dlss-upscaling-nvidia-rtx,5870.html

Now, some will argue that it's cheating because DLSS is actually up-scaling, but I think that ties in with @maddogcollins' point - it should be possible to natively render w/ ray tracing at 1080p and use DLSS to upscale to 1440p with minimal further impact on frame rates.
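The pixel-count arithmetic behind that idea (a back-of-the-envelope sketch; the exact internal render resolution DLSS uses per quality mode isn't something this thread establishes):

```python
# Pixel counts for the resolutions discussed in this thread.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels(name: str) -> int:
    width, height = RESOLUTIONS[name]
    return width * height

# Rendering (and ray tracing) at 1080p, then upscaling to 1440p, means
# shading only ~56% of the pixels a native 1440p render would need.
print(f"1080p / 1440p: {pixels('1080p') / pixels('1440p'):.2f}")  # ~0.56
# 4K is a full 4x the pixels of 1080p, which is why 4K + DXR stays out of reach.
print(f"4K / 1080p:    {pixels('4K') / pixels('1080p'):.0f}x")    # 4x
```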
 
Hear, hear. I think AMD now needs to step up to the plate and deliver some well-overdue competition, just like they have with Threadripper. Please, AMD, get off your backsides.
 
The developer's recommendation is to use the "Low" DXR setting, which will push the frame rate a bit in the right direction.
... and the frame rate is obviously much better than it would be with traditional ray tracing...
 
I think that MADDOGCOLLINS is correct about DLSS. DLSS is needed to make ray tracing viable. Jen-Hsun Huang states that today's silicon would be orders of magnitude too slow to do real-time ray tracing; DLSS is what makes ray tracing viable. DLSS reduces the number of rays you need to calculate by orders of magnitude, since it is so good at interpolating low resolution to high resolution. AFAIK, Battlefield V implements ray tracing but not DLSS, so it is not surprising that it is slow. I think we will be blown away by what RTX boards can do once developers figure out how to fully exploit their capabilities.
 

Yes and no.

Writ large, deep learning is a huge asset to ray tracing. Particularly for global illumination, it does an excellent job of interpolating lighting maps. However, AFAIK, Battlefield V is not even using global illumination.

With regard to DLSS specifically, I would not say it substitutes "orders of magnitude" of rendered pixels. So far, they seem to use it to upsample from about half the number of non-antialiased pixels. So, if you can get decent 1080p framerates out of an RTX 2080 Ti without DLSS, then maybe DLSS will deliver similar results at 1440p. Still not 4k. For that, probably sit tight and wait for the RTX 3000 series.

That said, I'm skeptical how well even DLSS can antialias reflections and refractions. At best, it might just blur the reflections a bit to make aliasing less noticeable.
 


 
We are still far away from it being used anywhere near its potential. Tomb Raider is using ray tracing for shadows/illumination. I forget how many different aspects ray tracing can be applied to, but if you used them all at the same time there's no card that could handle it. It does make me extremely happy that they are working on new graphics and game enhancements that make the actual picture more lifelike, rather than just making power-hog cards that can do 4K resolution.
 
We need better tessellation and environment textures.

Raytracing is like icing on a cake, the finishing touch. Right now we don't have the technology to implement ray-tracing without taking massive performance hits.

The performance hit versus the gain in photo-realism just isn't good enough with ray tracing yet. There are many other parameters that will increase image quality.
 
At DXR launch: 1-25 fps (~20 fps) with DXR Low, 4K Ultra, TAA Low, on an RTX 2080.
After the DXR "Tides of War" patch: 1-60 fps (~40 fps), still with constant hitching and complete hangs during big explosions. Definitely not worth it.