Here's How Battlefield V Performs with Ray Tracing

The framerates are really a massive buzzkill considering we're talking about medium reflection settings, with very few reflections, and at 1080p.

The way I see it, it's not even something to consider on setups that people actually use right now. It's unrealistic to expect people to buy a 2080 Ti for a 1080p monitor just to run a scarce amount of medium-quality RTX reflections at a stable 60fps in a few select games (because even the regular 2080 is too weak to accomplish that!).

Since some people are already switching to 4K monitors, it looks like RTX is clearly not going to be playable for them for a very long time. Makes me wonder what adoption of the technology is going to look like on the developers' side, knowing it would require significantly more GPU firepower to actually be usable.
 
So what's causing the drop in framerates when ray tracing is enabled, when all that should be doing is turning on extra dedicated hardware?

It seems that the ray tracing hardware is still weak and out of balance compared to the powerful GPU, but if ray tracing is the bottleneck, then I wouldn't expect a higher resolution to cause as much of a drop, because the number of rays calculated should be constant?

I don't know, between the ill-performing ray tracing and the useless AI hardware... it's almost like the RTX series is a chopped-down enterprise GPU that was designed to live inside a render farm, not a gaming computer.
 
In another site's test, those RTX cards really tumbled when there were a lot of reflective surfaces, so the average frame rate tells you very little. When there are few or almost no reflections, the game is just slow, but playable. When there are a lot of reflections, it's really bad!
Yep... these cards are not fast enough for ray tracing yet... well, the 2080 Ti is good for 60fps in almost all situations at 1080p, but that's no good for first-person shooters...
Ray tracing is fine for adventure games and turn-based strategy games at the moment, but the next generation may actually be useful in a wider range of gaming situations! These cards also let programmers experiment with ray tracing, so it will be better in the next generation!
 


I prefer gaming at 1K (1080p) anyway, since I'd rather have a higher FPS (a minimum of 60) even if that means sacrificing a higher resolution. The good news is, you can switch a 4K monitor to 1080p without any down-scaling artifacts, as the pixel count divides evenly. So no "fuzzy" interpolation effects should occur. Now, whether or not enabling AA will offset the gains of staying at 1K... well... I suppose that's situational to the game you're playing. Meaning, 4K without AA might be preferred. Maybe.
 
Even though the performance cost is high, I'm still surprised the 2070 did as well as it did. I was sure it would be a total dud. If they include RT cores on the 2060, though, that's going to be a joke, looking at these numbers with only medium settings.
 


That's not how hardware scalers work. Instead of having special cases for doubling pixels, they always apply the same interpolating scaling algorithm. So regardless of whether the scaling factor is an integer, there will always be blurriness.
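To illustrate the point, here's a minimal sketch (my own example, assuming Python with Pillow installed) showing that an interpolating scaler softens the image even at an exact 2x factor, while pure pixel doubling does not:

```python
from PIL import Image

# Worst case for an interpolating scaler: a 2x2 black/white checkerboard.
src = Image.new("L", (2, 2))
src.putdata([0, 255, 255, 0])

# Exact pixel doubling (nearest-neighbour): the output stays pure black/white.
doubled = src.resize((4, 4), Image.NEAREST)
# The interpolating path a typical scaler always uses: greys appear even at 2x.
filtered = src.resize((4, 4), Image.BILINEAR)

print(list(doubled.getdata()))   # only 0s and 255s -> stays sharp
print(list(filtered.getdata()))  # intermediate values -> visible softness
```

Whether a given monitor or GPU actually exposes a true integer/nearest-neighbour mode varies by model and driver, which is really what this disagreement comes down to.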
 
While I'm looking forward to ray tracing in games, this is just one company's attempt at it, and in the end, none of the reflections in the meager video Tom's shared had any real definition or clarity to them, other than the obvious changes in coloration on the reflective surfaces. Care to show off some vehicle mirrors or glass reflections so we can get a better idea of whether this stuff is going to look gimmicky? I already don't like how the ground looks so artificially reflective. It reminds me of when pixel shaders were new and devs gave everything a plastic sheen just because they could.
 
Just seems a waste to spend that much for something barely usable.

Buy something cheaper like a 1080 Ti, which will give great frame rates sans RT.

It seems Nvidia tried to create an awe factor that turns out to be impractical.
 
Huh? Even Oblivion and Fallout 3 can do dynamic reflections on water and specular lighting on other surfaces (like snow). The ability to make a flat surface (like a mirror) do a realistic reflection has long existed too.

What ray tracing can do is make one surface reflect the reflections of other surfaces, and have its lighting influenced by the light bouncing off other surfaces. Was that shown in the video? The big orange/yellow explosion reflections on water were possible 12 years ago. I would like to know what in that game is only possible with ray tracing.

In other words, if current games were actually written to max out the current abilities of DirectX and the fastest video cards on PCs (instead of being console ports), we would get a lot more than we get now.
 
Everyone should note that, in addition to the test setting DXR reflections to Medium, these subpar results come after they had already toned their ray tracing efforts down.

https://www.tomshardware.com/news/battlefield-v-ray-tracing,37732.html

So uh, yeah. Maybe next gen will have enough RT muscle? Double the size of the RT blocks? Anyway, I'd like to see how it runs with DXR reflections set higher, even if it's a bit cringey to look at the graphs.
 
Oh look. Almost 2019 and we’re benchmarking a $1200 video card at 1080p. Don’t forget, folks. Just Buy It!

I've bought into every single GeForce generation since the GTX 590. Always in SLI, too. But this new card from Nvidia is just garbage. Two years since the last card they released, for a wimpy 25-30% performance boost. And a ray tracing feature that's unusable at any resolution but 1080p.

Absolute garbage. You need at least 30-40 gigarays for 4k at 60Hz or 1440p at 120-144Hz. Nvidia can shove it with this crap.
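For context, the arithmetic behind a claim like that is roughly pixels × frames per second × rays per pixel. Here's a quick sketch of that math (the per-pixel ray budgets are my own assumptions for illustration, not anything NVIDIA has published):

```python
# Back-of-the-envelope ray-throughput estimate in gigarays per second.
# The rays-per-pixel budgets below are assumptions, not official figures.
def gigarays_per_second(width, height, fps, rays_per_pixel):
    return width * height * fps * rays_per_pixel / 1e9

# A modest hybrid-rendering budget of ~4 rays per pixel:
print(gigarays_per_second(3840, 2160, 60, 4))    # ~2.0 Grays/s
print(gigarays_per_second(2560, 1440, 144, 4))   # ~2.1 Grays/s

# Heavier budgets (more effects, more bounces per pixel) climb fast:
print(gigarays_per_second(3840, 2160, 60, 40))   # ~19.9 Grays/s
```

How many rays per pixel you actually need for convincing reflections, shadows, and GI is exactly the debatable part, which is why estimates of "enough" RT hardware vary so much.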
 
We'll get it next time, just like 2K, just like 4K, just like 1080p, just like Crysis. If you have a card that can almost do something, let it be known; "ARTX" feels more accurate. It should be functional and working at 1080p on a mainstream card in two more generations, and fully working at 4K in the generation after that. Stick with your 9xx and 10xx, folks.
 
As I thought, ray tracing is not feasible for gaming yet; who buys a 2080 Ti to game at 1080p 60fps? It is a mere gimmick or toy at this stage; maybe in a few more GPU generations it will become practical.

The problem for Nvidia is that if people start realising this, then the RTX pricing looks horrific. Paying through the nose for a toy that's not really usable.
 
There's no issue with Nvidia advancing the feature set with hardware RT, but using a clearly underpowered implementation solely to justify extortionate price increases for the new series won't be lost on us for a very long time. They will have known this from their early prototypes - results like these will inform scaling up or further design changes.

With the festive period approaching, they had the option to properly and permanently reduce 10xx series prices to speed up stock clearance, and then introduce this new series - but at a price that won't backfire on them. They had plenty of time AND FUNDS to wait until the old stock cleared to an acceptable level (or to set acceptable pricing on the new series). AMD and Intel still have some work to do to compete fully.

Unfortunately, and unbelievably, Nvidia chose poorly. The terms 'cash-grab' and 'raising the future price bar' spring strongly to mind. Let's hope AMD can make some epyc changes here too.
 


 

Are you sure it won't just pixel-double, if you configure the display @ 1080p in Windows' settings?
 
Not sure if Battlefield will get DLSS in the future, but if that can be used with ray tracing, it could help. DLSS sounds like it should be more talked about with the new cards than ray tracing. Ray tracing had better be easy for developers to implement into games; few game buyers are going to buy games because of ray tracing when they don't own RTX cards. $1,250 for a card at 1080p? Please, AMD, get off your butts.
 
@simon7 ROFL @ people switching to 4K... Stop, you're killing me. What are you, Ryan Smith? He's been claiming this crap forever, first with 1440p claims clear back to the 660 Ti. It still isn't the norm today. Never mind people claiming 4K... ROFLMAO.

I just bought a 1070 Ti not long ago for 1200p, and my next monitor will be 1600p if I have anything to say about it. The 1070 Ti still can't manage 1080p in EVERYTHING if maxed out. If I can't get 1600p, it will be 1440p if forced. I like my games above 60fps (MINS, not avg) and MAXED like the devs designed their games. Turning stuff down is dumb, or you're poor 😉 I'm not alone; look at what Tom's Hardware just said about BF5 and ray tracing: turn it off for better gaming, at least in this game. Yeah, I like my FPS too.

Until they start selling 32in 4K monitors that look great (and with G-Sync), I'm out. As you age, that puny print sucks, and I don't have all day to mess around trying to get things to look right on a 27in or less at 4K. A 27-30in at 1600p, or worst case 1440p, is just fine for these eyes. I buy QUALITY, though, so again 4K is not in my range with the features I want in a monitor anyway. The 1600p monitor I'd buy from Dell is over a grand and it DOES NOT have FreeSync or G-Sync (UP3017). Hopefully Dell wakes up one day and adds FreeSync or G-Sync (I'm in for a 7nm card anyway, so I can go either way next year for Xmas). Until then, 1200p looks great to me, and I've seen 4K monitors and was left unimpressed if not 30in or bigger (30in is pushing it for me already at 4K; I'd want bigger). I don't have space on my desk for 2x32 anyway, so again, I'm stuck with 2x30 or 27-inchers. I can't work with one screen. I'd jump off a building... LOL.

Monitor makers might be trying to push 4K, AMD/NV might too, and so might review sites, but not the people reading the sites... LOL. Check the Steam survey. All this WIDE crap is useless to me also. I read the web too much; I need TALLER, not wider... LOL. Wider is only better for spreadsheets at work, otherwise... blah.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
64% are running 1080p. Now add up all the resolutions below that... LOL @ 4K. Heck, throw in my 1200p also and I laugh at the 1440p+ group. You wouldn't even be right if you said 1440p was adopted massively... ROFL. 4K shows 1.3% and 1440p shows 3.5%... ROFLMAO. It's a waste of time to even benchmark this stuff for every review. 90% of us would rather see you test twice as many games at 1080p, because that is where most people play (or below... LOL) 😉 Not saying it shouldn't be tested at all, but only at a new GPU tech launch or something, not constantly. More games tested = more knowledge for buyers. Testing 4K for 1.3% of the audience is STUPID. They just proved 1080p is barely workable for the 2080 Ti with the new RTX tech... LOL. Whatever. I laugh every time I see Tom's/AnandTech claim people don't buy top-of-the-line cards for 1080p... Uh, yeah, they do. There are FAR more buyers of GTX 1080 and up cards running resolutions that are NOT the 1.3% 4K or 3.5% 1440p. 72% of the multi-monitor setups are not even 4K... ROFL, they're 3840x1080. Again, 4K is still a joke, and so is 1440p for that matter. Should you even bother testing 1440p in every video card review when it's only 3.52% of the readers? I digress... FAKE NEWS.
 
Oh, and I don't even have a Steam account (many people on GOG refuse, just like me), so we don't even show up in the surveys. We aren't running 4K either, but many of us have top-end rigs (we just hate DRM)... :)
 


If I take a 4K monitor and set my resolution in Windows to 1080p, the GPU would perform no better or worse than if it were connected to a monitor with a native 1080p pixel count. As such, having a 4K monitor doesn't restrict you to gaming at 4K; just change the resolution.
 


The resolution you set in Windows or in your graphics card's settings application sets the resolution of the outgoing signal. That is, if you set 1920x1080 in Windows, a 1920x1080 signal is sent to the monitor. The monitor then scales the image to fit the screen - that's why I mentioned "hardware" scalers.

The other option is to use GPU scaling, but you have to manually enable it in the driver.



Why bring up performance? I was talking about the scaling quality; if you use anything but the native resolution, the monitor's (or GPU's) scaling will cause softness in the image. Certain scaling factors may have less of it, but you can't get rid of it.