Battlefield 2042 PC Benchmarks, Performance, and Settings

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
I just saw HUB's CPU scaling and benchmark video for this game, and GPU power is not the biggest issue here.

The number of tests needed to get the data for all the CPUs in 128-player matches was incredible, but this game is both crazy heavy on the CPU and in need of a lot more optimization.

I love how the 0.1% lows show the 5600X matching the new "sweetheart" 12600K and coming in only 6 fps behind the "new king" (lmao) 12900K. And that's with an RTX 3090.

Does it really matter if the averages differ by 15 fps when the 0.1% lows are the same?



Link to the full HUB video.
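For anyone wondering how those 0.1% lows are even computed: they come from the slowest frames in a run, not from averaging fps. Here's a minimal sketch of one common approach, assuming a per-frame frametime log (e.g., from OCAT or PresentMon); exact methodology varies between reviewers:

```python
# Sketch: average fps and "0.1% low" fps from a list of frame times (ms).
# This version averages the slowest 0.1% of frames; reviewers differ on details.
def fps_metrics(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    k = max(1, n // 1000)                         # slowest 0.1% of frames
    low_01_fps = 1000.0 * k / sum(worst[:k])
    return avg_fps, low_01_fps

# Example: 10,000 frames at ~8 ms with ten 30 ms hitches.
times = [8.0] * 9990 + [30.0] * 10
print(fps_metrics(times))  # high average (~125 fps), but 0.1% low of ~33 fps
```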
 
Reactions: Makaveli

larkspur

Distinguished
I just saw HUB's CPU scaling and benchmark video for this game, and GPU power is not the biggest issue here.

The number of tests needed to get the data for all the CPUs in 128-player matches was incredible, but this game is both crazy heavy on the CPU and in need of a lot more optimization.

I love how the 0.1% lows show the 5600X matching the new "sweetheart" 12600K and coming in only 6 fps behind the "new king" (lmao) 12900K. And that's with an RTX 3090.
I think that when you get to the top end of that graph, the 3090 becomes the limiting component. There isn't much difference between the three Alder Lake and five Zen 3 chips there, but (despite similar core counts) the older architectures seem to be performing significantly worse - indicating some heavy CPU dependency. It kind of tells us what we already know - a 6+ core Alder Lake or 6-core Zen 3 chip is perfectly adequate for gaming.
 
Reactions: VforV

vinay2070

Distinguished
Nov 27, 2011
255
58
18,870
I think that when you get to the top end of that graph, the 3090 becomes the limiting component. There isn't much difference between the three Alder Lake and five Zen 3 chips there, but (despite similar core counts) the older architectures seem to be performing significantly worse - indicating some heavy CPU dependency. It kind of tells us what we already know - a 6+ core Alder Lake or 6-core Zen 3 chip is perfectly adequate for gaming.
It would be interesting to see how adequate a 6-core would be once newer console games built around 8-core AMD CPUs start migrating to PC, with RDNA3/Ada Lovelace-class GPUs running them (assuming they are as fast as the leakers claim and become less of a GPU bottleneck). For now, they are perfectly adequate. I am tempted to upgrade my 3700X to a 5600X for the small price difference, but I will hold off and see how good the 6600X/6800X turn out to be.
 
Meh! Some aspects of the game are visually good, but the game itself is not enjoyable to play. My 3060 Ti plays it roughly in line with the charts, but that only tells one side of the story for this game. There are major issues that need work, like massive server lag (which still exists after the update) and the horrible 'bloom' effect when shooting. The rubber banding is still going on too.
It also seems to be poorly optimized, as my 3060 Ti never goes above 70-75% usage on Ultra! Maybe it will get better with a few more updates, but for now I enjoy BF V more, despite the rampant cheating.
 

sstanic

Distinguished
Aug 6, 2016
66
28
18,560
It's a heavily CPU-bottlenecked game, it would seem. I'd really like to see some testing and optimization in that sense. For example, the Ryzen 3700X and 3600X are very common (I have a 3800X for now), and it would be very interesting to see how different settings help the situation. Is faster RAM - lower timings in particular on Ryzen - the bottleneck? Does the higher power budget of the 105W 3800X make a difference versus the 65W 3700X? Comparisons like these would be interesting to many, I feel.

In this review, I feel that omitting DLSS at 4K missed the point of DLSS, as it seems Nvidia is primarily optimizing it precisely for 4K. Also, by now many people have an LG OLED 4K TV with a 120Hz panel and 120Hz HDMI 2.1, and it seems a 42" model is coming soon too. Your comparisons at 4K were great to see, as were the various optimization and image quality comparisons. Nice review, many thanks for the hard work 🙂
 

Dantte

Distinguished
Jul 15, 2011
161
58
18,760
We tested Battlefield 2042 performance, including ray tracing and DLSS, and provide a detailed look at how the settings impact performance and visual quality. Here's our guide to tuning performance to get the most out of the game.

Battlefield 2042 PC Benchmarks, Performance, and Settings : Read more

You missed a big one!

By default BF2042 runs using DX11, and there is no in-game setting to enable DX12. Open the file "PROFSAVE_profile" in your Documents\Battlefield 2042\settings folder and change the value for GstRender.Dx12Enabled from a 0 to a 1. This will give you an additional 5% or better performance!
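If you'd rather script the edit (back up the file first), here's a minimal sketch, assuming the profile is plain text with the setting on its own line as "GstRender.Dx12Enabled 0" and your Documents folder is in the default location:

```python
# Sketch: flip GstRender.Dx12Enabled from 0 to 1 in PROFSAVE_profile.
# Assumes a plain-text file with the setting on its own line; path may vary.
from pathlib import Path

profile = Path.home() / "Documents" / "Battlefield 2042" / "settings" / "PROFSAVE_profile"
text = profile.read_text()
profile.with_suffix(".bak").write_text(text)  # keep a backup copy
profile.write_text(text.replace("GstRender.Dx12Enabled 0", "GstRender.Dx12Enabled 1"))
print("GstRender.Dx12Enabled set to 1")
```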
 
It's a heavily CPU-bottlenecked game, it would seem. I'd really like to see some testing and optimization in that sense. For example, the Ryzen 3700X and 3600X are very common (I have a 3800X for now), and it would be very interesting to see how different settings help the situation. Is faster RAM - lower timings in particular on Ryzen - the bottleneck? Does the higher power budget of the 105W 3800X make a difference versus the 65W 3700X? Comparisons like these would be interesting to many, I feel.

In this review, I feel that omitting DLSS at 4K missed the point of DLSS, as it seems Nvidia is primarily optimizing it precisely for 4K. Also, by now many people have an LG OLED 4K TV with a 120Hz panel and 120Hz HDMI 2.1, and it seems a 42" model is coming soon too. Your comparisons at 4K were great to see, as were the various optimization and image quality comparisons. Nice review, many thanks for the hard work 🙂
The 4K charts have DLSS Quality tested on the various RTX GPUs, and the point of the DLSS analysis section was to show scaling with the different modes. At 4K, the scaling might be a bit better than at 1440p, but it's not going to radically alter the charts. Basically, Quality mode on the 3060 boosted performance 24% at 4K, so Balanced and Performance modes will scale from there. It makes 4K @ >60 fps viable, but only if you have a card with more than 8GB VRAM.
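For context on why the modes scale from there: DLSS renders internally at a reduced resolution and upscales, so each step down in mode cuts the shaded pixel count. A quick worked example using the standard per-axis scale factors (roughly 0.667 for Quality, 0.58 for Balanced, 0.50 for Performance):

```python
# Internal render resolution for each DLSS mode at 4K (3840x2160),
# using Nvidia's standard per-axis scale factors.
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
w, h = 3840, 2160
for mode, s in modes.items():
    rw, rh = round(w * s), round(h * s)
    print(f"{mode}: {rw}x{rh} (~{rw * rh / (w * h):.0%} of native pixels)")
# Quality: 2561x1441 (~44%), Balanced: 2227x1253 (~34%), Performance: 1920x1080 (25%)
```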
 
Reactions: sstanic
You missed a big one!

By default BF2042 runs using DX11, and there is no in-game setting to enable DX12. Open the file "PROFSAVE_profile" in your Documents\Battlefield 2042\settings folder and change the value for GstRender.Dx12Enabled from a 0 to a 1. This will give you an additional 5% or better performance!
I can confirm that this information is definitely not correct when using current-generation RX 6000 and RTX 30-series graphics cards. I never bothered to set DX12 mode, because that was the default for all the GPUs I tested. Here's an RTX 3090 launching the game, for example, with the OCAT overlay running:

View attachment 108

With the exception of Crysis Remastered, there are no games with DirectX Raytracing support that use DX11 mode. Now, if you have a non-RTX and non-RX 6000 GPU, maybe the game will default to DX11. More likely, DX11 is only the default on Nvidia GTX GPUs, and there's a very good chance DX11 still runs better on Pascal and Maxwell GPUs. If you have an older AMD GCN GPU, though, and it defaults to DX11, you might get a minor benefit by switching to DX12.
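To be clear, the detection logic is my speculation, not anything DICE has documented, but the behavior I'm describing would amount to something like this hypothetical sketch:

```python
# Hypothetical default-API heuristic; the GPU-name checks are assumptions
# for illustration only, not documented game behavior.
def default_api(gpu_name: str) -> str:
    raytracing_capable = "RTX" in gpu_name or "RX 6" in gpu_name
    return "DX12" if raytracing_capable else "DX11"

for gpu in ("GeForce RTX 3090", "Radeon RX 6800 XT", "GeForce GTX 1080"):
    print(gpu, "->", default_api(gpu))
```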
 

Dantte

Distinguished
Jul 15, 2011
161
58
18,760
I can confirm that this information is definitely not correct when using current-generation RX 6000 and RTX 30-series graphics cards. I never bothered to set DX12 mode, because that was the default for all the GPUs I tested. Here's an RTX 3090 launching the game, for example, with the OCAT overlay running:

View attachment 108

With the exception of Crysis Remastered, there are no games with DirectX Raytracing support that use DX11 mode. Now, if you have a non-RTX and non-RX 6000 GPU, maybe the game will default to DX11. More likely, DX11 is only the default on Nvidia GTX GPUs, and there's a very good chance DX11 still runs better on Pascal and Maxwell GPUs. If you have an older AMD GCN GPU, though, and it defaults to DX11, you might get a minor benefit by switching to DX12.
Are you going to test it? You said "I can confirm that this information is definitely not correct," so have you tested it and found no performance difference, or are you just making a blind statement with no proof or testing?

Check the file setting (0 or 1) and run your benchmark. Then change the setting and rerun the benchmark. Let us know what difference, if any, there is. Also, if there is a performance difference and this isn't affecting which DirectX API is in use, then what is it changing?
 
Last edited:
Are you going to test it? You said "I can confirm that this information is definitely not correct," so have you tested it and found no performance difference, or are you just making a blind statement with no proof or testing?

Check the file setting (0 or 1) and run your benchmark. Then change the setting and rerun the benchmark. Let us know what difference, if any, there is. Also, if there is a performance difference and this isn't affecting which DirectX API is in use, then what is it changing?
I can definitely confirm that your assertion that the game defaults to DX11 mode is incorrect, at least insofar as I tested with the GPUs in this article. That was the point.

I never manually edited the configuration files, and as shown in the screenshot, it's running in DX12 mode. I assume it will always default to DX12 if it detects an RTX or RX 6000 card. But as noted in the article, testing this game is a pain, so I'm leaving further results to the cloud. I'm sure there are cards where DX12 runs better, and other cards where DX11 runs better. With a margin of error of 5% between test runs, however, it's very time-consuming to collect enough data to be sure you're not just seeing run-to-run variation.
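To put rough numbers on that: with ~5% run-to-run variation, several runs per configuration is the minimum before a small average difference means anything. A sketch of the kind of sanity check involved, using made-up fps numbers:

```python
# Sketch: is a DX11-vs-DX12 fps difference real or just run-to-run noise?
# The per-run averages below are synthetic, for illustration only.
from statistics import mean, stdev

dx11 = [96.1, 99.8, 94.5, 101.2, 97.6]
dx12 = [101.9, 98.4, 104.3, 100.1, 103.0]

diff = mean(dx12) - mean(dx11)
stderr = (stdev(dx11) ** 2 / len(dx11) + stdev(dx12) ** 2 / len(dx12)) ** 0.5
print(f"difference: {diff:.1f} fps, standard error: {stderr:.1f} fps")
# Rule of thumb: a difference under ~2x the standard error is
# hard to distinguish from run-to-run variation.
```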
 

Dantte

Distinguished
Jul 15, 2011
161
58
18,760
I can definitely confirm that your assertion that the game defaults to DX11 mode is incorrect, at least insofar as I tested with the GPUs in this article. That was the point.

I never manually edited the configuration files, and as shown in the screenshot, it's running in DX12 mode. I assume it will always default to DX12 if it detects an RTX or RX 6000 card. But as noted in the article, testing this game is a pain, so I'm leaving further results to the cloud. I'm sure there are cards where DX12 runs better, and other cards where DX11 runs better. With a margin of error of 5% between test runs, however, it's very time-consuming to collect enough data to be sure you're not just seeing run-to-run variation.
I try to avoid the "cloud"; it's one of the reasons I come to Tom's. I generally trust the reporting here, and I play devil's advocate to make you better, so I would like to see you try this and publish the results. Just do it with a single configuration; if the results intrigue you, go further, or if not, then stop and say this setting is BS and does nothing - tell me I'm completely wrong.

I personally have an RTX 2070 Super. By default this setting in the config file is set to 0 (DX12 disabled), or at least I think that's what it means. There is no in-game setting that will change it; you have to do it manually via the file. I will say, from both an FPS counter and the general feel of the game, changing this setting to a 1 (DX12 enabled) makes a huge difference: it just feels smoother, and the frame counter is 5-10 fps higher with everything else left the same. So, like I said before, if this isn't changing or updating the API and it's already running in DX12 by default, I would really love to know what it is doing, and it would be great for you to do some investigation, since you have the tools to go much further than I can.
 
Last edited:
I try to avoid the "cloud"; it's one of the reasons I come to Tom's. I generally trust the reporting here, and I play devil's advocate to make you better, so I would like to see you try this and publish the results. Just do it with a single configuration; if the results intrigue you, go further, or if not, then stop and say this setting is BS and does nothing - tell me I'm completely wrong.

I personally have an RTX 2070 Super. By default this setting in the config file is set to 0 (DX12 disabled), or at least I think that's what it means. There is no in-game setting that will change it; you have to do it manually via the file. I will say, from both an FPS counter and the general feel of the game, changing this setting to a 1 (DX12 enabled) makes a huge difference: it just feels smoother, and the frame counter is 5-10 fps higher with everything else left the same. So, like I said before, if this isn't changing or updating the API and it's already running in DX12 by default, I would really love to know what it is doing, and it would be great for you to do some investigation, since you have the tools to go much further than I can.
Can you enable ray-traced ambient occlusion? Because that should require DX12.
 

Dantte

Distinguished
Jul 15, 2011
161
58
18,760
Can you enable ray-traced ambient occlusion? Because that should require DX12.
I set the file back to the default: GstRender.Dx12Enabled 0.
Loaded the game and enabled ray-traced ambient occlusion. Started a match for good measure.
Closed the game and checked the settings file: GstRender.Dx12Enabled remains a 0, and GstRender.RaytracingAmbientOcclusion is now a 1 (enabled).

GstRender.Dx12Enabled is making a very big difference in the game; if it's not affecting the DX API, I would love to know what it's doing!
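For anyone who wants to repeat this check without eyeballing the whole file, a quick sketch that prints just the relevant keys (same plain-text, one-setting-per-line assumption as before):

```python
# Sketch: print the two render settings discussed above from PROFSAVE_profile.
from pathlib import Path

profile = Path.home() / "Documents" / "Battlefield 2042" / "settings" / "PROFSAVE_profile"
wanted = ("GstRender.Dx12Enabled", "GstRender.RaytracingAmbientOcclusion")
for line in profile.read_text().splitlines():
    if line.startswith(wanted):
        print(line)
```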
 

notea

Distinguished
Dec 23, 2011
586
0
19,165
You missed a big one!

By default BF2042 runs using DX11, and there is no in-game setting to enable DX12. Open the file "PROFSAVE_profile" in your Documents\Battlefield 2042\settings folder and change the value for GstRender.Dx12Enabled from a 0 to a 1. This will give you an additional 5% or better performance!
BF2042 is DX12-only, as stated in the minimum requirement specifications. Also, this issue was fixed in the day one patch.
(Screenshot: minimum requirement specifications)