News Microsoft Flight Simulator Performance and Benchmarks: Your PC May Need an Upgrade

It is definitely NOT using Bing Maps. If you look at Disney World in Flight Sim, it's well over five years out of date. The Sorcerer's Hat is still there! No Toy Story or Star Wars. Go to Bing Maps and it's pretty recent; Toy Story and Star Wars are both there.
 
Yes, the gap is there if you are playing at low settings.

But I don't really know many people who will be using a 2080 Ti at 1080p and low settings.
Doesn't matter what the GPU is; it won't change the fact that, based on these results, Intel's 9600K gives you a ~20% higher chance of maintaining a solid 60+ FPS than AMD's 3900X. The GPU cannot put out more frames than what the CPU is able to send to it. Finding that limit is the whole point of using the fastest GPU available in CPU benchmarks.
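The point about the CPU capping frame rate can be sketched as a back-of-the-envelope model (the numbers below are hypothetical, not measured MSFS data): the delivered frame rate is set by whichever of the CPU or GPU takes longer per frame, which is why benchmarking with the fastest available GPU exposes the CPU's ceiling.

```python
def delivered_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """The slower stage sets the pace: the GPU cannot render
    frames the CPU hasn't prepared, and vice versa."""
    return min(cpu_fps_limit, gpu_fps_limit)

# At 1080p/low the GPU limit is very high, so the CPU ceiling shows through:
print(delivered_fps(cpu_fps_limit=62, gpu_fps_limit=200))  # 62 -> CPU-bound
# At 4K/ultra the GPU limit drops below the CPU ceiling:
print(delivered_fps(cpu_fps_limit=62, gpu_fps_limit=35))   # 35 -> GPU-bound
```

Note the CPU ceiling (62 here) is the same in both cases; only the settings decide which limit you actually see.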
 
You should also do a test with various RAM configurations: quantity, speed, dual-channel versus quad-channel, etc.

I pulled almost 26GB utilized yesterday while flying around Dubai, which is the most I have ever seen any game use. I'm about to run a few of my own tests and see just how much of this 32GB of RAM I can utilize.

I've seen lots of people with 16GB complaining about performance tanking, and I suspect it has to do with the online data cache.
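The streamed-scenery cache theory can be illustrated with a byte-budgeted LRU cache. This is a hypothetical simplification, not Asobo's actual code, and the tile sizes and budget are made-up numbers; it just shows why a capped cache keeps memory bounded while an uncapped one would grow until the system starts paging.

```python
from collections import OrderedDict

class ByteBudgetLRU:
    """Keeps recently used scenery tiles, evicting the least-recently-used
    ones once the byte budget is exceeded."""
    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.used = 0
        self.tiles = OrderedDict()  # tile_id -> size in bytes

    def fetch(self, tile_id: str, size_bytes: int) -> None:
        if tile_id in self.tiles:
            self.tiles.move_to_end(tile_id)   # cache hit: mark as fresh
            return
        self.tiles[tile_id] = size_bytes      # cache miss: store the tile
        self.used += size_bytes
        while self.used > self.budget:        # evict until back under budget
            _, evicted_size = self.tiles.popitem(last=False)
            self.used -= evicted_size

# Fly over 100 unique photogrammetry tiles at 200 MB each with an 8 GB budget:
cache = ByteBudgetLRU(budget_bytes=8 * 2**30)
for i in range(100):
    cache.fetch(f"tile_{i}", 200 * 2**20)
print(cache.used // 2**20, "MB")  # stays bounded at 8000 MB
```

Without a budget, the same flight would have held all 100 tiles (~20 GB), which is roughly the kind of growth that would hurt a 16GB machine.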
 
It's actually on my list of things to test this coming week -- I want to do Ryzen + 5700 XT and multi-GPU. I figured it was best to post this first, still pretty in-depth look and then run more tests, rather than keep waiting.
 
I have a new i9-10900K (liquid cooled) and an RTX 2080 Super Hybrid liquid-cooled card (standard config, no overclocking), and I run it at top settings to my LG CX 65" OLED in 4K, smooth as butter.
 
At 4K ultra? What does “smooth as butter” mean to you? 30 FPS? Because no way are you getting anything close to 60 FPS at those settings with a 2080 Super.
 
That "limit" can vary depending on the GPU used. For example, if a stock 3600/2070 gets 50 FPS and a 9900K@5GHz/2070 gets 65 FPS in game X, that does not mean a 3600/2080 Ti will still be giving out 50 FPS because the CPU is limited to that. It will be well over 90 FPS for sure -- unless it's a really badly optimized game, say like Crysis 1 and very few others. And this gap further shrinks as the quality settings increase. That being said, Intel is still king when it comes to gaming, no doubt.
 
Your PC doesn't need an upgrade. The software needs to be fixed. DX11 in 2020 should be considered a crime.
I don't know about it being a crime, but it will impact the number of draw calls and potentially allow for more driver optimizations. I thought DX12 was required for games on the Microsoft Store (I know Gears 5, Gears 4, Gears Ultimate, Forza Horizon 4, and others are all DX12 only), but I guess Microsoft at some point changed things and DX11 is an option. Which makes me wonder if it will actually help shift the bottleneck away from the CPU, or if the problem is just inherent in the game's engine.
 
The sim currently only utilizes DX11, not DX12. That's regardless of where one purchased it.

To an extent… yes. This is why it needs DirectX 12.
Everyone is being bottlenecked by the IPC and clock speed of their processor. DX11 is limited in its multi-threading capability and generates too many draw calls for any CPU in existence today. Even if it scaled to other threads, the main thread is really the cause of your lower GPU usage. It is the case in P3D, XP11, and FSX. This has always been the #1 reason why frame rates are lower in flight sims. You would think that P3D, being a DX12 application, would fare better, but the legacy ESP engine code is still way too single-thread focused. It's very easy to see: load a PMDG 777 in P3D and the frame rate will vary between 40-50 FPS. Pause the sim and your frame rate will jump by almost 80%. This is because simulation code and draw calls are on the same thread, limiting your CPU's ability to drive your GPU.
This is not rocket science. Asobo needs to port their engine to a proper DX12 engine (not like LM did) and implement DLSS. This is the only way we will ever get 60 FPS in this sim. An RTX 3000 series card will not help anyone. We need lower overall draw calls combined with spreading the load across enough threads that it doesn't bottleneck the GPU. When you increase the render scale, all you are doing is loading a higher resolution onto your GPU (just like DSR); it doesn't provide you more frames.
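The pause-the-sim observation above can be sketched as a toy frame-time model (the millisecond costs are hypothetical, chosen only to roughly reproduce the reported ~45 FPS and ~80% jump): when simulation and draw-call submission share one thread, the frame time is their sum, so removing the simulation cost raises FPS dramatically without the GPU doing anything different.

```python
def fps(frame_time_ms: float) -> float:
    """Convert a per-frame cost in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

# Hypothetical per-frame costs on a single main thread:
sim_ms, draw_ms = 10.0, 12.0

running = fps(sim_ms + draw_ms)   # simulation + draw calls serialized
paused = fps(draw_ms)             # simulation paused: draw calls only
print(f"{running:.0f} -> {paused:.0f} FPS (+{paused / running - 1:.0%})")
# prints "45 -> 83 FPS (+83%)"
```

Moving the 10 ms of simulation to another thread would have the same effect as pausing it, which is the multi-threading argument in a nutshell.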
 
If you refresh, I actually just fixed this error. I mistakenly assumed the MS Store required DX12, but it certainly doesn't in this case. Whether DX12 will substantially improve performance or not is a different matter. I've seen many DX12 games perform worse than in DX11 mode. What we need is a DX12 version of MSFS to test with. I'm not sure if that's in the works or not.
 
Asobo has confirmed it will get DX12 at some point. With DX12 being in its infancy, it may be some time before its true potential is realized.
 
DX12 is hardly in its infancy -- the first DX12 games arrived in 2015 (Ashes of the Singularity), and we've had quite a few major game engines supporting it for years now (Unreal Engine 4, some of the Ubisoft engines, Rise of the Tomb Raider and its sequel, etc.). Or maybe you mean Asobo's use of DX12 is in its infancy, in which case you're absolutely right. We saw last week with Horizon Zero Dawn how easy it is to end up with a poorly optimized result for older hardware with DX12.