News AMD's FSR 2.0 Even Worked With Intel Integrated Graphics

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
Now to wait for Devs to tell us, with a straight face, they can only implement DLSS due to technical reasons XD
Seriously, at this point, if a game supports DLSS, it really should be trivial to add FSR 2.0 support as well. And it goes the other way, too: if something supports FSR 2.0, adding in DLSS support should be pretty simple.

Now, the real question I have is how much effort beyond the base integration is required to get good results. I've seen games that just don't seem to benefit much from DLSS — Red Dead Redemption 2 is a good example. It doesn't get a big boost to performance. Deathloop is a good example of an FSR 1.0 game where the integration also didn't work well for whatever reason. So, getting DLSS or FSR working shouldn't be too difficult. Getting it working well is another story.
 

evdjj3j

Reputable
Aug 4, 2017
"Most of my subsequent attempts to run the game resulted in a variety of rendering errors."

I imagine if Intel ever releases the rest of the desktop Xe GPUs this will be a common experience for people who buy them.
 

tommo1982

Distinguished
Dec 5, 2010
That's some unexpected benefit for Intel.

I'm surprised the Iris iGPU can't properly render a game. I use an AMD APU and haven't encountered any problems, even playing old games. It doesn't bode well for Arc. Intel being unable to get its drivers right is not something I expect from such a big company.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
"Most of my subsequent attempts to run the game resulted in a variety of rendering errors."

I imagine if Intel ever releases the rest of the desktop Xe GPUs this will be a common experience for people who buy them.
I'm sure this is part of the reason for the Arc delays. Intel is probably finding out there are minor bugs in the drivers that need fixing for a bunch of games to work properly. In theory, at some point the drivers will address nearly all of the critical issues and things should just work, but it might take a while to get there. Look at my Intel DG1 story from nearly a year ago:

Which is not to say the DG1 had a strong showing at all. Besides generally weak performance relative to even budget GPUs, we did encounter some bugs and rendering issues in a few games. Assassin's Creed Valhalla would frequently fade to white and show blockiness and pixelization. Here's a video of the 720p Valhalla benchmark, and the game failed to run entirely at 1080p medium on the DG1. Another major problem we encountered was with Horizon Zero Dawn, where fullscreen rendering failed, though the game ran okay in windowed mode (borderless window also failed). Not surprisingly, both of those games are DX12-only, and we encountered other DX12 issues. Fortnite would automatically revert to DX11 mode when we tried to switch, and Metro Exodus and Shadow of the Tomb Raider both crashed when we tried to run them in DX12 mode. Dirt 5 also gave low VRAM warnings, even at 720p low, but otherwise ran okay.

DirectX 12 on Intel GPUs has another problem potential users should be aware of: sometimes excessively long shader compilation times. The first time you run a DX12 game on a specific GPU, and each time after swapping GPUs, many games will precompile the DX12 shaders for that specific GPU. Horizon Zero Dawn, as an example, takes about three to five minutes to compile the shaders on most graphics cards, but thankfully that's only on the first run. (You can skip it, but then load times will just get longer as the shaders need to be compiled then rather than up front.) The DG1 for whatever reason takes a virtual eternity to compile shaders in some games (close to 30 minutes for Horizon Zero Dawn). Hopefully that's just another driver bug that needs to be squashed, but we've seen this behavior with other Intel graphics solutions in the past, and HZD likely isn't the only affected game.
Testing Deathloop, a DX12-only game, all of the above issues were once again in play. Ironically, when installing Intel's latest drivers, Intel even has the audacity to have a little blurb saying, "Did you know Intel was the first company to have a fully DirectX 12 compliant GPU?" I'm not sure what world Intel is living in, but that's absolute bollocks. Even if Intel was the first with DX12 drivers, I can pretty much guarantee the hardware and drivers didn't work properly in quite a few cases.
 
Seriously, at this point, if a game supports DLSS, it really should be trivial to add FSR 2.0 support as well. And it goes the other way, too: if something supports FSR 2.0, adding in DLSS support should be pretty simple.

Now, the real question I have is how much effort beyond the base integration is required to get good results. I've seen games that just don't seem to benefit much from DLSS — Red Dead Redemption 2 is a good example. It doesn't get a big boost to performance. Deathloop is a good example of an FSR 1.0 game where the integration also didn't work well for whatever reason. So, getting DLSS or FSR working shouldn't be too difficult. Getting it working well is another story.
Good point. It does go both ways there. FSR 2.0/DLSS should be equivalent going forward, from a developer's perspective (effort-wise, in theory).

As for why it's better implemented in some games than others: I'd imagine it has to do with how the engine behaves and handles the rendering pipeline, plus how texture filtering is applied. From a very high level, DLSS and FSR need to be put in the right place for the libraries to do their work correctly. That place is still decided by the developer and tied to their engine's rendering pipeline. The devil is always in the details, but I don't think I'm far off the mark, especially with the AA and texture filtering techniques used.
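To make that placement point concrete, here's a minimal sketch of where a temporal upscaler sits in a frame. All stage names and the function itself are hypothetical illustrations, not from any real engine or the DLSS/FSR SDKs:

```python
# Hypothetical sketch: where a temporal upscaler (DLSS/FSR 2.0 style)
# slots into a frame. Stage names are illustrative only.

def build_frame_pipeline(use_upscaler: bool) -> list[str]:
    """Return the ordered stages of one rendered frame."""
    stages = [
        "geometry_pass",   # rendered at the LOWER internal resolution
        "lighting_pass",   # also at internal resolution
    ]
    if use_upscaler:
        # The upscaler consumes the low-res color buffer plus depth and
        # motion vectors, then outputs a full-resolution image. It has to
        # run BEFORE post-processing so effects like film grain and bloom
        # are applied at output resolution rather than being upscaled.
        stages.append("temporal_upscale")
    stages += [
        "post_processing",  # at full output resolution
        "ui_and_hud",       # UI is rendered natively, never upscaled
    ]
    return stages
```

The whole "done well vs. done poorly" question largely comes down to whether the developer inserts that one stage at the right point and feeds it correct motion vectors; both vendors' upscalers want essentially the same inputs.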

Regards.
 
Apr 21, 2022
That's why I love AMD. Even if FSR 2.0 performs worse than the older version right now, I'm sure that once a stable version arrives, 2.0 will give us more performance.
 

artk2219

Distinguished
Jun 30, 2010
That's some unexpected benefit for Intel.

I'm surprised the Iris iGPU can't properly render a game. I use an AMD APU and haven't encountered any problems, even playing old games. It doesn't bode well for Arc. Intel being unable to get its drivers right is not something I expect from such a big company.
Unfortunately, it's Intel's biggest problem. I have no doubt they can build some decent GPUs at the hardware level, but at the driver level they've never had to keep up. They've had issues patching their existing crummy products for general compatibility for as long as they've existed, and they figured that as long as their GPUs displayed a mostly stable image, it was "good enough". Now they'll actually have to work to deliver a driver that's more than "it displays an image mostly correctly", and keeping GPU drivers up to date is something they've never done a good job of before.
 

jkflipflop98

Distinguished
Feb 3, 2006
Seriously, at this point, if a game supports DLSS, it really should be trivial to add FSR 2.0 support as well. And it goes the other way, too: if something supports FSR 2.0, adding in DLSS support should be pretty simple.
You know enough to see where this is going. Microsoft will implement "DirectScaling" or something in DirectX and that will be the end of it. Just like every other graphics tech in the last 20 years.
 

blppt

Distinguished
Jun 6, 2008
Now, the real question I have is how much effort beyond the base integration is required to get good results. I've seen games that just don't seem to benefit much from DLSS — Red Dead Redemption 2 is a good example.
I get the feeling that RDR2's limitations sometimes involve the CPU rather than the GPU; this Rockstar engine (GTAV/RDR2) has always scaled well with more powerful CPUs.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
I get the feeling that RDR2's limitations sometimes involve the CPU rather than the GPU; this Rockstar engine (GTAV/RDR2) has always scaled well with more powerful CPUs.
The problem with RDR2 is that even at higher resolutions, where the GPU should be the bottleneck, and running a slower GPU (e.g. RTX 3070 at 4K), DLSS Quality mode only gives you maybe 15% more performance. In that same scenario, other games like Horizon Zero Dawn get around 35% more performance. It's also not CPU limited, as a faster GPU will still get much higher performance. So, something odd is going on, but I'm not precisely sure what it is. Then again, Rockstar is one of the few companies still supporting MSAA, which makes me think it had to do extra work to make sure DLSS didn't screw anything up. But RDR2 has had TAA support since the beginning, which usually has the same basic requirements as DLSS/FSR2 (depth buffer, frame buffer, and even motion vector buffer IIRC).
 

Sleepy_Hollowed

Honorable
Jan 1, 2017
You know enough to see where this is going. Microsoft will implement "DirectScaling" or something in DirectX and that will be the end of it. Just like every other graphics tech in the last 20 years.
While I agree with you that this may happen, this is big enough that it also works natively in macOS (MoltenVK) and Linux via Vulkan, which is a much bigger deal, and much more impactful.
 

rluker5

Distinguished
Jun 23, 2014
I'm sure this is part of the reason for the Arc delays. Intel is probably finding out there are minor bugs in the drivers that need fixing for a bunch of games to work properly. In theory, at some point the drivers will address nearly all of the critical issues and things should just work, but it might take a while to get there. Look at my Intel DG1 story from nearly a year ago:



Testing Deathloop, a DX12-only game, all of the above issues were once again in play. Ironically, when installing Intel's latest drivers, Intel even has the audacity to have a little blurb saying, "Did you know Intel was the first company to have a fully DirectX 12 compliant GPU?" I'm not sure what world Intel is living in, but that's absolute bollocks. Even if Intel was the first with DX12 drivers, I can pretty much guarantee the hardware and drivers didn't work properly in quite a few cases.
If you have one available, have you tried an Intel desktop iGPU? Like the UHD 770 in the 12600K, 12700K, or 12900K?
The one in my 12700K is the only Xe I have to try, and maybe it's because it isn't power starved or something, but I get much better results.
My Asus Prime Z690-P won't apply any iGPU voltage adjustments for some reason, so I can only run mine stably at 1950 MHz for just under 1 TFLOPS.
It still has some issues, like in the third scene of the canned SOTTR bench, where I get some weird faded persistent background image, but only in DX12.
I also have a 3080, but I disable it in Device Manager (with the HDMI plugged into the mobo, of course) when I test.
Maybe the disabled 3080 is helping somehow, IDK, but I see 53 fps avg in the SOTTR integrated bench at 720p low, 36 fps avg in the HZD game bench at 720p low, 66 fps avg in the Div2 game bench at 720p low with vsync disabled, and 27 fps avg in the CP2077 game bench at 720p low.
They aren't awesome scores, but they're good for 256 shaders, and I'm suspicious of power throttling on the larger Intel mobile iGPUs after my own experience with my Dell Venue 11 Pro 7140, whose M-5Y71 only did well if you considered its 6W TDP.

And I just ran this Time Spy as some evidence, because anonymous people can say anything, and I didn't want to fill up this place with my screenshots of those scores: Intel UHD Graphics 770 (12th gen desktop) video card benchmark result - Intel Core i7-12700K Processor, ASUSTeK COMPUTER INC. PRIME Z690-P (3dmark.com)

The iGPU in my desktop gives me a more hopeful picture of Arc than this relatively bleak stuff I'm hearing about these Iris iGPUs.

Edit: The SOTTR run was at lowest settings. Reran it at low below.
 

rluker5

Distinguished
Jun 23, 2014
The iGPU in my desktop gives me a more hopeful picture of Arc than this relatively bleak stuff I'm hearing about these Iris iGPUs.
I just used the Xbox recorder to record a couple of game benches with my 12700K iGPU. I set it to high quality, but the videos still have some stuttering that I don't see in game; it matches what the Afterburner overlay is showing me. Also, YouTube compression hasn't been kind, but I'm glad they put out an accessible platform. I also turned down the power limit on my CPU a bit, but it isn't holding back the iGPU at all.
Shadow of the Tomb Raider on 12700k IGPU - YouTube
Horizon Zero Dawn Complete Edition on 12700k IGPU - YouTube
If the A770 is clocked the same at 1950 MHz, it could be 16x as fast if it scales perfectly, and 20x as fast if it can go to 2438 MHz and scales perfectly (it probably won't scale perfectly, but hopefully it isn't too far off).
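A quick sanity check of that scaling arithmetic. The A770 shader count below (4096 FP32 lanes) is my assumption, not something stated in this thread; the UHD 770's 256 shaders and the two clocks come from the post above:

```python
# Rough scaling math for the 16x / 20x claim.
uhd770_shaders = 256   # UHD 770: 32 EUs x 8 ALUs
a770_shaders = 4096    # assumed A770 FP32 ALU count
base_clock = 1950      # MHz, the UHD 770 overclock mentioned above
boost_clock = 2438     # MHz, the hoped-for A770 clock

same_clock_scaling = a770_shaders / uhd770_shaders
print(same_clock_scaling)                             # 16.0
print(same_clock_scaling * boost_clock / base_clock)  # ~20.0
```

Of course, real scaling never hits the theoretical ceiling; memory bandwidth and power limits will eat into both numbers.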

Too many delays.
 

jkflipflop98

Distinguished
Feb 3, 2006
While I agree with you that this may happen, this is big enough that it also works natively in macOS (MoltenVK) and Linux via Vulkan, which is a much bigger deal, and much more impactful.
Did you really just try to say that gaming on QuackOS and Linux is a "much bigger deal and much more impactful" than gaming on Windows? Really? Really?
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
If you have one available, have you tried an Intel desktop iGPU? Like the UHD 770 in the 12600K, 12700K, or 12900K?
The one in my 12700K is the only Xe I have to try, and maybe it's because it isn't power starved or something, but I get much better results.
My Asus Prime Z690-P won't apply any iGPU voltage adjustments for some reason, so I can only run mine stably at 1950 MHz for just under 1 TFLOPS.
It still has some issues, like in the third scene of the canned SOTTR bench, where I get some weird faded persistent background image, but only in DX12.
I also have a 3080, but I disable it in Device Manager (with the HDMI plugged into the mobo, of course) when I test.
Maybe the disabled 3080 is helping somehow, IDK, but I see 53 fps avg in the SOTTR integrated bench at 720p low, 36 fps avg in the HZD game bench at 720p low, 66 fps avg in the Div2 game bench at 720p low with vsync disabled, and 27 fps avg in the CP2077 game bench at 720p low.
They aren't awesome scores, but they're good for 256 shaders, and I'm suspicious of power throttling on the larger Intel mobile iGPUs after my own experience with my Dell Venue 11 Pro 7140, whose M-5Y71 only did well if you considered its 6W TDP.

And I just ran this Timespy as some evidence because anonymous people can say anything, and I didn't want to fill up this place with my screenshots of those scores: Intel UHD Graphics 770 (12th gen desktop) video card benchmark result - Intel Core i7-12700K Processor,ASUSTeK COMPUTER INC. PRIME Z690-P (3dmark.com)

The iGPU in my desktop gives me a more hopeful picture of Arc than this relatively bleak stuff I'm hearing about these Iris iGPUs.

Edit: The SOTTR run was at lowest settings. Reran it at low below.
Xe 32 EU variants like the UHD 770 should be slower than the Iris Xe, though power constraints do come into play. But what I mentioned here with Deathloop isn't a power issue, it's a driver issue. I'm sure if you try to run Deathloop on the UHD 770 you'll get similar issues. After all, I got virtually the exact same behavior on Gen11 and Gen12 laptop graphics. Deathloop is very much pushing higher settings and graphics quality than older games, and even at its very low preset and 1280x720, it wants to use about 5GB of VRAM.
 
