News: Intel Gen11 Graphics Will Support Integer Scaling

Jun 23, 2019
Finally... Pixel perfect 1080P content for 4K displays... Been asking for it for years...

Consider that 10nm Intel integrated graphics could handle 1080P at normal settings in most new games just fine, AND play back 4K HDR video just fine. No rush, and maybe even no need, for a seriously expensive gaming rig for my 4K TV anymore...
 

joeblowsmynose

Distinguished
So this gives more jagged edges while upscaling? That doesn't sound useful ... at all! Unless you need a pixelated image - like the one they posted in the article? I don't get it ....

Edit: I guess if smoothing was applied after the upscaling ... but wouldn't that defeat the feature? Someone explain, maybe I'm just missing something ...
 
I don't get it ....
It's so that you can run a lower resolution that divides evenly into your screen's native resolution, while making it look more like that lower resolution is the display's native resolution. It effectively makes the pixels larger without needlessly blurring them in the process, which is arguably preferable for keeping the image looking sharp. Without integer scaling, if you run a game at 1080p on a 4K screen, for example, it will typically look a bit worse than if you ran it on a 1080p screen of the same size, since pixels bleed into one another during the upscaling process. That blurring is necessary for resolutions that cannot be divided evenly into your screen's native resolution, but for those that can, you are probably better off without it.

Unlike the example image, the pixels are still going to be relatively small if you are running, for example, 1080p on a 4K screen. You can still use anti-aliasing routines as well, but you won't have everything getting blurred on top of that when the image is scaled to fit the screen.
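To make the difference concrete, here is a minimal sketch using Pillow and NumPy, purely for illustration (this is not how the driver or display pipeline actually implements it): integer/nearest-neighbor upscaling keeps hard pixel edges intact, while the usual bilinear upscaling introduces blended, in-between values.

```python
import numpy as np
from PIL import Image

# A tiny 4x4 "framebuffer" with hard pixel edges: white dots on black.
src = np.zeros((4, 4, 3), dtype=np.uint8)
src[::2, ::2] = [255, 255, 255]

img = Image.fromarray(src)

# Integer scaling: each source pixel becomes an exact 2x2 block of identical
# pixels, so edges stay perfectly sharp.
integer_scaled = img.resize((8, 8), resample=Image.NEAREST)

# Conventional scaling: bilinear interpolation blends neighboring pixels,
# which is what produces the "blurry 1080p on a 4K panel" look.
bilinear_scaled = img.resize((8, 8), resample=Image.BILINEAR)

print(np.asarray(integer_scaled)[:4, :4, 0])   # still only 0 and 255
print(np.asarray(bilinear_scaled)[:4, :4, 0])  # intermediate grey values appear
```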
 

bit_user

Polypheme
Ambassador
AMD and Nvidia employ bilinear or bicubic interpolation methods for image scaling. However, the final image often turns out blurry, too smooth or too soft. The integer scaling method should help preserve sharpness and jagged edges. Pixel-graphic game and emulator aficionados will surely love the sound of that.
Actually, Nvidia recently added DLSS, which is fundamentally a very sophisticated image scaling algorithm.

However, one could argue that they got too ambitious with it, trying to make it output near-native quality, by requiring a game-specific model. What they should do is make a few generic DLSS filters that can be applied to different styles of games, including retro. Then, you could have sharp, crisp graphics, with minimal jaggies and blurring.

According to Pearce, the older Intel parts with Gen9 graphics lack hardware support for nearest neighbor algorithms.
At some level, this is utter nonsense. All GPUs support nearest-neighbor sampling. The only difference is that they're probably doing the scaling while sending the video signal out to the monitor, whereas a software approach would need to add a post-processing pass after the framebuffer has been rendered.

However, since you get some performance benefit by rendering at a lower resolution, the benefits should still outweigh the added cost of this trivial step.
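For what it's worth, here is a toy software sketch of the kind of post-processing pass described above (NumPy only, and obviously not Intel's or anyone's actual driver code). For an integer factor, nearest-neighbor scaling is nothing more than pixel replication:

```python
import numpy as np

def integer_upscale(framebuffer: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor upscale by an integer factor: pure pixel replication."""
    # Repeat every row and every column `factor` times; no filtering, no blur.
    return np.repeat(np.repeat(framebuffer, factor, axis=0), factor, axis=1)

# Example: a 1080p render blown up to fill a 4K (2160p) output buffer.
low_res = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in for a rendered frame
native = integer_upscale(low_res, 2)
assert native.shape == (2160, 3840, 3)
```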
 
Last edited:
  • Like
Reactions: koblongata
Jun 23, 2019
So this gives more jagged edges while upscaling? That doesn't sound useful ... at all! Unless you need a pixelated image - like the one they posted in the article? I don't get it ....

Edit: I guess if smoothing was applied after the upscaling ... but wouldn't that defeat the feature? Someone explain, maybe I'm just missing something ...

Err... No, even at 80 inches, it's not easy to tell "native" 1080P & 4K apart. BUT when displaying 1080P on a 4K panel, everything just becomes a blurry mess with no real contrast.
 
So, this is a way to render downscaled images without performing mode switching? And the "missing hardware implementation" would be a way to reduce framebuffer size when doing so? What a non-event... Especially since the retro games and emulators that would be interested in that would implement nearest neighbor interpolation anyway.

And what about non-integer? Say, a game running in 800x600 rendered on a 4K screen? There's no way to do an integer upscale without getting black borders...

No, the only use case I see for this "feature" is when plugged into a buggy screen that doesn't support mode switching to one's preferred rendering resolution, to allow GPU-side upscaling using a nearest-neighbor algorithm. And if Intel were dumb enough to implement only bilinear or bicubic interpolation algorithms in their chips up until now, then it's nothing to brag about: they should be ashamed instead.
 
  • Like
Reactions: bit_user
Jun 23, 2019
So, this is a way to render downscaled images without performing mode switching? And the "missing hardware implementation" would be a way to reduce framebuffer size when doing so? What a non-event... Especially since the retro games and emulators that would be interested in that would implement nearest neighbor interpolation anyway.

And what about non-integer? Say, a game running in 800x600 rendered on a 4K screen? There's no way to do an integer upscale without getting black borders...

No, the only use case I see for this "feature" is when plugged into a buggy screen that doesn't support mode switching to one's preferred rendering resolution, to allow GPU-side upscaling using a nearest-neighbor algorithm. And if Intel were dumb enough to implement only bilinear or bicubic interpolation algorithms in their chips up until now, then it's nothing to brag about: they should be ashamed instead.

Being able to play 800×600 content pixel-perfect on a TV is like a dream come true...

Say, even a perfect 2400×1800 display of 800×600 would be very nice (the arithmetic is sketched below), but remember, we are talking about 1080P here, which is what matters most.

Non-feature? Users have been asking for this since LCD displays first came out. Yes, interpolation is needed for different aspect ratios to go full screen, but the huge annoyance is that even when the image could be scaled perfectly, which is what most users expect, it always ends up a blurry mess. It's truly one of those WTF moments. I think you really underestimate the significance of this.
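A quick check of the numbers above, assuming a standard 3840×2160 (4K UHD) panel: the largest integer factor that fits 800×600 is 3, which gives exactly the 2400×1800 image mentioned, centered inside black borders, while 1920×1080 scales by exactly 2 with no borders at all.

```python
def max_integer_scale(src_w, src_h, dst_w, dst_h):
    """Largest whole-number factor at which the source still fits the panel."""
    return min(dst_w // src_w, dst_h // src_h)

dst_w, dst_h = 3840, 2160                      # 4K UHD panel

for src_w, src_h in [(800, 600), (1920, 1080)]:
    k = max_integer_scale(src_w, src_h, dst_w, dst_h)
    scaled_w, scaled_h = src_w * k, src_h * k
    border_x = (dst_w - scaled_w) // 2         # black bars left/right
    border_y = (dst_h - scaled_h) // 2         # black bars top/bottom
    print(f"{src_w}x{src_h} -> x{k} = {scaled_w}x{scaled_h}, "
          f"borders {border_x}px / {border_y}px")

# 800x600   -> x3 = 2400x1800, borders 720px / 180px
# 1920x1080 -> x2 = 3840x2160, borders 0px / 0px
```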
 
Last edited:

joeblowsmynose

Distinguished
Err... No, even at 80 inches, it's not easy to tell "native" 1080P & 4K apart. BUT when displaying 1080P on a 4K panel, everything just becomes a blurry mess with no real contrast.

I don't have a 4K screen, so I guess I never noticed that all the upscale sampling done by GPUs and TVs/monitors is so shitty ... super simple fix; weird that this has ever even been a problem ...
 
Jun 23, 2019
I don't have a 4K screen, so I guess I never noticed that all the upscale sampling done by GPUs and TVs/monitors is so shitty ... super simple fix; weird that this has ever even been a problem ...

Oh... well, if you have 1080P you can run games at 540P, if 1440P then 720P... perfectly sharp and vibrant.

My god, I mean, at 720P you can run the latest games with this tiny 10nm iGPU, FLUIDLY.
 

joeblowsmynose

Distinguished
Oh... well, if you have 1080P you can run games at 540P, if 1440P then 720P... perfectly sharp and vibrant.

My god, I mean, at 720P you can run the latest games with this tiny 10nm iGPU, FLUIDLY.

I have 1080p and run my games at 1440p, then downscale ... have you ever done that? Looks frickin' amazing and you don't need AA. Upscaling will never look anywhere close to what can be achieved by downscaling.

I guess it's just to compensate for low performance then. I guess that makes sense, but as far as amazing visuals go, I wouldn't get too excited until you actually see it.
Seems this is similar to the new sharpening algorithm that AMD worked into Navi?
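To illustrate the contrast being drawn here: downscaling (supersampling) averages several rendered samples into each displayed pixel, which is where the anti-aliasing effect comes from, whereas upscaling has to stretch one rendered sample over several displayed pixels. A rough sketch with Pillow follows; the actual filter a GPU or driver uses will differ.

```python
import numpy as np
from PIL import Image

# Stand-in for a frame rendered at 2560x1440, destined for a 1920x1080 display.
hi_res = Image.fromarray(
    np.random.randint(0, 256, size=(1440, 2560, 3), dtype=np.uint8)
)

# Downscale: each output pixel is an average over a neighborhood of rendered
# samples (a box filter here), which smooths jaggies "for free".
downscaled = hi_res.resize((1920, 1080), resample=Image.BOX)
print(downscaled.size)  # (1920, 1080)
```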
 
Jun 23, 2019
Of course you can get more detailed visuals by rendering at an even higher resolution and downsampling to the native resolution of your display; provided that you have a really beefy and expensive gaming rig...

Yes, Intel's solution here is exactly for low power and LOW COST: the mainstream, the mobile.

Really, I honestly cannot tell that anything is lacking when gaming on the 32" 720P display I have been using for a long time. Fluidity, sharpness and vibrancy at native resolution are always the best way to game, and I would happily buy another, bigger 720P panel with better blacks if they were still being made. Instead, I was forced to save up for a 4K gaming rig for my new 4K TV (I cannot even buy a 1080P set with good specs now!). But after seeing this news, I'm giving up on that idea. See, if I can just run a game at 1080P and have it displayed perfectly on a 4K screen, why should I bother with a 4K gaming rig anymore?

I have always been against sharpening... I always turn sharpness all the way down on my TV or in the driver as much as possible, or you end up with messed-up, hairy, ghosting pixels. Based on the screenshots AMD has released, their filter has pretty much the same effect: it still looks like it is faking more detail out of less detail, and it introduces more noise and artifacts in the process too.
 
Last edited:

joeblowsmynose

Distinguished
Of course you can get more detailed visuals by rendering at an even higher resolution and downsampling to the native resolution of your display; provided that you have a really beefy and expensive gaming rig...

Yes, Intel's solution here is exactly for low power and LOW COST: the mainstream, the mobile.

Really, I honestly cannot tell that anything is lacking when gaming on the 32" 720P display I have been using for a long time. Fluidity, sharpness and vibrancy at native resolution are always the best way to game, and I would happily buy another, bigger 720P panel with better blacks if they were still being made. Instead, I was forced to save up for a 4K gaming rig for my new 4K TV (I cannot even buy a 1080P set with good specs now!). But after seeing this news, I'm giving up on that idea. See, if I can just run a game at 1080P and have it displayed perfectly on a 4K screen, why should I bother with a 4K gaming rig anymore?

I have always been against sharpening... I always turn sharpness all the way down on my TV or in the driver as much as possible, or you end up with messed-up, hairy, ghosting pixels. Based on the screenshots AMD has released, their filter has pretty much the same effect: it still looks like it is faking more detail out of less detail, and it introduces more noise and artifacts in the process too.

AMD's demos of their sharpening looked pretty great to me - maybe we should wait until either piece of hardware actually launches before drawing any conclusions? Just an idea ...

I don't like seeing pixels - I don't want to see bigger, clearer pixels, but then again I won't be gaming on a tiny, weak laptop that needs to upscale either ... so I guess this feature isn't for me.
 
  • Like
Reactions: bit_user

bit_user

Polypheme
Ambassador
In theory, yes. In practice, from the comparisons I've seen, DLSS ends up looking similar to, or worse than, regular upscaling. e.g. https://www.techspot.com/article/1794-nvidia-rtx-dlss-battlefield/
Interesting.

Yeah, I feel like they're asking DLSS to do too much, in that case. I'm skeptical it would ever do a good job of filling in texture details, and they probably tuned the loss function to minimize artifacts, in order to keep its errors from being too blatant.

However, when used with old-school graphics, I think it might have some real potential. Here's one of the first search hits I got:

https://www.theverge.com/2019/4/18/...hms-video-games-mods-modding-esrgan-gigapixel

And not quite the same thing, but related:

 
  • Like
Reactions: TJ Hooker

Jun 23, 2019
AMD's demos of their sharpening looked pretty great to me - maybe we should wait until either piece of hardware actually launches before drawing any conclusions? Just an idea ...

I don't like seeing pixels - I don't want to see bigger, clearer pixels, but then again I won't be gaming on a tiny, weak laptop that needs to upscale either ... so I guess this feature isn't for me.

OK... But imagine: wouldn't it be nice to have a lightweight laptop with long battery life and play games with perfect upscaling and fluidity on the go, just with the iGPU? And have a Thunderbolt eGPU to plug into when "really" needed. With Intel's Thunderbolt built right into the CPU, no other implementation of Thunderbolt can beat it in efficiency (speed/power consumption/latency). This is like a total solution for all my mobile/desktop/computing/gaming needs.
 
Last edited:

joeblowsmynose

Distinguished
OK... But imagine: wouldn't it be nice to have a lightweight laptop with long battery life and play games with perfect upscaling and fluidity on the go, just with the iGPU? And have a Thunderbolt eGPU to plug into when "really" needed. With Intel's Thunderbolt built right into the CPU, no other implementation of Thunderbolt can beat it in efficiency (speed/power consumption/latency). This is like a total solution for all my mobile/desktop/computing/gaming needs.

Well, maybe ... but I hate laptops and certainly wouldn't game on one ... so, like I said, I would get no advantage from this. Retro game players, I guess, would get the most benefit - I don't do that either.