Intel will add integer scaling support in its graphics driver by the end of August.
Intel Gen11 Graphics Will Support Integer Scaling : Read more
"I don't get it ...."

It's so that you can run a lower resolution that divides evenly into your screen's native resolution while making it look as if that lower resolution were the display's native one. It effectively makes the pixels larger without needlessly blurring them in the process, which is arguably preferable for keeping the image sharp. Without integer scaling, running a game at 1080p on a 4K screen, for example, will typically look a bit worse than running it on a 1080p screen of the same size, because pixels bleed into one another during upscaling. That blurring is necessary for resolutions that do not divide evenly into your screen's native resolution, but for those that do, you are probably better off without it.
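A minimal sketch of the idea in NumPy (purely illustrative; the function name and the 2x example are mine, not Intel's implementation): each source pixel is repeated k x k times, so no new colors are blended in and the image just gets blockier, not blurrier.

```python
import numpy as np

def integer_upscale(image: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor integer upscale: repeat every pixel factor x factor times."""
    # image is (height, width, channels); duplicate rows, then columns, with no blending
    return image.repeat(factor, axis=0).repeat(factor, axis=1)

# Example: a 1920x1080 frame doubled onto a 3840x2160 panel (1080p -> 4K)
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_upscale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)
```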
"AMD and Nvidia employ bilinear or bicubic interpolation methods for image scaling. However, the final image often turns out blurry, too smooth or too soft. The integer scaling method should help preserve sharpness and jagged edges. Pixel-graphic game and emulator aficionados will surely love the sound of that."

Actually, Nvidia recently added DLSS, which is fundamentally a very sophisticated image scaling algorithm.
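For contrast, a rough sketch of why bilinear upscaling looks soft (a naive illustration, not AMD's or Nvidia's actual scaler): every output pixel is a weighted average of its four nearest source pixels, so a hard one-pixel edge comes back as a ramp of in-between shades.

```python
import numpy as np

def bilinear_upscale(image: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Naive bilinear resample of an (H, W, C) image: blend the 4 nearest source pixels."""
    in_h, in_w = image.shape[:2]
    ys = np.linspace(0, in_h - 1, out_h)   # fractional source row for each output row
    xs = np.linspace(0, in_w - 1, out_w)   # fractional source column for each output column
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]          # vertical blend weights
    wx = (xs - x0)[None, :, None]          # horizontal blend weights
    img = image.astype(float)
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# A hard one-pixel black/white edge turns into a ramp of grays:
edge = np.zeros((2, 2, 1))
edge[:, 1] = 255.0
print(bilinear_upscale(edge, 2, 4)[0, :, 0])  # roughly [0. 85. 170. 255.]
```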
"According to Pearce, the older Intel parts with Gen9 graphics lack hardware support for nearest neighbor algorithms."

At some level, this is utter nonsense. All GPUs support nearest-neighbor sampling. The only difference is that they're probably doing the scaling while sending the video signal out to the monitor, whereas a software approach would need to add a post-processing pass after the framebuffer has been rendered.
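To illustrate that point, a nearest-neighbor pass over a finished frame is just a few lines of array indexing (NumPy here purely for illustration; a driver would do the equivalent in the display pipeline or a shader): each output pixel indexes the closest source pixel, with no blending involved.

```python
import numpy as np

def nearest_resample(framebuffer: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Nearest-neighbor resample of a finished frame to the display resolution."""
    in_h, in_w = framebuffer.shape[:2]
    # Map every output pixel back to its closest source pixel; no interpolation at all.
    src_y = np.arange(out_h) * in_h // out_h
    src_x = np.arange(out_w) * in_w // out_w
    return framebuffer[src_y][:, src_x]

# e.g. putting a 1280x720 frame onto a 1920x1080 panel (a non-integer 1.5x factor)
frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)
print(nearest_resample(frame_720p, 1080, 1920).shape)  # (1080, 1920, 3)
```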
So this gives more jagged edges while upscaling? That doesn't sound useful ... at all! Unless you need a pixelated image - like the one they posted in the article? I don't get it ....
Edit: I guess if smoothing were applied after the upscaling ... but wouldn't that defeat the feature? Someone explain, maybe I'm just missing something ...
So, this is a way to render downscaled images without performing mode switching? And the "missing hardware implementation" would be a way to reduce framebuffer size when doing so? What a non-event... Especially since the retro games and emulators that would be interested in that would implement nearest neighbor interpolation anyway.
And what about non-integer scaling? Say, a game running at 800x600 displayed on a 4K screen? There's no way to do an integer upscale without getting black borders...
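For what it's worth, here is that 800x600-on-4K case worked through (plain arithmetic, not any vendor's documented behavior): the largest whole-number factor that fits is used, and whatever is left over becomes the border.

```python
# 800x600 source on a 3840x2160 (4K UHD) panel
src_w, src_h = 800, 600
disp_w, disp_h = 3840, 2160

factor = min(disp_w // src_w, disp_h // src_h)         # min(4, 3) = 3
scaled_w, scaled_h = src_w * factor, src_h * factor    # 2400 x 1800
border_x = (disp_w - scaled_w) // 2                    # 720 px of black on the left and right
border_y = (disp_h - scaled_h) // 2                    # 180 px of black on the top and bottom
print(factor, scaled_w, scaled_h, border_x, border_y)  # 3 2400 1800 720 180
```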
No, the only use case I see for this "feature" is when plugged into a buggy screen that doesn't support mode switching to one's preferred rendering resolution, to allow GPU-side upscaling using a nearest-neighbor algorithm. And if Intel were dumb enough to implement only bilinear or bicubic interpolation algorithms in their chips up until now, then it's nothing to brag about: they should be ashamed instead.
"Actually, Nvidia recently added DLSS, which is fundamentally a very sophisticated image scaling algorithm."

In theory, yes. In practice, from the comparisons I've seen, DLSS ends up looking similar to or worse than regular upscaling, e.g. https://www.techspot.com/article/1794-nvidia-rtx-dlss-battlefield/
Err... No, even at 80 inches, it's not easy to tell "native" 1080P and 4K apart. BUT when displaying 1080P on a 4K screen, everything just becomes a blurry mess with no real contrast.
I don't have a 4K screen, so I guess I never noticed that all the upscaling done by GPUs and TVs/monitors is so shitty ... super simple fix; weird that this has ever even been a problem ...
Oh... well, if you have 1080P you can run games at 540P, if 1440P then 720P... perfectly sharp and vibrant.
My god, I mean, at 720P you can run the latest games with this tiny 10nm iGPU, FLUIDLY.
Of course you can have more detailed visuals by rendering at an even higher resolution and downsampling it to the native resolution of your display, provided that you have a really beefy and expensive gaming rig...
Yes, Intel's solution here is exactly for low power and LOW COST, the mainstream, the mobile.
Really, I honestly cannot tell that anything is lacking when gaming on the 32" 720P screen I have been using for a long time; fluidity, sharpness and vibrancy at native resolution is always the best way to game, and I would go and buy another, bigger 720P set with better blacks anytime if they were still being made. Instead, I was forced to save up for a 4K gaming rig for my new 4K TV (I cannot even buy a 1080P set with good specs now!). But after seeing this news, I'm giving up on that idea: if I can just run a game at 1080P and display it perfectly on a 4K screen, why should I bother with a 4K gaming rig anymore?
I have always been against sharpening... I always turn sharpness all the way down on my TV or in the driver as much as possible, or you get messed-up, hairy, ghosting pixels. Based on the screenshots released by AMD, it has pretty much the same effect: it still looks like it is faking more detail out of less detail, and it introduces more noise and artifacts in the process too.
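To illustrate why sharpening can produce those halos, here is a generic unsharp-mask sketch (not AMD's actual algorithm, just the textbook technique): sharpening adds back the difference between the image and a blurred copy, and that difference overshoots on both sides of an edge.

```python
import numpy as np

def unsharp_mask_1d(row: np.ndarray, amount: float = 1.0) -> np.ndarray:
    """Generic unsharp mask on one row of pixel values (illustration only)."""
    padded = np.pad(row.astype(float), 1, mode="edge")
    blurred = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0   # 3-tap box blur
    return row + amount * (row - blurred)                       # add back the "detail"

edge = np.array([50, 50, 50, 200, 200, 200], dtype=float)  # a simple step edge
print(unsharp_mask_1d(edge))
# roughly [ 50.  50.   0. 250. 200. 200.] -- overshoot below 50 and above 200 on
# either side of the edge, which shows up on screen as dark/bright halo fringes
```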
"In theory, yes. In practice, from the comparisons I've seen, DLSS ends up looking similar to or worse than regular upscaling, e.g. https://www.techspot.com/article/1794-nvidia-rtx-dlss-battlefield/"

Interesting.
"I have 1080p, run my games at 1440p and downscale ... have you ever done that? Looks frickin amazing and you don't need AA. Upscaling will never ever look anywhere close to what can be achieved by downscaling."

It's often referred to as FSAA, SSAA, or over-sampling.
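A minimal sketch of the downscaling side (a simple box filter over an integer ratio; real SSAA/FSAA resolve filters vary, and 1440p to 1080p is a non-integer 1.33x ratio that needs a more general filter): render high, then average each k x k block down to one output pixel, which is where the AA-like smoothness comes from.

```python
import numpy as np

def box_downscale(render: np.ndarray, factor: int) -> np.ndarray:
    """Average every factor x factor block of the high-res render down to one output pixel."""
    h, w, c = render.shape
    assert h % factor == 0 and w % factor == 0, "render size must be a multiple of the target"
    blocks = render.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# e.g. render at 3840x2160 and average 2x2 blocks down to a 1920x1080 output
render = np.random.randint(0, 256, (2160, 3840, 3)).astype(float)
print(box_downscale(render, 2).shape)  # (1080, 1920, 3)
```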
AMD's demos of their sharpening looked pretty great to me - maybe we should wait until either vendor's hardware actually launches before drawing any conclusions? Just an idea ...
I don't like seeing pixels - I don't want to see bigger, clearer pixels, but then again I won't be gaming on a tiny, weak laptop that needs to upscale either ... so I guess this feature isn't for me.
OK... But imagine: wouldn't it be nice to have a lightweight laptop with long battery life and play games with perfect upscaling and fluidity on just the iGPU while on the go, and have a Thunderbolt eGPU to plug into when "really" needed? And with Intel's Thunderbolt built right into the CPU, no other Thunderbolt implementation can beat it in efficiency (speed/power consumption/latency). This is like a total solution for all my mobile/desktop/computing/gaming needs.