News Microsoft Unveils Its Own Version of Nvidia's RTX Super Resolution

Status
Not open for further replies.

bit_user

Titan
Ambassador
I'd hazard a guess that the resolution limit is due to not wanting to burn too much power upscaling videos that would benefit less from the process. I hope it's configurable.

I also really hope it includes a decent deinterlacer, since some DVDs aren't simply 24p and there's plenty of terrestrial broadcast that's still 1080i.

And what's the deal with DRM-protected content? That would include not just many DVDs, but pretty much any video from Netflix, Amazon, etc., right? Pretty big limitation, there.
 

ThisIsMe

Distinguished
May 15, 2009
197
51
18,710
Microsoft's implementation is designed specifically to reduce the amount of internet bandwidth required to stream videos to your PC, and is limited to just 720p videos or lower. Microsoft is using an AI upscaler to do the work, focusing on removing blocky compression artefacts to improve image quality.

VSR's limitation to sub-HD resolutions (for now?) targets customers with bad internet connections and older videos recorded before 1080p and 4K became the norm.

FYI - 720p is HD resolution

HD = 720p = 1280x720 = High Definition
FHD = 1080p = 1920x1080 = Full High Definition
QHD = 1440p = 2560x1440 = Quad High Definition (4xHD)
UHD = 2160p = 3840x2160 = Ultra High Definition (4xFHD)

Often 4K and UHD are used interchangeably, and sometimes UHD is used to refer to 4K+HDR.
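
If you want to sanity-check those 4x figures, the pixel math is trivial. Here's a quick sketch (purely illustrative arithmetic, nothing specific to Microsoft's or Nvidia's upscalers):

Code:
# Quick check of the "4x" relationships between the common 16:9 tiers.
tiers = {
    "HD":  (1280, 720),
    "FHD": (1920, 1080),
    "QHD": (2560, 1440),
    "UHD": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in tiers.items()}

print(f"QHD / HD  = {pixels['QHD'] / pixels['HD']:.0f}x")   # 4x
print(f"UHD / FHD = {pixels['UHD'] / pixels['FHD']:.0f}x")  # 4x
print(f"UHD / HD  = {pixels['UHD'] / pixels['HD']:.0f}x")   # 9x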
 

evdjj3j

Distinguished
Aug 4, 2017
371
396
19,060
FYI - 720p is HD resolution

HD = 720p = 1280x720 = High Definition
FHD = 1080p = 1920x1080 = Full High Definition
QHD = 1440p = 2560x1440 = Quad High Definition (4xHD)
UHD = 2160p = 3840x2160 = Ultra High Definition (4xFHD)

Often 4K and UHD are used interchangeably, and sometimes UHD is used to refer to 4K+HDR.

I came to say the same thing: 720p is HD; they don't even try. Time for a lesson, authors: 720p is HD, and Intel's first dGPU was the i740, from 1998.
 
  • Like
Reactions: PEnns

bit_user

Titan
Ambassador
Often 4K and UHD are used interchangeably,
I hate it when people use "2k" to refer to 2560x1440. It's not; that's more like "2.6k"! If "4k" is 3840x2160, then "2k" should mean 1920x1080, as that's exactly half in each dimension. Better just not to use "2k" at all, since using it to mean 1920x1080 at this point will likely invite confusion.

and sometimes UHD is used to refer to 4K+HDR.
I can't endorse that, either. Resolution shouldn't be conflated with HDR.

Intel's first dGPU was the i740 from 1998.
Except they weren't even called GPUs, back then.
 
Last edited:

edzieba

Distinguished
Jul 13, 2016
589
594
19,760
FYI - 720p is HD resolution

HD = 720p = 1280x720 = High Definition
FHD = 1080p = 1920x1080 = Full High Definition
QHD = 1440p = 2560x1440 = Quad High Definition (4xHD)
UHD = 2160p = 3840x2160 = Ultra High Definition (4xFHD)
1920x1080 is HD. 720p was a compromise (like EDTV) pushed by marketers wanting to sell lower quality display panels rather than by standards bodies setting transmission and storage resolutions (e.g. SMPTE, ITU-R, DCI).
The situation was similar to the "HDR" screens available now that can accept a High Dynamic Range input, but only display it either squashed or clipped to fit within a standard dynamic range (and maybe with some global dimming thrown in to make things look even worse).

Often 4K and UHD are used interchangeably, and sometimes UHD is used to refer to 4K+HDR.
Completely false.
UHD = 3840x2160 broadcast standard resolution. Standardised before HDR was even a consideration.
4K = 4096x2160 Digital Cinema Initiatives standardised resolution (and other requirements, such as encoding, subsampling, etc).
'4K' = Colloquially adopted term for consumer UHD. Maybe it markets better, who knows, but we're stuck with the pointless confusion now.
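
If it helps to make the distinction concrete, here's a quick sketch of the container arithmetic (DCI also specifies encoding, subsampling and more, which raw numbers obviously don't capture):

Code:
# Consumer UHD vs. the DCI 4K container: same height, different width and aspect ratio.
formats = {
    "UHD (broadcast)": (3840, 2160),
    "DCI 4K (cinema)": (4096, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w}x{h}, aspect {w / h:.3f}:1, {w * h / 1e6:.2f} Mpx")

# UHD (broadcast): 3840x2160, aspect 1.778:1, 8.29 Mpx
# DCI 4K (cinema): 4096x2160, aspect 1.896:1, 8.85 Mpx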
 
  • Like
Reactions: voyteck
720p is definitely "HD" resolution, just like 1080p is "FHD" and so forth.

https://www.tomshardware.com/reviews/what-is-hd,5745.html

2160p being called "4K" is 100% a marketing gimmick; it sells better than saying "this device is in full 2160p". Video resolutions are measured on their vertical axis because that's how scanline rendering works: it's the number of lines required for a single frame. The horizontal resolution is the vertical line count multiplied by the aspect ratio (4:3, 16:9, 16:10, etc.).
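
To put that multiplication in concrete terms, here's a quick sketch (purely illustrative):

Code:
# Horizontal resolution = vertical line count * aspect ratio.
from fractions import Fraction

def width_from_lines(lines: int, aspect: Fraction) -> int:
    return int(lines * aspect)

print(width_from_lines(1080, Fraction(16, 9)))   # 1920
print(width_from_lines(2160, Fraction(16, 9)))   # 3840
print(width_from_lines(1200, Fraction(16, 10)))  # 1920
print(width_from_lines(480,  Fraction(4, 3)))    # 640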

For those interested here is an organized list of resolutions and their actual names.

https://en.wikipedia.org/wiki/Graphics_display_resolution
 

g-unit1111

Titan
Moderator
Another crappy app store feature?

funny-kill-it-with-fire.gif
 

bit_user

Titan
Ambassador
This might be the worst comment I've read in my entire life.
Apparently that one hit close to home. What a way to call yourself out, though.

Anyone who refers to 1920x1080 as 2k and 2560x1440 as 2.6k should leave the internet,
I don't. It was an appeal for consistency. In particular, using the term "2k" to mean 2560x1440 is highly misleading, because the width is 2/3rds of 3840 which most people seem to call "4k", regardless of whether you think they should.

Mostly, what I want is for nobody to use the term "2k". Ever.

PS: 3840x2160 is not 4k, 4096x2160 is true 4k but that would require a read up on resolutions which would be too easy.
I wouldn't mind seeing people stop calling it 4k, but that doesn't seem realistic. So, at least be consistent.
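
To put numbers on the consistency point: if "Nk" is read as the horizontal pixel count divided by 1000 (which is how 3840x2160 gets called "4k" in the first place), a quick sketch gives:

Code:
# "k" shorthand as horizontal pixels / 1000.
resolutions = [(1920, 1080), (2560, 1440), (3440, 1440), (3840, 2160)]

for w, h in resolutions:
    print(f"{w}x{h} -> ~{w / 1000:.1f}k")

# 1920x1080 -> ~1.9k   (so "2k", if anything, is this one)
# 2560x1440 -> ~2.6k
# 3440x1440 -> ~3.4k
# 3840x2160 -> ~3.8k   (rounded up to "4k" by marketing)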
 

bit_user

Titan
Ambassador
Video resolutions are measured on their vertical axis because that's how scanline rendering works and is the number of lines required for a single frame.
That's only true since we moved past the NTSC/PAL era. I think the main reason it got adopted as a shorthand is that it was the number which came right before the "i" or the "p", which indicates whether the signal is interlaced or progressive.

If you go back further, you'd see people talking about "lines of resolution" in a very different way. Because the vertical resolution was set by the video standard (486 visible lines in NTSC, 576 in PAL & SECAM) the only variable was the level of horizontal detail you could see. And that was determined by the signal bandwidth (e.g. transmission, video tape, digital video sampling frequency, etc.).

To be specific, lines of resolution determined the maximum horizontal frequency you could discern (I forget how much attenuation was acceptable) per picture-height.



It's irrelevant for our current purposes, but anyone who's into vintage video gear (including retro gaming) will perhaps appreciate the history lesson. You might see some gaming console or video mode described this way, and now you'll understand exactly what it means.
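
For the curious, here's a rough back-of-the-envelope sketch of that per-picture-height normalization (ballpark figures; the exact bandwidth and attenuation criteria varied with the measurement method):

Code:
# "Lines of resolution" (TVL) normalizes horizontal detail to the picture height,
# so a 4:3 source only gets credit for 3/4 of its horizontal samples.
def tvl_from_samples(h_samples: float, aspect_w: int, aspect_h: int) -> float:
    return h_samples * aspect_h / aspect_w

print(tvl_from_samples(720, 4, 3))   # DVD (720 samples/line): 540 TVL
print(tvl_from_samples(352, 4, 3))   # VCD (352 samples/line): 264 TVL

# For an analog signal, the resolvable detail comes from bandwidth x active line time.
# NTSC broadcast luma is roughly 4.2 MHz over a ~52.7 us active line:
samples = 2 * 4.2e6 * 52.7e-6             # ~443 resolvable samples per line
print(tvl_from_samples(samples, 4, 3))    # ~332 TVL, close to the usual ~330 figure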
 
Apparently that one hit close to home. What a way to call yourself out, though.


I don't. It was an appeal for consistency. In particular, using the term "2k" to mean 2560x1440 is highly misleading, because the width is 2/3rds of 3840 which most people seem to call "4k", regardless of whether you think they should.

Mostly, what I want is for nobody to use the term "2k". Ever.


I wouldn't mind seeing people stop calling it 4k, but that doesn't seem realistic. So, at least be consistent.
What really doesn't make sense is that Newegg and other e-tailers classify 3440x1440 ultrawide as 2k. Not that I endorse this terrible naming scheme, but it seems like 3440x1440's roughly 5 million pixels merit a "k" rating somewhere between 2k and 4k. But again, I base this on nothing more than my opinion.
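
For what it's worth, the raw pixel counts back that up (just arithmetic):

Code:
# 3440x1440 ultrawide sits well above QHD and at roughly 60% of UHD.
for label, (w, h) in {"FHD": (1920, 1080), "QHD": (2560, 1440),
                      "UWQHD": (3440, 1440), "UHD": (3840, 2160)}.items():
    print(f"{label:6} {w}x{h}: {w * h / 1e6:.2f} Mpx")

# FHD 2.07, QHD 3.69, UWQHD 4.95, UHD 8.29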
 
  • Like
Reactions: bit_user

RedBear87

Commendable
Dec 1, 2021
150
114
1,760
"We suspect more GPUs will be supported down the line, but this will depend on how GPU-intensive Microsoft's AI upscaler will be to run. "

It can be pretty intensive, from my limited testing, in particular if you go down to 288p or below: an old YouTube video at 384x288 pushed my RTX 2060 to 90% load consistently.
 

edzieba

Distinguished
Jul 13, 2016
589
594
19,760
Are you sure you're not thinking of 1366x768?
No.
1280x720 was the intermediate pushed by panel manufacturers in an attempt to co-opt the transition to 1080i broadcast, by offering a system that used basically the same pixel clock rate (and thus the same broadcast hardware) but a lower panel resolution, dangling the carrot of progressive rather than interlaced scan as an incentive to water down the standard by allowing cheaper panels and the omission of deinterlacing processors. It was rapidly rendered moot by 1080p being adopted midway through the 1080i rollout (except in the US, where the insistence on ATSC slowed things down vs. the rest of the world on DVB or MUSE), but sadly we are still stuck with the malformed remnants of the concept pretending to be 'HD'.
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
762
1,760
Because clearly the tons of JavaScript bloat and ad garbage don't make the internet laggy enough yet.

Thank god this battery-draining crap is not on Firefox.
 

bit_user

Titan
Ambassador
Intel "dibbled" in GPU discreet going back to the early -80's with co-processers with graphic engines
Huh? Do you mean the i860? That was very late 80's (1989).



I really liked the idea of it, but I never had to deal with its various quirks and rough edges. In some ways, you might even look at it as an ancestor of projects like Larrabee or perhaps Falcon Shores.

It's easy to forget, but early-'80s coprocessors were still limited to things like basic floating point arithmetic (e.g. the 8087). There were no graphics coprocessors in the early '80s, only entire boards full of chips.
 

bit_user

Titan
Ambassador
1920x1080 = 1k
2560x1440 and 3440x1440 = 2k.
Yeah, and that's just dumb.

You know why. Because people say so. Who are you to tell others what they can or can't say.
Not everything has to be pinpoint technically accurate if 99.9% of consumers can get behind the same agreement.
I'm just calling out nonsense where I see it. I think that misnaming is due to intellectual laziness. And it's not 99.9%, either.

You don't have to agree with me, of course. However, I'd point out that my posts in this thread have some votes and yours don't. A touch of humility seems in order.

while 1 person sits in the back whining and kicking about something so tiny and irrelevant.
In science and engineering, details matter. So does logic and consistency. What might seem innocuous and trifling to you jumps out to me as a glaring inconsistency.

So no, we won't call 1440p 2.6k and we will continue to call it 2k.
I originally wrote 2.5k, FWIW. But call it what you want. I just spoke my mind about the issue, which it's my right to do. I'm not forcing anyone to do anything, but you should consider that I'm as entitled to my opinion as you are to yours.
 
Last edited:

PiranhaTech

Reputable
Mar 20, 2021
136
86
4,660
Sounds good. 480p and 720p are nice, mostly because of Comcast's data caps, which I'm more likely to hit because I run videos in the background while working.
 

voyteck

Reputable
Jul 1, 2020
58
27
4,570
It doesn't make much sense to use "p" after 1080 etc. in contexts other than broadcasting. In PC display technology it's just Full HD, (W)QHD, UHD/Ultra HD (not to be confused with 4K), or 1920x1080, 2560x1440, etc.
 

bit_user

Titan
Ambassador
It doesn't make much sense to use "p" after 1080 etc. in contexts other than broadcasting.
It serves the purpose of telling the reader that you're referring to image height. There are a couple reasons you might want to do that. First, as a shorthand. The second I mention below.

In PC display technology it's just Full HD, (W)QHD, UHD/ Ultra HD (not to confuse with 4K)
I'd be interested in seeing a poll to gauge how many of us know that. Back in the analog days, I remember a lot of people seeming to think SVGA meant 1024x768, but that was actually XGA; SVGA was just 800x600.

When you're just referring to specific resolutions, it's easy enough to write them out. Where the screen height becomes a useful shorthand is when you're collectively referring to a class of resolutions, such as using 1440p to refer to both QHD and WQHD.
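
Here's a minimal sketch of what I mean by that grouping (the labels are just the common informal ones):

Code:
# Height-based shorthand: different widths collapse into the same class.
def height_class(width: int, height: int) -> str:
    return f"{height}p"

for w, h in [(2560, 1440), (3440, 1440), (1920, 1080), (2560, 1080)]:
    print(f"{w}x{h} -> {height_class(w, h)}")

# Both 2560x1440 and 3440x1440 come out as "1440p";
# both 1920x1080 and 2560x1080 come out as "1080p".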

Anyway, I'm not dug-in on these points. Just some thoughts & observations.
 