How Has Nvidia Managed to Push 60Hz 4K Over HDMI 1.4?

Status
Not open for further replies.

Quarkzquarkz

Distinguished
Sep 18, 2013
445
18
18,965
70
@WiseCracker, no, not 75%. Keep in mind that compression loss and resolution distortion are two completely different things. In this case, the chroma samples are cut from 4:4:4 to about half, so you're essentially looking at a very poor 1440p. And not even 2K at that; you'd expect something between 1K and 2K, and nobody respects downsampling at any rate.
 
:lol: 36 pixels 'compressed' to 9 pixels = 4:1 = 75% reduction

Feel free to rationalize that any way you wish in your defense of pseudo "Voodoo 4k"
If you had read his post you would have seen he was simply correcting your misinformation, not trying to defend anything.
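The sample-count arithmetic in the exchange above can be sketched quickly. The scheme names and per-block sample counts come from the standard 4:x:x definitions; the helper function itself is purely illustrative:

```python
# Illustrative sketch: count samples carried per 2x2 pixel block for the
# common chroma-subsampling schemes (values from the standard definitions).
def samples_per_2x2(scheme):
    # (luma, chroma) samples for every 2x2 block of pixels
    table = {
        "4:4:4": (4, 8),  # full-resolution Cb and Cr
        "4:2:2": (4, 4),  # chroma halved horizontally
        "4:2:0": (4, 2),  # chroma halved both ways: one Cb + one Cr
    }
    luma, chroma = table[scheme]
    return luma + chroma

full = samples_per_2x2("4:4:4")  # 12 samples per block
sub = samples_per_2x2("4:2:0")   # 6 samples per block
print(f"total bandwidth saved:  {1 - sub / full:.0%}")  # 50%
print(f"chroma samples dropped: {1 - 2 / 8:.0%}")       # 75%
```

So both posters have a point: chroma alone is cut by 75%, but the overall signal shrinks by 50%, which is exactly what lets 4K60 squeeze into HDMI 1.4's bandwidth.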
 

Ikepuska

Reputable
Jun 23, 2014
3
0
4,510
0
I object to the phrase "a lot of video files are even encoded with the 4:2:0 preset in order to reduce the file size". Chroma subsampling is a STANDARD, and MOST video files, including the ones on your commercially produced Blu-ray movies, are encoded with it.

This is actually a really useful corner case for things like HTPCs, or if you have a 50 ft HDMI run to your TV or projector, because for video content there really is no loss of fidelity. But for desktop use it's just a gimmick.
 

matt_b

Distinguished
Jan 8, 2009
653
0
19,010
11
Well, at least on the surface Nvidia has a superior marketing claim here; no doubt that's all they care about anyway. Just let DisplayPort and HDMI 2.0 take over and do it right; there's no sense in milking an old standard that can't.
 

Keyrock42

Honorable
Jan 15, 2014
8
0
10,510
0
Anyone hooking up a 4K TV or monitor really should do their research and make sure it has a DisplayPort input (why some 4K TVs or monitors are even manufactured without one is beyond me). That said, it's nice that there is at least a dirty hack like this available for those who need to connect to a 4K TV/monitor via HDMI. It's far from ideal, but better than nothing, I guess.
 

hannibal

Distinguished
Apr 1, 2004
2,472
86
19,890
14
This is good for video/films, not so good for text-based material. It's a usable alternative for some material and doesn't require new hardware. But yeah, DisplayPort is much better!
 

CaptainTom

Honorable
May 3, 2012
1,563
0
11,960
68
Idk, I would try to find a different compression method. Color is very important, and I personally don't think 4K is enough of an upgrade over my color-accurate 1080p IPS monitor. That monitor actually felt like a decent upgrade over my old LED TV due to the vibrant colors and black levels alone...
 

Blazer1985

Honorable
May 21, 2012
206
0
10,690
1
It is not about color precision, it is about color compression, as someone noted. The human eye can notice luma changes between 4 pixels, but not so much changes in the color information. Proof is that you'll get 4:4:4 10-bit (almost) only on cinema screens, while TV documentaries are filmed in 4:2:2 8-bit at most, and you watch them (even movies) after a 4:2:0 compression anyway. I could post some links about it but they would be so boring :-D
 

wuzelwazel

Honorable
Jun 18, 2013
2
0
10,510
0
I think most of this has been covered already but it's important enough to mention again.

The chroma in a video signal is far less important than the luma: human vision is much, much more sensitive to changes in brightness than to changes in color. In addition, there is no loss of color depth, only a loss of resolution in the least important part of the signal. And the effective chroma resolution at 4:2:0 on a 4K display is 1920x1080, which is by no means low.

Of course 4:4:4 would be the best option but I'd call 4:2:0 a no-brainer to allow double the refresh rate for some users.
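The effective-resolution figure above checks out with simple integer math; this helper is a hypothetical illustration, not anything from a real API:

```python
# Illustrative helper: the chroma grid each subsampling scheme carries
# for a given panel resolution. 4:2:0 halves chroma in both dimensions;
# 4:2:2 halves it horizontally only.
def chroma_resolution(width, height, scheme="4:2:0"):
    if scheme == "4:2:0":
        return width // 2, height // 2
    if scheme == "4:2:2":
        return width // 2, height
    return width, height  # 4:4:4

print(chroma_resolution(3840, 2160))           # (1920, 1080)
print(chroma_resolution(3840, 2160, "4:2:2"))  # (1920, 2160)
```

In other words, the color detail of a 4:2:0 UHD signal lands on a 1080p grid while luma stays at full 3840x2160.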
 

Draven35

Distinguished
Nov 7, 2008
806
0
19,010
9
If you have black text on a white background, or vice versa, it will look fine in 4:2:0, since there is a luma sample for every pixel. It's when either the foreground or the background isn't black or white that reading text becomes a problem.
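This is easy to demonstrate with a minimal NumPy sketch, assuming BT.601 conversion coefficients (the function name is made up for illustration): it simulates 4:2:0 by averaging the chroma planes over 2x2 blocks and upsampling them back. Luma is untouched, so a grayscale edge survives the round trip exactly, while a colored edge gets smeared:

```python
import numpy as np

# Hypothetical sketch of a 4:2:0 round trip (BT.601 coefficients assumed).
# Chroma is averaged over 2x2 blocks and repeated back up; luma is untouched.
def simulate_420(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma, one sample per pixel
    cb = 0.564 * (b - y)                    # chroma planes...
    cr = 0.713 * (r - y)
    for c in (cb, cr):                      # ...downsampled 2x2, upsampled back
        h, w = c.shape
        blocks = c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        c[:] = np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)
    b2 = y + cb / 0.564                     # invert the conversion
    r2 = y + cr / 0.713
    g2 = (y - 0.299 * r2 - 0.114 * b2) / 0.587
    return np.stack([r2, g2, b2], axis=-1)

# Black-on-white vertical edge: chroma is zero everywhere, so it survives.
mono = np.zeros((4, 4, 3))
mono[:, 2:] = 1.0
print(np.allclose(simulate_420(mono), mono))  # True
```

Run the same function on, say, red text over a blue background and the output no longer matches the input, which is exactly the colored-text blur described above.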
 

SteelCity1981

Distinguished
Sep 16, 2010
1,129
0
19,310
12
It's all just a marketing ploy by Nvidia to say they were able to push 4K at 60Hz first on their GPUs, even though it's really a half-assed way of doing it on HDMI 1.4. Until HDMI 2.0 comes out on GPUs, I'll pass.
 

nottorp

Reputable
Mar 16, 2014
5
0
4,510
0
I think they can do better! Let's go back to the 16 EGA colors! Then we'll be able to have 16K displays over HDMI 1.0!
 

geok1ng

Distinguished
Jun 25, 2008
108
0
18,690
2
Gaming at 4:2:0 would lower image quality to below console levels. 4:2:0 is good for video and video only. Also, 4:2:0 breaks subpixel text rendering like ClearType.
 

ravewulf

Distinguished
Oct 20, 2008
931
1
18,985
0
Almost all video uses 4:2:0 (including DVD, Blu-ray, online video sites, video cameras, TV/cable, etc.), with the exception of DV video (think old MiniDV tape cameras), which uses 4:1:1. Usually only high-end professional video uses 4:4:4, and even that is later exported as 4:2:0.
 