Video Quality Tested: GeForce Vs. Radeon In HQV 2.0

I second the test using SB HD Graphics. It might be just an IGP, but I would like to see its quality in case I want to build an HTPC. Since SB has amazing encoding/decoding results compared to anything else out there (even $500+ GPUs), it would be nice to see if it can also deliver decent picture quality.

But as for the results, I am not that surprised. Even when their GPUs might not perform the same as nVidia's, ATI has always had great image quality enhancements, even before CCC. That's an area of focus that nVidia might not see as important, but it is. I want my Blu-rays and DVDs to look great, not just okay.
 

compton

Distinguished
Aug 30, 2010
197
0
18,680
Great article. I had wondered what the testing criteria were about, and lo! Tom's to the rescue. I have four primary devices that I use to watch Netflix's streaming service. Each is radically different in terms of hardware, and they all look pretty good, but they all work differently. Using my 47" LG LED TV, I did an informal comparison of each.

My desktop, which uses a GTX 460, really suffers from the lack of noise reduction options.
My Samsung BD player looks less spectacular than the others.
My Xbox looks a little better than the BD player.
My PS3 actually looks the best to me, no matter what display I use.

I'm not sure why, but it's the only one I could pick out based on its image quality alone. Netflix streaming is basically all I use my PS3 for. Compared to it, my desktop looks good and has several options to tweak, but doesn't come close. I don't know how the PS3 would stack up in the HQV tests, but I'm thinking about giving the test suite a spin.

Thanks for the awesome article.
 

cleeve

Illustrious


That's definitely on our to-do list!

Trying to organize that one now.
 

lucuis

Distinguished
Apr 21, 2008
1,048
0
19,310
Too bad this stuff usually makes things look worse. I tried the full array of settings on my GTX 470 with multiple BD rips of varying quality, most of them very good.

Noise reduction did next to nothing, and in many cases caused blockiness.

Dynamic contrast does make things look better in many cases, but in others it revealed tons of noise in the greyscale that the noise reduction doesn't remove... not even a little.

Color correction seemed to make anything bluish bluer, even purples.

Edge correction seems to sharpen some details, but it introduces noise past about 20% (see the sketch at the end of this post).

All in all, a bunch of worthless settings.
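
For the curious: here is roughly why sharpening digs up grain. This is a minimal NumPy sketch of generic unsharp masking, not necessarily what nVidia's edge enhancement actually does; the signal, kernel, and 20% amount are all made up for illustration.

[code]
# Unsharp masking boosts anything that differs from a local average,
# and grain differs from the local average by definition.
import numpy as np

rng = np.random.default_rng(0)
edge = np.repeat([50.0, 200.0], 50)          # one clean edge in a 1-D "scanline"
noisy = edge + rng.normal(0, 2, edge.size)   # mild grain on top

blurred = np.convolve(noisy, np.ones(5) / 5, mode="same")  # 5-tap box blur
amount = 0.2                                 # roughly the "20%" slider position
sharpened = noisy + amount * (noisy - blurred)

# The edge gets steeper, but so does the grain in the flat region:
print("noise std before:", (noisy - edge)[10:40].std())
print("noise std after: ", (sharpened - edge)[10:40].std())
[/code]

The edge transition reads as extra detail, but the flat areas get noisier at the same time; that trade-off is inherent to the technique.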
 

killerclick

Distinguished
Jan 13, 2010
1,563
0
19,790
[citation][nom]jimmysmitty[/nom]Even when their GPUs might not perform the same as nVidia, ATI has always had great image quality enhancements[/citation]

ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]killerclick[/nom]ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.[/citation]

I thought they were losing; not by enough to call it a loss, but not as good as the latest nVidia refreshes. But I got a 5770 due to its power consumption: I didn't have to swap out my PSU to put it in, and that was the deciding factor for me.
 

haplo602

Distinguished
Dec 4, 2007
202
0
18,680
This made me lol...

1. Cadence tests: why do you marginalize the 2:2 cadence? These cards are not US-exclusive; the rest of the world has the same requirements for picture quality.

2. Skin tone correction: I see it as an error on the card's part to even include this. Why correct something the video's creators wanted to be the way it is? A movie is checked by video professionals for anything they don't want in there; less-than-accurate skin tones are part of the product by design. This test should not even exist.

3. Dynamic contrast: I can't help it, but the example scene with the cats had blown highlights on my laptop's LCD in the "correct" part. How can you judge that when the constraint is the display device and not the GPU itself? After all, you can output to a 6-bit LCD or to a 10-bit LCD; the card does not have to know that...
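
To make the blown-highlights point concrete, here is a toy dynamic contrast stretch over hypothetical 8-bit luma values; this is no vendor's actual algorithm, just the naive version of the idea.

[code]
# A naive stretch maps the frame's observed range onto the full output
# range; anything above the chosen white point clips to pure white.
import numpy as np

frame = np.array([30, 60, 120, 180, 220, 235], dtype=np.float64)

lo = frame.min()
hi = np.percentile(frame, 90)        # aggressive: treat the top 10% as "white"
stretched = (frame - lo) / (hi - lo) * 255.0
out = np.clip(np.round(stretched), 0, 255).astype(np.uint8)

print(out)  # the brightest value clips to 255 -- a blown highlight
[/code]

And the card indeed has no way of knowing whether the panel then throws away two more bits of that result.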
 
"obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies, and that it's frustrating to those few people who watch movies outside of the Pacific zone... As in, Europe, Africa, and parts of Asia up to and including mainland China.

It's only more than half the world's population, after all.
 

cleeve

Illustrious
[citation][nom]mitch074[/nom]"obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies...[/citation]

You misunderstand the text, I think.

To clear it up: I wasn't talking about 2:2 when I said that; I was talking about the Multi-Cadence Tests: 8 FPS animation, and so on.
 

caeden

Distinguished
Oct 14, 2009
83
0
18,640
Now, I am not sure how the Radeons stack up, but using these settings on my 9800GT made my video quality noticeably worse than normal, especially around text (white on a black background was painful), where noise got added. After messing with it a little, I found that the video defaults under the image settings looked best. Checking dynamic contrast under the color settings helped the contrast a little, but not much, and I think it adds noise, so I am going back to the defaults there as well.

More important than your driver settings is calibrating your monitor. PC monitors are very different from HDTVs (higher brightness and lower contrast, square pixels to optimize text over video, and most of us have TN panels, which are very limited in color range), but in spite of their shortcomings, they can still look quite good if properly adjusted.

A note to those of you putting your DVDs on your computer: there is a very powerful program called AVS that can do much of this cleanup for you during the ripping process (see the sketch at the end of this post). It allows for smaller file sizes with good compressors (H.264 for storage, Lagarith for editing), as well as making upscaling less of a problem. It is not a particularly intuitive tool, but there are many good tutorials out there on it. It is especially handy with old DVDs that were not encoded well in the first place, or with extra-long movies (Lord of the Rings) that were encoded poorly in order to fit the content on the disc.
As for Blu-ray content, the quality you get will depend mostly on the software you are using; the driver options (at least for my card) hinder more than help.
For HD content in general, you will notice a huge difference between a nicely encoded music video you download and Netflix HD streaming. That is partly because Netflix is (for the most part) only 720p, and partly because they use an automated process that works better for some movies than others. Their biggest flaw seems to be color banding. Netflix is still worlds better than cable and other web services, but you are just not going to get BD quality.
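
Since AVS scripts are a language of their own, here is the same clean-up-during-the-rip idea sketched with a different tool, ffmpeg, driven from Python. The file names and the hqdn3d denoiser strengths are hypothetical; treat it as an outline rather than a recipe.

[code]
# Denoise once during the rip, then encode with H.264 for storage,
# so the noise never eats bitrate in the final file.
import subprocess

SRC = "title01.vob"          # hypothetical decrypted DVD source
OUT = "title01_clean.mkv"

subprocess.run([
    "ffmpeg", "-i", SRC,
    "-vf", "hqdn3d=4:3:6:4",                             # spatial + temporal denoise
    "-c:v", "libx264", "-crf", "18", "-preset", "slow",  # H.264 for storage
    "-c:a", "copy",                                      # leave the audio alone
    OUT,
], check=True)
[/code]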
 

cleeve

Illustrious
[citation][nom]haplo602[/nom] cadence tests ... why do you marginalise the 2:2 cadence ? these cards are not US exclusive. The rest of the world has the same requirements for picture quality.[/citation]

True enough, although I'm a US writer primarily tasked to write for the US audience. I do think readers from PAL countries can understand the implications, however.

[citation][nom]haplo602[/nom] 2. skin tone correction: I see this as an error on the part of the card to even include this. why are you correcting something that the video creator wanted to be as it is ? I mean the movie is checked by video profesionals for anything they don't want there. not completely correct skin tones are part of the product by design. this test should not even exist.[/citation]

I understand the argument, but frankly, the fact that the user is free to turn it off is good enough for me. Regardless of whether or not we agree with its inclusion, it is something that some high-end video processors offer and is therefore a point of comparison.

[citation][nom]haplo602[/nom] 3. dynamic contrast: cannot help it, but the example scene with the cats had blown higlights on my laptopt LCD in the "correct" part. how can you judge that if the constraint is the display device and not the GPU itself ? after all you can output on a 6-bit LCD or on a 10-bit LCD. the card does not have to know that ...[/citation]

For me, it was sufficient to use a good-quality display to test the different graphics hardware. Frankly, the HQV test can be used for displays, too, but with our test display capable of showing contrast enhancement without overexposure, I think that's the best way to evaluate the video processor. Whether other folks have worse displays is somewhat outside the scope of the test.
 

shin0bi272

Distinguished
Nov 20, 2007
1,103
0
19,310
[citation][nom]mitch074[/nom]"obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies, and that it's frustrating to those few people who watch movies outside of the Pacific zone... As in, Europe, Africa, and parts of Asia up to and including mainland China.It's only worth more than half the world population, after all.[/citation]

But 85% of the world's computers are in the US, so your countries don't really matter.
 

intelx

Distinguished
Sep 17, 2009
176
0
18,680
[citation][nom]shin0bi272[/nom]but 85% of the worlds computers are in the US and so your countries dont really matter.[/citation]

Can you prove your statement?
 

andrewcutter

Distinguished
Sep 10, 2009
179
0
18,690
[citation][nom]shin0bi272[/nom]but 85% of the worlds computers are in the US and so your countries dont really matter.[/citation]

Why am I not surprised by this statement? I hope you are not representative of your whole country.
 

aldaia

Distinguished
Oct 22, 2010
533
18
18,995
[citation][nom]shin0bi272[/nom]but 85% of the worlds computers are in the US and so your countries dont really matter.[/citation]

I'm not sure if he is a troll or simply a dumbass (probably both). The opposite is actually closer to reality: roughly 85% of personal computers are outside the US. In 2004-2005, only 28.4% of the world's computers were in the US, and since then, countries like China and India have grown significantly faster than Western countries, where the market was already saturated five years ago. By the way, the countries with the most computers per capita are Switzerland (864,584 per million people), San Marino (857,143 per million), and Sweden (763,012 per million).
 

haplo602

Distinguished
Dec 4, 2007
202
0
18,680
[citation][nom]Cleeve[/nom]True enough, although I'm a US writer primarily tasked to write for the US audience. I do think readers from PAL countries can understand the implications, however.[/citation]

I do not dispute the results; they speak clearly. I do have a problem with you downplaying an nVidia disadvantage just because you are a US resident. The tone of the comment was not neutral (i.e., it did not just note that one cadence is used for 60Hz markets and the other for 50Hz markets).

nVidia does not ship separate non-US driver versions or non-US cards, so they should be blamed for the missing feature in this case.

I just want to make you aware that some of your comments might not come across well to non-US residents. After all, the Internet knows no borders :)
 

rhino13

Distinguished
Apr 17, 2009
590
0
18,980
Let's be honest, though: who would use anything but an nVidia 5xx series for video? They're just so loud.

I know that everyone complains about ATI drivers, but I continue to be more and more impressed with them. Those guys are putting in the hours.
 

senti

Distinguished
Jan 4, 2011
7
0
18,510
Tom's again trying to review something video-related? Your incompetence is just sad. Here is a clear example: the "Dynamic range" setting.

If you know anything about digital video, it's obvious that this setting selects which kind of range to use: the so-called "PC range," where all values from 0 to 255 describe different colors, and the "TV range," a legacy of the analog days, where values below 16 are all pure black and values above 235/240 are the same pure white. Most video today still uses TV range.

From the previous paragraph, it's clear that this option should be set to match the kind of video you are watching. Select it wrong and you'll get poor contrast or lose details in dark/bright areas. The effect is clearly visible, unlike some obscure noise reduction, and it has zero impact on processing time, so "if your card is powerful enough to handle it" sounds ridiculous.
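
To spell out the arithmetic: the expansion is a single multiply per pixel, which is why the "powerful enough" framing is silly. A minimal sketch of the luma conversion (real players also convert chroma, which is limited to 16-240, and may round differently):

[code]
# Map 8-bit limited/TV-range luma (black=16, white=235) onto full/PC
# range (0-255). Applying this to video that is already full-range is
# exactly the "lose details in dark/bright areas" failure: everything
# below 16 and above 235 gets crushed into pure black or pure white.
import numpy as np

def limited_to_full(y):
    y = np.asarray(y, dtype=np.float64)
    return np.clip(np.round((y - 16.0) * 255.0 / 219.0), 0, 255).astype(np.uint8)

print(limited_to_full([16, 126, 235]))  # -> [  0 128 255]
[/code]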

And the funniest thing: the English nVidia drivers somehow swapped the strings "full range" and "limited range" (in Russian, for example, they are placed correctly). Only someone completely incompetent in digital video could fail to notice that in a review about video quality and the relevant driver settings.
 

misiu_mp

Distinguished
Dec 12, 2006
147
0
18,680
"Most films are recorded at 24 FPS and this is converted with the 3:2 pulldown cadence, while the 2:2 cadence is only used in countries following the PAL and SECAM standards that shoot film destined for television at 25 FPS. As such, I question the wisdom of assigning these two cadences the same point value. The 3:2 cadence should be worth more points."

All my DVDs are PAL. Then again, I don't live in the USA, so I don't exist...
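
For anyone unsure what the quoted paragraph means in practice, here are the two cadences as field sequences. This is simplified (real pulldown alternates top and bottom fields), but it shows the repeating pattern a cadence detector has to lock onto.

[code]
# Film frames A, B, C, D become interlaced video fields.
def pulldown_32(frames):
    """24 fps film -> 60 fields/s (NTSC): frames get 3, 2, 3, 2, ... fields."""
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (3 if i % 2 == 0 else 2)
    return fields

def pulldown_22(frames):
    """25 fps film -> 50 fields/s (PAL/SECAM): every frame gets 2 fields."""
    return [frame for frame in frames for _ in range(2)]

print(pulldown_32("ABCD"))  # 4 frames -> 10 fields, i.e. 24 fps -> 60 fields/s
print(pulldown_22("ABCD"))  # 4 frames ->  8 fields, i.e. 25 fps -> 50 fields/s
[/code]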
 