Video Quality Tested: GeForce Vs. Radeon In HQV 2.0


caeden

Distinguished
Oct 14, 2009
83
0
18,640
So I am wondering: since using the driver settings gives worse quality than the software settings, how do these tests match up when using software settings? Note that the software setting does not render the video "in software"; rather, the player gets to use its own cleanup algorithms, which still run on the cards but are more specialized than the basic cleanup built into the driver. That's why there are so many quality differences depending on whether you run your video through WMP, older versions of WMP, QuickTime, VLC, etc. So which software runs best? And which cards does it run best on?

I think those results would be much more interesting, and much more practical, both for everyday use and for high-end video.
 

rgladiator

Distinguished
Feb 2, 2011
1
0
18,510
I think it's important to point out that not all of these cards support HDMI 1.4a. Even though the AMD 6850 and 5750 both score the same, only the 6850 supports the full 1.4a spec. Important to know if you're buying now.
 

+1

It's too bad Tom's didn't use an 880G/890GX motherboard in the article; I would also like to see how the HD 4250/HD 4290 IGPs stack up against the discrete cards.
 

deanjo

Distinguished
Sep 30, 2008
113
0
18,680
[citation][nom]killerclick[/nom]ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.[/citation]

There should be a big, huge asterisk on that statement. Many HTPCs run Linux, and in that arena AMD gets stomped all over, since its video playback capabilities are very poor in any OS other than Windows.
 

campb292

Distinguished
Mar 18, 2010
50
0
18,630
This story suffers badly from a lack of calibration understanding. There are very few, if any, models for calibrating with forced "noise reduction" and "edge enhancement". Proper calibration generally refers to an attempt to replicate the video signal's intent as closely as possible. Testing a manipulated signal is severely flawed.
 

senti

Distinguished
Jan 4, 2011
7
0
18,510
caeden, it is indeed very player-dependent. Usually you want to do all the processing in your player software while making sure the video drivers don't wreck anything (like the skin color correction mentioned), so the best video card here is the one that shows the video as-is, without any processing. That can be very CPU-demanding, but it's the price of better quality.

If your CPU is not fast enough, that's when you turn to the inferior but usually fast hardware implementations. That makes sense for computationally heavy tasks like deinterlacing.

If you are interested in software processing, take a look at AviSynth and the ffdshow interface for it.
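
If you just want to see what a software chain looks like, here is a rough sketch using FFmpeg driven from Python instead of AviSynth/ffdshow (purely an illustration on my part; yadif and hqdn3d are standard FFmpeg filters, but the file names are made up):

[code]
# Sketch only: CPU-side deinterlace + denoise with FFmpeg (not the article's method).
import subprocess

SRC = "noisy_interlaced_source.mpg"   # hypothetical input file
DST = "cleaned_progressive.mkv"       # hypothetical output file

# yadif deinterlaces in software and hqdn3d is a spatio-temporal denoiser;
# both run on the CPU, so quality costs CPU time rather than driver features.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-vf", "yadif=mode=1,hqdn3d",      # deinterlace (one frame per field), then denoise
    "-c:v", "libx264", "-crf", "18",   # re-encode the cleaned result
    DST,
], check=True)
[/code]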
 

cleeve

Illustrious


Not the kind of video you're watching, but rather the kind of display you're using. Monitors tend to be 0-255 and TVs tend to be 16-235 (although some newer TVs can switch between them).

I've cleared this up with some edits.
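
For anyone curious what that difference means in practice, here's a rough sketch of expanding limited-range (16-235) levels to full range (0-255); the helper name and clamping are just my own illustration:

[code]
# Sketch: mapping 8-bit limited-range luma (16-235) to full range (0-255).
def limited_to_full(y: int) -> int:
    y = min(max(y, 16), 235)             # clamp to the nominal video range
    return round((y - 16) * 255 / 219)   # 219 = 235 - 16 usable steps

print(limited_to_full(16), limited_to_full(235))   # black -> 0, reference white -> 255
[/code]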


 

preolt

Distinguished
Oct 31, 2010
264
0
18,810
[citation][nom]rootheday[/nom]Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?.[/citation]

They won't run any more tests on the new CPUs and chipsets until the bugs are fixed and they are ready to be put back on the market. So you're gonna have to wait a while for those results. I found some on Overclockers, but they were taken down.
 

cleeve

Illustrious


I have no problem doing a Sandy Bridge review. The problem is only with the 3 Gb/s SATA ports; the 6 Gb/s ports are just fine.

That's the extent of the issue. Using the 6 Gb/s ports completely bypasses it.
 

bluestar2k11

Distinguished
Feb 1, 2011
145
0
18,680
I second the call for testing the Sandy Bridge IGP and the GeForce 5x0 cards. After all, the Radeon 6xx0 is there, so why not the GeForce?

I can understand in a way why the SB may not have been tested, but considering the importance Intel has placed on the IGP's video abilities, I think it's worth testing anyway.
 

IzzyCraft

Distinguished
Nov 20, 2008
1,438
0
19,290
Interesting, but I use madVR with a 3D LUT, so none of those settings would do anything on my computers even if turned on. Also, what about the default settings, since that's what most people are likely to leave them at? On my 5770, edge enhancement is at 10 by default and brighter whites is on. I would look up the Nvidia defaults for my 8800 GTS, but that card is in another computer right now.
 

Chris_TC

Distinguished
Jan 29, 2010
101
0
18,680
[citation][nom]campb292[/nom]This story suffers badly from lack of calibration understanding. There are very few, if any, models for calibration with forced "noise reduction" and "edge enhancement". Proper calibration generally refers to an attempt to replicate the video signal intention as close as possible. Testing a manipulation of a signal is severely flawed.[/citation]
This sums it up. Instead of turning on a bunch of nonsense features like edge enhancement, noise reduction, skin correction, and so on, you should educate your readers that the correct setting is the one that doesn't change the source material.
 

cleeve

Illustrious


I disagree. That assumes source material is perfect.

Low-resolution and poorly compressed video will show a great deal of improvement with many of these features.

Even older films that haven't been properly cleaned up before digitization will improve a lot with noise reduction and dynamic contrast.

To suggest it's nonsense without considering these scenarios is unrealistic and maybe even a little bit of videophile snobbery. ;)
 

senti

Distinguished
Jan 4, 2011
7
0
18,510
Cleeve, even if you interpret it that way, the edit is still no good. A PC doesn't provide a "wider range of brightness between white and black" than a TV; it only provides better precision within the same range.

And applying that processing to everything, as you indicated at the beginning (a user won't change driver settings for each video), IS nonsense. More than that, for really low-quality SD you need far more aggressive filtering than the drivers provide.
 

cleeve

Illustrious


I think you might be nitpicking here. Better precision means more levels, and that can certainly be termed a wider range.



Totally disagree with you here, based on my experience with the way these features can actually clean up sub-par video.

I simply leave the features enabled all the time. They certainly don't make good video look bad, but they can definitely make sub-par video look much better.
 

senti

Distinguished
Jan 4, 2011
7
0
18,510
Cleeve, yes, I might be nitpicking here. It's just that "wider range of brightness between white and black" sounds to me like "better contrast". The ridiculous "if your card is powerful enough to handle it" remark about full range still remains.

It's hard to argue about the quality of post-processing because it's very subjective. But I can say for sure that, for example, custom scaling of SD using nnedi improves quality on a whole different level compared to what you can get from the driver's "make everything better" options. On the other hand, applying skin color correction and sharpening to a well-mastered BD seems to me like ruining the director's intent: red people should be left red.
 

thegreathuntingdolphin

Distinguished
Nov 13, 2009
256
0
18,780
I question the need for many of these settings. I had many of them turned on, and most of my videos looked like crap. After turning them off, my videos looked much closer to the source material.
 

Guest

Guest
"Most films are recorded at 24 FPS and this is converted with the 3:2 pulldown cadence, while the 2:2 cadence is only used in countries following the PAL and SECAM standards that shoot film destined for television at 25 FPS. As such, I question the wisdom of assigning these two cadences the same point value. The 3:2 cadence should be worth more points."
This is not true; you don't understand how the transfer from film to TV works. PAL cannot use a 3:2 transfer for 24 fps films the way NTSC does, because PAL is a 50i system. So the 2:2 cadence is used for 24 fps films as well: the 24 fps film is played faster, at 25 fps, and then converted to the 50i system.
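
To make the difference concrete, here's a toy sketch of the two cadences (my own illustration, not something from the article):

[code]
# Toy comparison of the two film-to-TV cadences.
FILM_FPS = 24

# NTSC (60i) 3:2 pulldown: film frames alternate between 3 and 2 fields,
# so 4 film frames become 10 fields and 24 fps reaches ~60 fields/s with no speedup.
pulldown_pattern = [3, 2, 3, 2]
print(FILM_FPS * sum(pulldown_pattern) / len(pulldown_pattern))   # 60.0 fields/s

# PAL (50i) 2:2: the film is sped up to 25 fps and each frame becomes exactly 2 fields,
# which is why PAL transfers run slightly fast and sound a touch higher in pitch.
pal_fps = 25
print(f"speedup: {pal_fps / FILM_FPS - 1:.2%}")   # ~4.17% faster
print(pal_fps * 2)                                # 50 fields/s
[/code]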
 

thegreathuntingdolphin

Distinguished
Nov 13, 2009
256
0
18,780
I am glad color correction was included in the review. Although one could argue that the source is what the director intended, that is not always the case. In cheaper films, or films that have not been preserved well, the prints might have serious color issues. Also, just because something has been released on Blu-ray or DVD does not mean a good job was done in transferring it. There are tons of really crappy transfers that kill flesh tones or make them look too red.
 

silverblue

Distinguished
Jul 22, 2009
1,199
4
19,285
[citation][nom]xsc[/nom]This is not true, you don't understand how transfer from film to tv works. PAL cannot use 3:2 transfer for 24fps films like NTSC, because PAL is 50i system. So 2:2 cadence is used for 24 fps films too. 24 fps film is played faster - at 25 fps and than converted to 50i system.[/citation]

When playing DVDs, I use WinDVD's PAL TruSpeed setting, which drops the frame rate from 25 to 24. At 25 fps the film is noticeably quicker and higher in pitch, so I prefer to watch it as intended.
 

cleeve

Illustrious


I don't think it does. What page are you talking about?

The only reference to that I can see refers to dynamic contrast and color enhancements on the GeForce driver settings page.




That's certainly a valid opinion. I think it's equally valid to consider that poorly mastered source material might benefit from the feature. Regardless of what you or I think its value is, it's worth testing the feature for those who find it useful. I don't think it would be right to ignore the test entirely because we might not agree with its usefulness. It's part of the HQV 2.0 suite, and I tested the suite. I think that's the right thing to do in this case.
 

taso11

Distinguished
Aug 27, 2008
134
0
18,690
I think campb292 hit the nail on the head. This review was meant to test video quality reproduction, not to make a bad source look better. Those are two separate things. The settings were chosen to get the best score in HQV 2.0 so that the scores for the different cards could be compared.

On a side note: I currently have a 5570 in my low-profile HTPC. Do you guys think upgrading to a 5750 would be worth it for the better image quality? I mainly watch Blu-ray content. Thanks!
 

cleeve

Illustrious


I wouldn't bother, not for good Blu-ray source material. You probably wouldn't notice any difference.

You might make a case for the 5750 if you want to clean up poorly compressed source material, or if you want to step up the system's gaming ability.
 