Video Quality Tested: GeForce Vs. Radeon In HQV 2.0

Status
Not open for further replies.

taso11

Distinguished
Aug 27, 2008
134
0
18,690


Thanks Cleeve. I don't game on my HTPC. I bought the 5570 based on another site's HQV 2.0 results months ago. The low-profile 5750 wasn't available then, darn it. lol. I'll most likely upgrade because I "need" ;) to have the best picture quality. I run HDMI to my Denon 5308ci. I know the Denon does video processing, but I don't know exactly what it's doing. It's like a "black box". lol
 

senti

Distinguished
Jan 4, 2011
7
0
18,510
Cleeve, at the very end of page 3:
'Dynamic range can be left at the “Limited” setting, as changing this value doesn’t seem to affect the HQV test results. But if your card is powerful enough to handle it, the Full setting is more attractive and provides brighter whites and darker blacks.' That clearly refers to dynamic range, not to the dynamic contrast and color enhancements.
 

senti

Distinguished
Jan 4, 2011
7
0
18,510
Oh, and about "Dynamic Range" in the table for the GeForce 210: it can't be "disabled"; it's "default".
 

pyrrocc

Distinguished
Feb 2, 2011
9
0
18,510
Please include the latest integrated graphics chips in your next tests (e.g. 880G, ION2, Sandy Bridge, Brazos, etc.). The whole point of home theater PCs is being small, cool, and quiet.
 
People have requested cards such as the 5xx series and the Intel Sandy Bridge IGPs. You did include an older 9800GT from Nvidia; I was wondering, when you do the additional tests, whether you would slap a Radeon 48xx-series card into your testing?
 

Guest

Guest
Ah, but which cards actually allow you to select the correct refresh rate for video playback? I.e., will any of the cards actually let you lock in a multiple of 23.976 fps, such as 71.928 Hz, so that you get smooth playback?
 

mdsiu

Distinguished
Oct 1, 2010
448
0
18,860
[citation][nom]shin0bi272[/nom]but 85% of the worlds computers are in the US and so your countries dont really matter.[/citation]

LOL
 

cleeve

Illustrious
Ah, but which cards actually allow you to select the correct refresh rate for video playback? I.e., will any of the cards actually let you lock in a multiple of 23.976 fps, such as 71.928 Hz, so that you get smooth playback?

It's my understanding that only Intel graphics has a problem with 23.976 FPS, but I'll check on that.
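For context on the rates being discussed: judder-free film playback needs a refresh rate that is an integer multiple of the film frame rate of 24000/1001 ≈ 23.976 fps, which is where the 71.928 Hz figure comes from. A minimal sketch (the helper name `judder_free_rates` is made up for illustration):

```python
# Film runs at 24000/1001 ≈ 23.976 fps; judder-free playback needs the
# display refresh rate to be an integer multiple of that frame rate.
FILM_FPS = 24000 / 1001  # ≈ 23.976


def judder_free_rates(max_hz=120):
    """List integer multiples of 23.976 fps up to max_hz (illustrative)."""
    rates = []
    n = 1
    while FILM_FPS * n <= max_hz:
        rates.append(round(FILM_FPS * n, 3))
        n += 1
    return rates


print(judder_free_rates())  # [23.976, 47.952, 71.928, 95.904, 119.88]
```

Any of these rates would repeat each film frame a whole number of times; a 60 Hz output, by contrast, forces the uneven 3:2 repeat pattern.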
 

vancouverboy

Distinguished
Apr 28, 2009
36
0
18,530
To bring the chart up to date, I'd like to see GTS cards included as well, as they are sufficiently modern and affordable for HTPC builds.
 

alextheblue

Distinguished
[citation][nom]lucuis[/nom]Too bad this stuff usually makes things look worse. I tried out the full array of settings on my GTX 470 in multiple BD rips of varying quality, most very good. Noise reduction did next to nothing, and in many cases causes blockiness. Dynamic Contrast in many cases does make things look better, but in some it revealed tons of noise in the greyscale which the noise reduction doesn't remove... not even a little. Color correction seemed to make anything blueish bluer, even purples. Edge correction seems to sharpen some details, but introduces noise after about 20%. All in all, a bunch of worthless settings.[/citation]

Maybe you'd get better results with a mid-range or better Radeon. They certainly seem to have the Nvidia drivers beat when it comes to noise reduction, maybe they would allow you to tap into some of the image enhancements without introducing lots of noise.
[citation][nom]jimmysmitty[/nom]I second the test using SB HD graphics. It might be just an IGP but I would like to see the quality in case I want to make a HTPC and since SB has amazing encoding/decoding results compared to anything else out there[/citation]
Decoding, maybe. Encoding? Not even close. The dedicated encoding hardware is good for a quick (QUICK!) and dirty encode. Like if you need to downscale a 1080p video to toss on your cell phone, it will blaze through it and spit out a lower res, lower bitrate video your phone can crunch.

But for archiving and other high-res encodes? No way. Pure software produces the best-looking results. Who cares if it takes a little longer if you're archiving a Blu-ray video and storing it on a home server or NAS for future playback? AMD's hardware-assisted encodes are also very good, from what I've seen. Very close to pure software results. Nvidia's are not quite as good, but they're still a lot better than Sandy Bridge. Sandy Bridge encodes pull up the rear in terms of quality.
 

damasvara

Distinguished
Jul 20, 2010
831
0
19,060
[citation][nom]shin0bi272[/nom]but 85% of the worlds computers are in the US and so your countries dont really matter.[/citation]Congratulations... Your blind chauvinistic comment just won the "Most Quoted Post of the Article"!!!

:lol:
 

bastard

Distinguished
Jan 22, 2009
6
0
18,510
Have you installed the PureVideo HD software for the Nvidia cards?
I didn't see it mentioned in the article, but if you didn't, you should state that on the first page in bold letters.
 

Marcus52

Distinguished
Jun 11, 2008
619
0
19,010
Confirms my admittedly mostly seat-of-the-pants opinion (at least part of it): Radeons edge out GeForces in video quality, and GeForces edge out Radeons in overall gaming quality.

;)
 

emergancy exit

Distinguished
May 5, 2008
44
0
18,530
I just installed a HIS 5670 and was blown away by the color at just the default settings. Normally I HATE graphics cards' default settings and have to spend hours adjusting them; the 5670 I just left alone. Out of all these cards, it looks like the 5670 is the best value for image quality for HTPCs. Hope they make a low-profile 512 MB version.
 

cleeve

Illustrious
[citation][nom]briovaz[/nom]where is the 5770?[/citation]

I don't see the necessity. If a 5750 matches a 6800, the 5770 would be redundant.

Anything over a 5750 would deliver the same ultimate quality for a Radeon at this point.
 

haplo602

Distinguished
Dec 4, 2007
202
0
18,680
Cleeve: thanks for the edit in the cadence detection tests, but it seems the complaints got you confused :)

It's always about 24 fps video. Just the conversion from 24 fps film to 25 fps (PAL/SECAM) TV or 30 fps (NTSC) TV broadcast is different. I suggest you read the nice description on Wikipedia: http://en.wikipedia.org/wiki/3:2_pulldown#23pulldown

That's why the HQV test does not penalize the 2:2 pulldown test with fewer points (and they are correct there).
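To make the cadence concrete: in NTSC 3:2 pulldown, film frames are scanned into interlaced fields in a repeating 3, 2, 3, 2 pattern, so every 4 film frames become 10 fields (5 video frames); PAL's 2:2 "pulldown" just speeds the film up to 25 fps, one frame per two fields. A minimal sketch (the function name `pulldown_32` is made up for illustration):

```python
# 3:2 pulldown sketch: 24 fps film → ~30 fps (60i) NTSC video.
# Even-indexed film frames are held for 3 fields, odd-indexed for 2,
# so 4 film frames expand into 10 fields (5 interlaced video frames).


def pulldown_32(frames):
    """Expand a list of film frames into NTSC fields using a 3:2 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2  # alternate 3 fields, 2 fields
        fields.extend([frame] * repeat)
    return fields


film = ["A", "B", "C", "D"]  # 4 consecutive film frames
print(pulldown_32(film))     # ['A','A','A','B','B','C','C','C','D','D']
```

A cadence detector works this in reverse: it spots the repeating field pattern and reassembles the original 24 fps frames, which is exactly what the HQV cadence tests exercise.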
 

Guest

Guest
Does the new test really top out at "very good" results? It seems that in several tests there is room for improvement, yet the cards are given scores of 5. What happens when next year's cards improve in those areas? There will be no way to meaningfully compare score results. IMO, 5 should have been reserved for excellent results. In the random noise test, the reviewer awards everyone a 5 even though the GeForces are admittedly, noticeably noisier. And yet the 5 is undeserved for the higher-end Radeons, which were unable to de-block. IMO, the reviewer should take it on himself to interpret "substantially reduced" as meaning unnoticeable. It's apparent that no card returned such good results, since the Radeons couldn't de-block. The higher Radeons would then get a 4, not a 5, the GeForces should get a 3, and the test still leaves room for improvement to "unnoticeable."

Similarly, if the Radeons show visible compression artifacts, just because they set the standard doesn't mean you have to award a 5. The Radeons should have gotten a 4 here, too.

 

smhassancom

Distinguished
Feb 5, 2011
1
0
18,510
The opposite is actually closer to reality; that is, 85% of personal computers are outside of the US. In 2004-2005, only 28.4% of the world's computers were in the US.
 

WINTERLORD

Distinguished
Sep 20, 2008
1,775
15
19,815
Great article. I think, but am not sure, that a lot of the picture quality may be related to the drivers/software from the particular companies. I bet if Nvidia reads this report they might do more. Would be awesome to redo the test later down the road.
 

haplo602

Distinguished
Dec 4, 2007
202
0
18,680
[citation][nom]ChuckDarwin[/nom]Does the new test really top out at "very good" results? It seems that in several tests there is room for improvement, yet the cards are given scores of 5. What happens when next year's cards improve in those areas? There will be no way to meaningfully compare score results. IMO, 5 should have been reserved for excellent results. In the random noise test, the reviewer awards everyone a 5 even though the GForces are admittedly, noticeably noiser. And yet the 5 is undeserved for the higher end Radeons, which were unable to de-block. IMO, the reviewer should take it on himself to interpret "substantially reduced" as meaning unnoticeable. It's apparent that no card returned such good results, since the Radeons couldn't de-block. The higher Radeons would then get a 4, not a 5, the GForces, should get a 3, and the test still leaves room for improvement to "unnoticeable."Similarly, if the Radeon show visible compression artifacts, just because they set the standard doesn't mean you have to award a 5. The Radeons should have gotten 4 here, too.[/citation]

You cannot do better with subjective evaluation tests... you can only pick between good enough and not good enough from the current crop of cards. Future cards will slot in above the current ones, and current cards will score lower, simply because there is no absolute reference (OK, you could do a software render, but that wouldn't make much sense).
 

orleans704

Distinguished
Jan 19, 2010
12
0
18,510
I purchased an XFX 5750 specifically because of this review. It got returned today. I experienced noticeable flicker at the start and end of any WMV movie (where GPU acceleration was used). It flickered across the entire screen. Also flicker with YouTube videos. It did not flicker with other formats or with video players not using the GPU. XFX support was clueless. My old ATI 4670 was reinstalled: no flicker.
 

taso11

Distinguished
Aug 27, 2008
134
0
18,690
It has to do with video acceleration. Since this article, I have noticed much better picture quality with hardware acceleration off. Exactly the findings of this article. Before, I was under the impression that video acceleration was better, too. I would get frustrated looking for a solution when I got blockiness in Blu-ray playback with HD-capable, high-end graphics.
 