Do AMD's Radeon HD 7000s Trade Image Quality For Performance?


I don't think it's necessarily that. Reviews are supposed to give insight and, hopefully, not bias, but this entire article screams of bias against AMD, starting with a misleading title over what turns out to be a simple software setting.
 
[citation][nom]therabiddeer[/nom]Is it just me, or is Tom's heavily biased towards Nvidia? We see tons of articles for the Nvidia 6xx but very few for the 7xxx. Nothing negative for Nvidia, but an article like this for AMD's, which is already being fixed even though it is undetectable... and the fix doesn't even yield a real change in framerates.[/citation]

You are right. I felt the same way. In all AMD reviews, they'll bring up a negative point. And there are tons of round-up articles for Nvidia cards from different board partners, but there is not even a single one for AMD.
 
[citation][nom]PCgamer81[/nom]Come on! The 7870 comes pretty darn close to a single 6970. But considering that it has 30% fewer stream processors, fewer texture units, less memory bandwidth, etc., but still performs very close to the 6970, well... concessions had to be made. It ultimately comes down to 900 > 800, irrespective of series. What did you expect?[/citation]Read the conclusion, dammit. Driver fix! Problem solved! No concession.

See THIS is why the headline needs to change. It's not a bad article, and it's not that I don't want tech rags to look at IQ. I agree that image quality is important, but the headline gets stuck in people's heads. They can't be bothered to read the whole article, and next thing you know it's parroted all over the internet by fanboys.
[citation][nom]noob2222[/nom]So is this part of Don's or Tom's testing methods, not to check the settings before just jumping right into benchmarking?[/citation]Now we have the OTHER end of the didn't-read-the-article spectrum. The "setting" in question that caused the texture quality issue? It's internal to the driver! Only AMD can fix it, and they DID fix it. It is NOT a setting Tom's can play around with: the quality slider did NOT fix the image quality at DEFAULT, and furthermore, jacking up the quality slider impacts performance. The driver fix, on the other hand, does NOT significantly change performance.

This is important, and the article was important because forcing vendors to acknowledge and fix image quality issues prevents them from sacrificing quality for speed across the board, for better benchmark scores. The issue I have with the article is the misleading title, because they should absolutely know by now that the average Joe fails at reading comprehension.
 
I own a 7970, and I have to say the early drivers have been absolute trash. It's possible they tried to inflate their benchmarks, but I think they were just in too much of a hurry to beat nVidia to market. Slight image quality variations were probably the least of their concerns... That being said, it is articles like this that keep companies honest. Don't be such haters.

Also, for all you people saying that Tom's loves nVidia:
1) Check out ANY of their best-graphics-cards-for-the-money articles; they are pretty much pure AMD all the time.
2) Honestly, I doubt anyone who has tried both a launch 7970 and a launch 680 could be anything but an nVidia fanboy.
 


After a while, some people get tired of reading AMD-bashing over and over and over. I thumbed through it to see that the entire issue was driver- and setting-related. That still doesn't change the fact that the entire article was written in an anti-AMD tone all the way through. It's not necessarily reading comprehension but stupidity to start off bashing AMD for four pages before actually saying, "by the way, here is the issue." By then you're probably already pissed off at AMD, and then you find out the issue is the way the article is written. So yeah, I'm now pissed at the person writing the article.

It should have been a quick 1-2 page article simply stating that the "quality" setting for the Radeon 7xxx series is now functioning.

Did Intel get an 8-page article on the socket 1155 motherboard issues? Did Nvidia get an 8-page article on the GPUs that would "unsolder" themselves?

No, but AMD gets an 8-page article because early drivers had a very minor, nearly undetectable issue with one of the settings, an issue that is GETTING FIXED. (Hence the electron microscope comment.)
 
I'm sure I saw a similar range of articles when AMD brought out the 5000 series a few years ago. It seems whenever AMD brings out a good range of cards, they get slammed.

I guess this is down to whoever has the most vicious marketing/PR team.

I remember some guy posted close-up screenshots of the same scene from an Nvidia and an AMD card and pointed out the so-called differences. I couldn't for the life of me tell the difference. I concluded that some folks have far too much time to waste.
 
Well, as for AMD or Nvidia wanting to 'massage' the FPS figures at release, I can see why.

When you read the release-review comments, it's surprising how many folks who spent $500+ on the then-current top-of-the-range card five months earlier are keen to buy the next-fastest card (BY A WHOLE 2 FPS) over the one they already have.
 
But the NVidia image that is shown here is pixelated, as if it is not properly anti-aliased. And now AMD has fixed their drivers to make stuff look the same. Is there maybe some room for personal preference in these debacles?
 
[citation][nom]scotty99[/nom]It does not matter the series, you nitwit; they are saying the 7000 series MIGHT have lost picture quality to gain benchmark numbers. I cannot tell a difference between my GT 240 and GTX 465: an enormous difference in performance, but image clarity is the same (this is how it should be).[/citation]

Err... two series on, there should be improved image quality...
 
Very interesting read.
Personally, I don't mind vendors playing around with image-quality rendering algorithms; however, yes, they should let users know and offer an option, if that's what this was.
Anyone remember the days of ASCII art and grey-scale weighting using different characters?
A quick reminder: characters that filled up more of the screen in pixels, like a G versus, say, the full stop, were the difference between black and less black.
It really is a type of shading from a limited palette of characters.
Lots of people used different techniques/characters that they thought were better or worse at shading for representing the image. Then came the use of dots, and we got the famous Floyd-Steinberg dither patterns and others that were applied to limited colour palettes to trick the eye into believing there were more colours.
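For anyone who never played with it, here's a minimal sketch of the Floyd-Steinberg idea in Python (a toy 1-bit grayscale version I've cooked up purely for illustration; the function and demo names are mine, and it's nothing like what a real GPU driver actually does):

[code]
# Toy Floyd-Steinberg error-diffusion dithering: quantize each grayscale
# pixel to pure black or white, then push the rounding error onto the
# neighbours not yet visited (7/16 right, 3/16 below-left, 5/16 below,
# 1/16 below-right).
def floyd_steinberg(pixels):
    h, w = len(pixels), len(pixels[0])
    for y in range(h):
        for x in range(w):
            old = pixels[y][x]
            new = 255.0 if old >= 128 else 0.0  # nearest of the two "colours"
            pixels[y][x] = new
            err = old - new
            if x + 1 < w:
                pixels[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    pixels[y + 1][x - 1] += err * 3 / 16
                pixels[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    pixels[y + 1][x + 1] += err * 1 / 16
    return pixels

# A flat 50% grey block comes out as a checker-like pattern that the eye
# averages back to grey: more apparent shades than the palette really has.
demo = [[128.0] * 8 for _ in range(8)]
for row in floyd_steinberg(demo):
    print("".join("#" if p else "." for p in row))
[/code]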

Maybe that's what was happening here.
Perhaps the devs were fooling around with different texture algorithms to try and improve anti-aliasing even further and make stuff look less pixelated.
Yep, it should have been an option to try, like turning on/off Floyd-Steinberg dithering was but I don't mind them trying new things either if it is better.
Good spot, Don! Keep up the great work, Tom's. And AMD, keep improving, so the digital future continues to look and feel more real. That's what I would like to see 😉
 
I don't think Don is pro-AMD or pro-NV ...

Interesting how a Tom's article causes a major manufacturer to push out a new driver... that's got to be good for the users, whatever their fanboi persuasion.

 
No way would I have noticed at home had I upgraded to a 7870. Still, it would probably influence my decision if I was purchasing a card right now.

Great catch, Tom's; thanks for the thorough testing to show those screenies. Glad it was just a driver problem.
 
[citation][nom]jgu143[/nom]Thanks Toms for keeping these guys honest.[/citation]
Honest?? That comment isn't honest. You imply that they were purposely misleading people. If you read the article, even with the slant, it is obvious that it was an HONEST oversight, one that was promptly fixed after it was brought to their attention.
 
Funny how AMD has a driver oversight, promptly fixes it after learning about it, and they still get a misleading article title leading passers-by to think that AMD is cheating.

NVidia has 80% lower compute performance for the sake of 10% improved game framerates, yet they are the 'king'.

If that isn't bias, what is?
 
Obviously these differences can't be noticed when someone is playing a game. Who on earth will stop a game, take a still screenshot, and then compare it to a 6000-series image? What's the purpose of all this? It just makes me laugh, guys. I really respect the TH team, but this article just gives 'food' to fanboys. Even AMD hadn't noticed anything until the article came out! Haha.
 
[citation][nom]therabiddeer[/nom]Is it just me, or is Tom's heavily biased towards Nvidia?[/citation]

No, they are not, but they are beaming mind control rays at your head.
 
I think the title of this article is a bit sensational.

That being said, I fail to see the cause of the outrage. If Tom's didn't point this out to AMD, would anyone? Would it ever be fixed?

It's not a big difference in quality. Neither is the difference between $5.00 and $4.99. But if you take away one cent at a time, each step is hardly noticeable and before long you're left with jack shit. So while I agree that the headline and sometimes the tone of this article smell like yellow journalism, the point about quality is entirely valid.
 
For those who are bashing AMD, why don't you go after Intel's Sandy Bridge integrated GPUs? I've read on AnandTech that their image quality in games such as Portal 2 failed DX9 standards.
 
Tom's Hardware does PR for nVidia. I've been reading their video card articles for years, and there is always a bias against Radeon and AMD cards. And you can also see it in their video charts!!!!! BIAS!!!
 
[citation][nom]therabiddeer[/nom]Is it just me, or is Tom's heavily biased towards Nvidia? We see tons of articles for the Nvidia 6xx but very few for the 7xxx. Nothing negative for Nvidia, but an article like this for AMD's, which is already being fixed even though it is undetectable... and the fix doesn't even yield a real change in framerates.[/citation]

I'm not sure about that one. I mean, for a long while, AMD cards have been the most recommended in their Best Graphics Cards for the Money articles. Who knows.
 
Why can't we get comparisons on images that are more naturally recognizable, such as human faces? Then there would be less hemming and hawing about which has the correct quality. The human face and its features are the most readily identifiable of all images for us. If it's a picture of bark, then some might say it should be pixelated and some might say it should be heavily anti-aliased.
 
Thank you, Don, for checking on this. The only way we ever get truly accurate comparisons of GPUs is through scientific testing based upon controlled testing scenarios, and that means anything that affects IQ has to be treated with the utmost seriousness. While some may see this as Nvidia bias (it's not), I thank you all for your hard work. Tom's is a great resource for us all; keep up the good work.
 