Do AMD's Radeon HD 7000s Trade Image Quality For Performance?

Status
Not open for further replies.
Thanks for all this work. It's good to keep all companies honest.

And it was a relief to know that AMD didn't intentionally sacrifice quality for performance.

But I have to agree with those saying that the title makes it sound like the answer is "yes", or at least that's what people will think (evidence: read some of the comments).
 
It could do with a more accurate title, but I've come to expect misleading titles here at Tom's, especially after that "get a free iPhone 4S" article that turned out to be an outright lie.

As for the article, it wasn't bad. It showed that AMD made a mistake and fixed it without complaining, and that's something to admire. As for the AMD bashers here: the problem was solved. SOLVED. There is no trade-off to fix it either. The title is misleading, so get over yourselves.

For the people who managed to convince themselves that there is an Nvidia bias here: NO. When a site practically recommends buying an AMD card over an Nvidia card at almost every price point, there is not a bias towards Nvidia. Yes, there are more articles about the 680 NOW. Back when the Radeon 7970 first came out, there were plenty of articles about it, and a lot of people complained about the surplus of AMD articles despite the fact that there was simply nothing from Nvidia to write about, so the AMD cards were all there was.



AMD got what might be their entire GCN lineup out before Nvidia got a single Kepler card out. Hopefully there will be a successor to the 7970 in the form of a 7980 or something like that, but there probably won't be, despite Tahiti having 2304 shaders with the 7970 only using 2048 of them, and despite the 7970's low clock frequency. AMD isn't a company out to hold the performance crown. AMD's focus is, and has been for quite a while, on the low-end market, with the high end only being an afterthought in comparison. AMD makes far more money in the low end, where they have practically no competition from Nvidia that isn't more than two or three years old. It would be nice to see AMD's low-end lineup consisting of GCN or at least VLIW4 cards for the Radeon 7600 and below, but at least new cards come out (even if they are rebrands, they still have new features), whereas Nvidia's only DX11 low-end offerings are GT 400 cards.

[citation][nom]SuperVeloce[/nom]could not agree more! There is no reason for a single person to defend any capitalist company, regardless of product or profits they make.[/citation]

So if someone is wrongfully bashing a company, there is no reason to defend it from that person - even if, while defending it from the wrongful bashing, you still criticize it for the things it actually deserves to be bashed for? For example, some people go on and on about how AMD CPUs are absolute garbage and should never be used for any reason. However, for highly threaded performance, AMD CPUs are currently better than Intel CPUs at all price points where AMD has CPUs.

For anyone who uses a lot of highly threaded software (archiving, web browsing with some web browsers with several highly CPU intensive tabs, rendering, media work, servers, compiling, etc), AMD is a far better option if their budget doesn't allow for an i7 or better. AMD even provides a more energy efficient option for some of their 95w CPUs than Intel's competing 95w CPUs for this type of work (specifically, 95w FX-8xxx, 6xxx, and Phenom II x6s).

I think that it's obvious that there can be good reason to defend a company in some cases and they aren't even very rare cases.

[citation][nom]fyasko[/nom]it seems as though tom's readers are pro amd/ati based on the thumbs down.[/citation]

Actually, the thumbs down are because the article paints AMD in an unfair light. Okay, so there was a very minor issue that was so minor that it went almost completely unnoticed and even in the blown up pictures, is difficult for many readers here to see (I had no trouble seeing it, but I'm usually good at that. Whether or not I would've seen it during gameplay is much less likely). However, not only did AMD fix the problem without much fuss, but they did so without sacrificing performance as the title suggests. The article starts off on a seemingly AMD bashing tone, but really, reading through the whole article seems to dispel that.

That a previous AMD generation was in the comparison and considered equal to the previous Nvidia cards kinda throws out any bias against AMD (at least, it does in my opinion). That the GTX 680 wasn't included is a little annoying, but it seems as though it does not have such a problem, so no big deal there either. In fact, I'm surprised that none of the fanboys mentioned this to flame the other posters and/or the article.
 
[citation][nom]rantoc[/nom]Its not unusual that image quality suffers to provide performance boosts, good article. Proves who cares more about img quality over performance. Ohh and the quality winner also provide the best performance for a cheaper price with the 680gtx. I would call that a field day, bought two and have em in sli. Got tiered of the 7970's small issues here and there with the Amd drivers. I hope they improve and Amd will have a very nice contender, the hardware is really nice but the driver support for new releases are lagging![/citation]

Did you bother to try the new Catalyst 12.3 and up drivers on those 7970s? Besides that, the image quality fix for the GCN Radeons did not decrease performance. In any case, we all knew what purchasers of Radeon 7000 cards were getting into when they bought them before there was proper driver support.
 
I am in college, I write papers all the time, and I try to come up with witty and interesting titles. This title, in my opinion, was crafted to catch the reader's eye. IN MY OPINION, if someone who was a tad slow read just the title of this article, they might conclude that AMD is cheating. But the article was helpful and useful to me, and I did not feel that it was misleading at all. I like Tom's Hardware. DO I AGREE WITH TOM'S ALL THE TIME? NO! But I do sometimes. You cannot make everyone happy; someone is always going to feel wronged no matter how hard you try.

THIS ARTICLE IN NO WAY MAKES ME FEEL THAT AMD IS NOT AN OUTSTANDING A+ COMPANY; every company makes mistakes. Nvidia's super-hot 400 series is one that comes to my mind - that was bad.

AMD IS STILL AN EXCELLENT COMPANY!

Thank you Toms staff for always trying to be genuinely helpful.
 

You really think that AMD didn't already have the fix in the works? I guess it's possible, considering you have to have side-by-side machines to examine the output pixel by pixel.

The problem is, it's comments like this that make the article stink of AMD bashing.

a workaround is purportedly possible.

A workaround? Fixing a slight glitch in a driver setting is not a workaround; it's an oversight correction, if anything.

And as for the title, there was NO PERFORMANCE DIFFERENCE between the fixed driver and the bugged setting driver.
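As an aside, the kind of side-by-side, pixel-by-pixel screenshot check the comments above describe can be sketched in a few lines. This is a generic illustration, not the article's actual methodology; the function name and the example file names are hypothetical:

```python
import numpy as np

def diff_stats(img_a, img_b):
    """Compare two equally sized H x W x 3 uint8 screenshots pixel by pixel.

    Returns (percentage of pixels that differ at all, max per-channel delta).
    """
    a = np.asarray(img_a, dtype=np.int16)  # widen so subtraction can't wrap
    b = np.asarray(img_b, dtype=np.int16)
    delta = np.abs(a - b)
    changed = np.any(delta > 0, axis=-1)   # True where any channel differs
    return changed.mean() * 100.0, int(delta.max())

# In practice the arrays would come from captured frames, e.g. (hypothetical files):
#   from PIL import Image
#   a = np.asarray(Image.open("driver_old.png").convert("RGB"))
#   b = np.asarray(Image.open("driver_fixed.png").convert("RGB"))
#   pct, peak = diff_stats(a, b)
```

A check like this only flags that frames differ; judging whether a difference is a quality regression still takes the blown-up visual comparisons the article used.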
 
In the last image you claim that "the Radeon HD 6970 demonstrates inferior [anisotropic] filtering compared to the 7000 series". I beg to differ. There's a huge distortion (ring near the bottom right corner), much larger than the one that shows with the 6970. If anything, it's the 6970 that is closer to the reference image.
 



Seriously? You really just said that? :pfff:
They explicitly said they set CCC to identical settings for all the cards - as in, DEFAULT. The DEFAULT setting produced lower texture quality on the 7000 series than on the 6000 series. Now, with the fix, they are the same again. Get it?

Why are they comparing the 6900 series vs the 7800 series?

Tom's said. "We discovered blurry textures when we reviewed the Radeon HD 7800s", and, "Why does the Radeon HD 6900 series demonstrate crisper image quality?"

Why not compare the 6900 series to the 7900 series? Or better yet, the 6800 series to the 7800 series? The very fact that they don't tells me that the 7800 is beaten only by the 6900 series in terms of image quality, which is to be expected. After all, 6800s vs. 7800s would be far more relevant.

*sniff sniff*

I smell something...

What? Performance-wise, the 7870 is faster than a 6970. Not to mention they clearly included the 7970 in all the screenshots.

Image quality should never have to be sacrificed on lower-end cards. Setting for setting, they should produce identical high-quality images, with the only difference being framerate, whether you look at a 6570 or a 7970. Any quality changes should be made by the user, by lowering game settings or reducing driver settings ("Performance" vs. "HQ").

On topic: Good article, glad to see it's a simple fix. Sucky for AMD that they will likely get some bad PR for this, seems like a pretty honest mistake given that the totally fixed driver has identical performance - there's no reason for them to have done it on purpose. Not surprising to see the rage in the comments though. Nvidia fanboys calling out fails left and right... :sarcastic:
 
Just wow at all the delusional AMD fanboys spamming this comments section. STOP TYING YOUR SENSE OF SELF WORTH INTO A TECH COMPANY!

So the 7000 series didn't trade image quality for performance; the Catalyst drivers were just crap. Not the first time it's happened, and it won't be the last.
 
[citation][nom]edlivian[/nom]who cares anyways, only cheapskates by amd, if you want quality drivers, you buy nvidia.[/citation]

Nvidia has their problems just as AMD has theirs. For example, from late 2011 up to the Nvidia price cuts last month, Nvidia was consistently overpriced (except for the 560 TI) by significant margins compared to similarly performing AMD cards.

Let's have a good comparison of the entire GTX 500 family to the Radeon 6000 family.

Radeon 6770 versus GTX 550 TI
The 6770 is far cheaper and supports 3/4-way CF (four dual-dongle cards, or two dual-dongle and two single-dongle, or three dual-dongle and one single-dongle for quad CF; three dual-dongle, or two dual-dongle and one single-dongle, or one dual-dongle and two single-dongle for three-way CF).

Radeon 6870 versus GTX 560
The Radeon 6870 uses far less power, supports 3/4-way CF (same as the 6770, except there are also 6870X2 cards that can be in three-way CF with single/dual-dongle 6870s, and two 6870X2s can be in quad CF), and is cheaper.

Radeon 6950 versus GTX 560 TI
The Radeon 6950 supports 3/4-way CF (all 6950s should be dual-dongle) and has better CF scaling (dual 6950 performs similarly to dual GTX 570). All in all, the 560 TI was Nvidia's only card that truly competed with AMD as it should have, because it had the proper amount of memory, power usage, and price for its performance.

Radeon 6970 versus GTX 570
Radeon 6970 has much more memory (570 is memory capacity bottlenecked going beyond 1080p in many games), uses less power, and has better CF scaling (two 6970s are almost as good as two GTX 580s for performance).

The GTX 580 had no competitor from AMD in the Radeon 6000 series, so it was extremely overpriced. However, it too had its fair share of problems, including high power usage and too little memory capacity, especially for multi-GPU setups.

However, the Nvidia cards were better for compute and had features such as CUDA/PhysX support for the people who used them. They also usually had better cooling than competing Radeons. There's also the drivers argument, but that really isn't nearly as bad as it used to be and is far exaggerated by most people who bring it up. Most of the people complaining about drivers are actually seeing problems caused by other issues with their systems and/or are using outdated versions.

It seems that both companies have their problems.
 
My games look fine on my system; I'm not gonna glue my face to my screen and magnify everything 300x to see blurriness... Cat 12.4 had better fix other issues, though.
 
Well, I just hope AMD gets their S**T right in 12.4; 12.3 was a disaster for me on my 6950 2GB (Sapphire), Crosshair IV Formula mobo, Phenom II X4 970 CPU. With MSI Afterburner on (no OC, only fan control) it would idle at 60 (normal is 42-46). If MSI Afterburner didn't start on system startup, it would idle at 45, but after playing something like Deus Ex: HR (even with CCC fan control set to a constant 50%) it hit 90 and after exiting went no lower than 60! Back on 11.12 everything ran great, no problems with any of the latest releases (Mass Effect 3). PS: Tom's, fix your S**T with IE9 - I had to submit this comment on Chrome!
 
Well, it would have mattered if there had been a speed difference, but no, the difference was null, so no gains there. But it is good to check out these things. I remember situations (on both sides) where there actually were faster frame rates at the expense of quality. In this case there was just lower quality without extra speed, so it looks like a plain bug.
But yeah, better wording in the article would have been proper in this case. Keep on checking! We need quality checkers in this industry, just to keep those companies on their toes!
 
[citation][nom]nameon[/nom]well i just hope AMD gets their S**T right on 12.4 ,12.3 was a disaster for me on my 6950 2gb(sapphire),corsshair iv formula mobo, phenomII x4 970 cpu. with msi afterburner on (no OC only fan control) it would idle at 60 (normal is 42-46), if not msi afterburner start on system startup it would idle at 45 but after playing something like deus ex hr (even when in CCC set fan control to costant 50%) it hit 90 and after exiting went no lower than 60!!! back on 11.12 rocking great no problems with any latest releases (mass effect3). PS toms fix your S**T with IE9 i had to submit this comment on chrome!!![/citation]

Maybe you should stop using IE. Maybe it's M$ that needs to fix their browser. Unless you can prove that it's Tom's fault, don't yell at Tom's for such a problem.
 
[citation][nom]therabiddeer[/nom]Is it just me or is toms heavily biased towards nvidia? We see tons of articles for the Nvidia 6xx but very few for the 7xxx. Nothing negative for nvidia, but an article like this for AMD's, which is already being fixed even though it is undetectable... and the fix doesnt even yield a real change in framerates.[/citation]

Well, you have to remember that AMD already has all these cards out for their 7000 line, while nVidia only has its 680 out right now. For less than AMD's top card (the 7970), the GTX 680 handily beats it in every way that matters to a gamer. So you have one product that beats the entire AMD lineup. It's hard for Tom's not to seem biased towards nVidia right now - nVidia simply has the (single) better product at the moment.

Once the 680s finally come back into stock, the rest of the 600 line is released, and the price of the 7900s goes down, things will seem a little less biased. Besides, Tom's already released their "best graphics card of April" article, clearly saying that AMD offers a ton of value in the low-to-mid range. I'm pretty sure AMD has more recommendations than nVidia overall as well.
 
[citation][nom]therabiddeer[/nom]Is it just me or is toms heavily biased towards nvidia? We see tons of articles for the Nvidia 6xx but very few for the 7xxx. Nothing negative for nvidia, but an article like this for AMD's, which is already being fixed even though it is undetectable... and the fix doesnt even yield a real change in framerates.[/citation]

Are you blind, or can you not count?

Being a fanboy is bad because it makes you see what you want to see. Since the release of the 7970, there have been 4 articles total about Nvidia products and 9 about AMD graphics products (I counted one for each side when both were involved).

If you just count Nvidia's 6xx series and AMD's 7xxx series, there have been 2 articles about the GTX 680 and 4 about the various 7xxx cards.

 
Unfortunately, this is a pattern with AMD; reducing image quality to gain some FPS on a release card was also well documented around the web for the 6850 and 6870. The whole idea is for it not to be noticeable until you look really closely. They are trying not to get caught. It's a bad precedent, and fans should be wary of supporting this practice.
http://www.guru3d.com/article/exploring-ati-image-quality-optimizations/
 
[citation][nom]Marcus52[/nom]Are you blind, or can you not count?Being a fanboy is bad because it makes you see what you want to see. Since the release of the 7970, there has been 4 articles total about Nvidia products, and 9 about AMD graphics products. (one I counted for each because both were involved.)If you just count Nvidia's 6xx series and AMD's 7xxx series, there have been 2 articles about the GTX 680 and 4 about the various 7xxx cards.[/citation]

I count four GTX 680 articles on Tom's so far (there might be even more): the initial review, its follow-up, the article listing several GTX 680 boards, and another one dedicated to the Gigabyte GTX 680 WindForce or whatever its name was.

There are also a lot more than 4 articles about the Radeon 7000 cards. The 7000 series does have more articles than the 680, but you miscounted by a wide margin. I also only counted articles centered on one side or the other; i.e., that's at least 4 articles centered on the 680 and a lot more than 4 centered on Radeon 7000 cards. Counting the articles that include them both, the numbers go even higher.

Does this scream AMD or Nvidia bias? No, not really. There are more AMD articles because there are more AMD cards and they have all been out longer than the GTX 680 has. Considering that AMD cards have dominated the recommended cards in the best graphics for the money articles for quite a while now and the reasons are given for why, I can't help but think that this is a fairly unbiased site. Sometimes a writer might seem to have an unfair title and tone (such as part of this article) or some things that should have been done weren't (often, this is the case when articles are made with the intent to have one or more follow up articles), but the information that they give us is unbiased.
 
[citation][nom]LaHawzel[/nom]These differences are things that no one would ever notice if tech review sites didn't point them out. Well, not that I mind knowing that it can be fixed with a driver update, but I find it unnecessary for the average gamer to worry about these minor differences with image quality (knowing it's "fixed" is more of a placebo than an actual improvement of gaming experience). Not to mention that the typical gamer plays on 6-bit TN-panel monitors because "HURR 1ms RESPONSE TIME HOLY SHIT BEST SCREEN EVER" and they in turn elect to give up the superior color gamut and viewing angles conferred by IPS panels. They ought to the last ones who deserve to complain about image quality, at any rate.[/citation]

I disagree.

Qualitative differences aren't always something a person can put his finger on; often, it's just an impression, a "feeling" if you will. Sometimes it's not better, it's just different, other times it's actually better on a specific card or line of cards. If Nvidia or AMD made a regular practice of shaving off quality in order to show higher frame rates than the other guy, people in general would start preferring one over the other, over time.

😉
 
[citation][nom]matto17secs[/nom]Unfortunately, this is a pattern with AMD; reducing image quality to gain some FPS for a release card was also well researched throughout the web for the 6850 and 6870. The whole idea is to have it not be noticeable until you look really close. They are trying not to get caught. It's a bad precedent, and fans should be wary to support this practice.http://www.guru3d.com/article/expl [...] mizations/[/citation]

There was no performance benefit to this problem today and the image quality fix did not hamper performance over the unpatched drivers at all, so you're blatantly wrong and you failed to read the article properly.

Also, Guru3D is known to have a pro Nvidia bias and that has come up time and time again.

What fans should be wary of is people like you spreading misinformation. Yes, AMD did make trade-offs between performance and image quality in the past... but SO DID Nvidia. This was an honest (and very minor) mistake in AMD's drivers that was fixed.
 
Kinda hard to write anything about Kepler when you can't get it to the masses.

The 28nm process isn't at full speed, and even then the yields were less than desired.
 
[citation][nom]blazorthon[/nom]I count four GTX 680 articles on Tom's so far (might be even more). The initial review, it's follow-up, the article with a list of several GTX 680 boards, and another one dedicated to the Gigabyte GTX 680 Windsor or whatever it's name was.There are also a lot more than 4 articles about the Radeon 7000 cards. 7000 does have more articles than the 680, but you miscounted by a wide margin. I also only counted articles that were centered only around one side or the other, IE that's at least 4 articles centered on the 680, and a lot more than 4 centered on Radeon 7000 cards. Counting the articles that include them both, the numbers go even higher.[/citation]

Well, it depends on the link you follow. This is where I got my numbers:

http://www.tomshardware.com/reviews/Components,1/Graphics-Cards,4/

If you check the "News, Graphics" instead of the "Articles, Graphics" you get numbers more like yours.

http://www.tomshardware.com/news/Components,1/Graphics-Cards,4/

I would say that "Articles" are what matters most here, but in any case, the fanboys have been put in their place by the facts.

😉
 
I really would like to have seen frame rate benchmarks done at a resolution that could actually give these cards a run for their money. 1920x1080 is more or less the standard size, but not enough to bring out real differences (if there are any).

I would also like to have seen minimum frame rates shown.

Very nice article though, it helps us all to have these things explored properly, and in a public way.

😉

 