Do AMD's Radeon HD 7000s Trade Image Quality For Performance?



You are completely misunderstanding how this stuff works. First of all, a display does not display in FPS; it refreshes in Hz. The graphics cards output in FPS. Hz and FPS are not the same. The difference between 60Hz and 75Hz is minimal, so you shouldn't expect to see much of a difference. The difference between 60Hz and 120Hz is far greater, and if you didn't see it then you may need to get your eyes checked (I'm not being rude or obnoxious; just stating a fact).

If a graphics card is not outputting more than 60FPS, then going from 60Hz to 75Hz will make zero difference. If your game settings let it run at over 120FPS, then going from a 60Hz screen to a 120Hz display will show a noticeable improvement. Increasing the refresh rate above 60Hz when you don't get more than 60FPS will not change much.

Also, if your shooters are running at 75FPS, then the graphics cards work just as hard regardless of whether the display is set to 60Hz or 75Hz. The FPS is not going to change from that, only the perceived FPS is. The graphics cards are still outputting 75FPS; the display just doesn't show all of it.

Here are some examples that may help.

30FPS on a 30Hz screen looks roughly identical to 30FPS on a 60Hz, 75Hz, or 120Hz and up display.

120FPS looks far better on a 120Hz display than on a display with lower Hz. Without Vsync, having FPS that much higher than the refresh rate could actually cause visual tearing and decrease picture quality.

75FPS on a 60Hz or 75Hz screen is just too close to tell much difference (usually). Going above 60FPS and Hz, it takes larger and larger jumps to continue noticing improvements. For example, going from 60FPS on a 60Hz to 75FPS on a 75Hz is nothing like the change from a 60FPS on a 60Hz to a 90FPS on a 90Hz or a 120FPS on a 120Hz.
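To make that Hz-versus-FPS relationship concrete, here is a minimal sketch in Python (my own illustration, not part of the original post; the function name and sample numbers are made up) of how many complete frames a display can actually show, and when tearing appears:

```python
# Illustrative sketch only: how GPU frame rate and display refresh interact.
# A panel can never show more complete frames per second than its refresh
# rate; without vsync, frames arriving faster than the refresh get spliced
# into a single scan-out, which is what appears as tearing.

def perceived(gpu_fps, display_hz, vsync=True):
    """Return (complete frames shown per second, tearing expected?)."""
    shown = min(gpu_fps, display_hz)                # can't exceed the refresh rate
    tearing = (not vsync) and gpu_fps > display_hz  # extra frames tear without vsync
    return shown, tearing

print(perceived(30, 120))               # (30, False): 30FPS looks the same on any faster panel
print(perceived(120, 60, vsync=False))  # (60, True):  extra frames only buy you tearing
print(perceived(120, 120))              # (120, False): the case where 120Hz actually pays off
```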

Also, keep in mind that FPS is almost never exactly constant. It continuously fluctuates during gameplay, although not always by large amounts. Sometimes it is by large amounts. The closer your minimum frame rate is to your average frame rate, the better.

One of the few things worse than elitism is misunderstanding something that isn't elitism at all, because you don't understand how graphics and displays work, yet managing to think it's elitism anyway.
 


According to these pictures, the Nvidia card did not have a picture quality problem and the AMD cards did. Calling AMD out on this so that AMD could fix it (which they did) is not being biased. Would you have preferred that Tom's just let this problem slide? Thanks to Tom's, the newest Radeons have been improved slightly. Instead of thanking them for their hard work and contribution to the improvement of the newest Radeons, you mock them for helping the company that you seem to favor. Not even fanboy logic works with this, so you're not only a fanboy, but a stupid one at that if you truly are calling Tom's biased due to this article.

Just looking at these pictures, the Nvidia GTX 580 obviously doesn't have picture quality problems. If you can prove this accusation has merit, then I will give it due credit. Until that time comes, stop wasting the time of Tom's readers with your crap.
 


Please explain to me how a company that recommends almost purely AMD graphics cards has a pro-Nvidia bias. Also, what does this have to do with ANY bias? Tom's clearly states that the older AMD and Nvidia cards did not have a problem, only that the new AMD cards did. Tom's then contacted AMD on the matter and, after a while, AMD fixed it. Is Tom's biased for telling us that there was a problem? Not in the least, at least not in my opinion. Had Tom's said something about this prior to AMD fixing it, then you might have had a point. However, Tom's went so far as to wait for AMD to fix it before saying anything about it. If Tom's hadn't waited, then AMD would have needed to deal with a bad PR situation until they fixed the problem, and possibly well after the fix too. Tom's did the best possible thing for AMD and their readers at the same time.
 

I feel it's not pro-Nvidia but pro-Intel and pro whatever is the most expensive, highest-end part. Well, that's my three cents.
 
[citation][nom]blazorthon[/nom]You are completely misunderstanding how this stuff works. First of all, a display does not display in FPS; it refreshes in Hz. ...[/citation]
No, V-Sync limits how hard my graphics cards work when the refresh rate is set lower. I understand how it works. You listen to what I am saying.

When I set the refresh rate at 60Hz and enable V-Sync, it looks EXACTLY the same as when I have my refresh rate at 75Hz w/ V-Sync. I mean, not a lick of difference. NONE.

So attempting to game at an even higher refresh rate, effectively netting me more visible frames per second, would be an exercise in both futility and stupidity.

Capiche?
 
I am tickled by the fact that you think I am misunderstanding. Well, I think you are imagining things, so I guess we're even.

Let me explain it to where even you can understand it:

I can notice a difference in frame rate up to 50fps, and barely notice a difference up to 60, at which point I can no longer see a difference.

I have a 75Hz monitor, and because it's 1080p and I am gaming with dual 6970s, I use V-Sync. V-Sync does limit how hard my PC works, because when V-Sync's not on it sounds like it'll bring the house down, but I digress. In any case, the frame rate my GPUs put out is different from my monitor's refresh rate. As such, 120fps would be stupid at 60Hz, as all you would be getting is 60fps and a lot of screen tearing. You with me so far?

Now, as I said earlier, I notice a difference in frame rate up to 60. Past that, nothing. I see no difference whatsoever. I have V-Sync on at all times. When my refresh is at 75Hz, it is a constant 75, and when I have it set to 60Hz (which I do all the time), it is a constant 60.

I see no difference between 60fps and 75fps. NONE. WHAT. SO. EVER.

So for me to throw down for what would ultimately equal even MORE frames per second would be RETARDED. You got that?

And as for those that do? Well, you're all either superhuman or you're all seeing things. My money's on the latter.

I think we understand each other, now. For better or worse.
 


I already said that you shouldn't see a difference between 75 and 60FPS. It's too close. Unless you have seen a 120FPS and 120Hz setup yourself, you don't know the difference. I have and I do.

At 1080p, is there much point to have more than one 6970 even with V-Sync? One 6970 should be able to do it all at 1080p.

I mentioned screen tearing already, and it clearly says in my post that much more than 60FPS on a 60Hz or 75Hz display would get plenty of screen tearing.

Also, you can edit your posts in article comments. It's in Tom's Guide, IT Pro, etc. that you can't. Go to the top of the comments and click the link that says to read the comments on the forum. It turns the article comments into a forum post where you can edit your posts.

My point was that 120+FPS with V-Sync on a 120Hz display does show a difference over 60/75FPS on a 60/75Hz display. Your post implied that you did not know the difference between Hz and FPS, either through poor wording or typos, and that is why my reply was so basic and explanatory.
 
[citation][nom]blazorthon[/nom]I already said that you shouldn't see a difference between 75 and 60FPS. It's too close. ...[/citation]
I know the difference. I have a 75Hz screen and the max I can see is 75fps, as the monitor can only "refresh itself" 75 times in a second. Anything more can cause tearing. Been there, done that, bought the T-Shirt.

My point is, when switching between 60Hz and 75Hz on my monitor (V-Sync always on), I cannot tell the difference between 60fps and 75. The fact that I can tell a difference between 50 and 60, but not between 60 and 70 or 75, indicates to me that any noticeable increase in quality going even further than 75fps (which I can't tell from 60) is minimal, if it exists at all. You say it's too small a difference to notice. Well, here is what I have to say to that...

15fps is 25%. That is significant. Considering that you claim 120fps is discernible, I would think 60 and 75 would still be small enough numbers that I could notice a visual increase in quality. I mean, going from 30 to 37.5 is big. Going from 20 to 25 is noticeable. And at even higher frame rates, 25% is an even bigger absolute jump. I just can't take your advice, step out on faith, and go with 120Hz when I can't tell 60 from 75. It's just too close to call. Going even higher would only accentuate the lack of disparity, not remove it. It is common sense, and to look at it any other way would be a leap of faith that I have no reason to make. I am pleased at 60fps. If I am going to invest in something, it will be Eyefinity. That is worthwhile.

As to why I have two 6970s, well... I think you give a single 6970 a little too much credit. I couldn't fill my refresh rate maxing out the more demanding games, and that is a big issue with me. Plus, I run a lot of mods, like Crysis 2 DX11/MaLDo's/Blackfire's, Crysis Real Lifesis, and others. Crysis 2 fully modded brings my CrossFire rig to its knees. I get like 50fps on average, which won't even fill my refresh rate.
 


The problem is that THG pretends to be professional, and people just take it for granted that THG's reviews are trustworthy.

THG thinks most people are stupid.

But after reading tons of these crap articles, I found that they follow a formula. For example: AMD cards aren't bad, but I would choose an Nvidia card because Nvidia cards function better, perform better, and are overall better?

What a contradiction, and full of shxxt!!!
 


It's just a tactic to dilute their biased behavior!!!
You wouldn't trust them if they took shots at the target too often.
It's just a more cunning way of doing it!!
 
Let me put it this way: when I bought an Nvidia card, I just paid Nvidia to do the dirty work, like (1) paying Nvidia to bribe game companies to implant bad source code so that AMD cards perform worse, and (2) paying contributing writers to spread FUD (Fear, Uncertainty, Doubt) about AMD cards.
So the slogan "the Way Gamers Meant to Played" was finally achieved through these two mounted guns.
 


I'll let you in on a little secret: the monthly 'Best Graphics Cards For The Money' article is probably one of THG's most read, influential, and pervasive articles. It consistently gets FAR more hits on a monthly basis than any other article on the site I'm aware of.

So you're suggesting that Tom's is using their most read and popular article to recommend AMD cards in order to cover up their pro-Nvidia bias that they plan to show through much less read articles like this one?

That doesn't make sense.

By your logic, Tom's is obviously pro-AMD and uses less-read articles like this one to cover up their anti-Nvidia bias.
 
[citation][nom]hanson308[/nom]It's just a tactic to dilute their biased behavior!!! ...[/citation]

[citation][nom]hanson308[/nom]Let me put it this way: when I bought an Nvidia card, I just paid Nvidia to do the dirty work... [/citation]

There's fanboy logic, and then there are the stupid fanboys who can't even apply any logic to their opinions beyond saying that everyone who doesn't completely agree with them is wrong. That is beyond a fanboy. That is a fanatic. Yes, Nvidia does some anti-competitive things. Are they even very effective? Well, considering that some of the titles Nvidia worked on don't even favor Nvidia all of the time, Nvidia *optimizations* don't seem to make a huge difference every time. Furthermore, if I remember correctly, the slogan is (or at least it was) "Nvidia, the way it's meant to be played".

Telling us something so ridiculous is pointless if you don't give proof. Give links that prove you right or GTFO.
 
[citation][nom]The Halo Don[/nom]Im a Performance freak, so I'd be okay with this.Not sure if other people would like it as much though.[/citation]

Like what? The article clearly states that image quality is not being traded for performance, that it was a driver issue that made this apparent, that AMD made a fix for it, and that the fix does not degrade performance despite increasing picture quality. Did you just read the title and comment without reading the article?
 



Prove what? Prove Nvidia is an expert troll??? Prove THG is one of the trolls?

Did you ever see the Nazis kill people? If you didn't see the "proof" with your naked eyes, why do you believe they killed people in WWII?
 
[citation][nom]PCgamer81[/nom]I know the difference. I have a 75Hz screen and the max I can see is 75fps, as the monitor can only "refresh itself" 75 times in a second. ...[/citation]

I already said that as the frame rates increase, it takes larger and larger changes to make a difference. Just because a 25% change isn't enough does not mean that a far greater 100% change is not enough to create a difference. I'm not advising that you should get a 120Hz display. You already are convinced that you won't see a difference and you don't seem like someone who is good at discerning differences like that in a display.

It is common sense that a 25% gain and a 100% gain are two wildly different things. Going from 40FPS to 50FPS (25%) can look like a bigger improvement than the jump from 50FPS to 75FPS (50%) if 75FPS looks the same as 60FPS to you, even though the raw jump from 50FPS to 75FPS is far larger than the one from 40FPS to 50FPS. Going by how the improvement decays with each percentage jump in FPS, it stands to reason that 120FPS on a 120Hz display would show a discernible difference over 60FPS or 75FPS. Going past 120FPS would probably never show a discernible difference, but going up to it should.
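To put rough numbers on that diminishing-returns argument, here is a quick frame-time calculation (my own illustration, not from the thread; the frame-rate pairs are just the ones discussed above):

```python
# Illustrative arithmetic only: the same percentage jump buys less and less
# time-per-frame as the base frame rate rises, which is one way to see why
# ever-larger jumps are needed to keep noticing an improvement.

def frame_time_ms(fps):
    return 1000.0 / fps  # how long each frame stays on screen, in milliseconds

for lo, hi in [(30, 37.5), (40, 50), (60, 75), (60, 120)]:
    pct = (hi - lo) / lo * 100
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo:g} -> {hi:g} fps (+{pct:.0f}%): frame time drops by {saved:.1f} ms")

# 30 -> 37.5 fps (+25%):  frame time drops by 6.7 ms
# 40 -> 50 fps   (+25%):  frame time drops by 5.0 ms
# 60 -> 75 fps   (+25%):  frame time drops by 3.3 ms
# 60 -> 120 fps (+100%):  frame time drops by 8.3 ms
```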

[citation][nom]hanson308[/nom]Prove what? Prove Nvidia is an expert troll??? ...[/citation]

I believe that WWII happened because I have talked to several veterans of it. I don't believe that you have an accurate point because you are a rude forum troll whose opinions are unfounded in logic and completely contrary to the views of the vast majority of the tech industry. Considering that Cleeve already explained away your view of Tom's being an Nvidia-biased site as incorrect using proper logic, I think it's almost time to ban you for your outright stupidity. You are the biased one here. Again, either give us some evidence to back up your outlandish claims, or stop lowering the IQ of Tom's readers with your laughable comments.

When I said that supposedly Nvidia optimized games don't even always favor Nvidia cards over AMD, I didn't need to provide links because many articles here on Tom's show this. Just looking through some of the past articles will show this (many of them even have text talking about the phenomenon). You, however, have nothing.
 
I think the title of this article is a bit sensational.

That being said, I fail to see the cause of the outrage. If Tom's didn't point this out to AMD, would anyone? Would it ever be fixed?

It's not a big difference in quality. Neither is the difference between $5.00 and $4.99. But if you take away one cent at a time, each step is hardly noticeable and before long you're left with jack ***. So while I agree that the headline and sometimes the tone of this article smell like yellow journalism, the point about quality is entirely valid.

Even if there is nothing that should be considered an abnormality, AMD has to do something to answer THG.
No matter how ridiculous THG's accusations are, AMD has to do something to answer THG.

Why?

Because AMD can't afford the consequences of standing in the way of the devil.

What if AMD found that the so-called problem didn't need to be addressed (as we common people argue), but THG had magnified the problem a hundredfold?

What choice does AMD have?
 
[citation][nom]Veirtimid[/nom]I did read it btw. When I said that I was relating to constant AMD driver problems and imperfections, it's true, not just my perspective or prerogative.[/citation]

In that case, I apologise. We get a lot of trolling comments around here and it's rather easy to believe somebody is up to no good. I hope you get your driver woes sorted ASAP. I used to get squealing from my speakers until I realised it was down to the way the 2x2 power plug was seated. 😛
 
[citation][nom]PCgamer81[/nom]... I can notice a difference in frame rate up to 50fps, and barely notice a difference up to 60, where it I stop can no longer see a difference. ...[/citation]

Just one minor but obvious point worth making here: the frame rate at which one perceives smooth motion, beyond which additional fps (crudely put) can't be detected, is different for each individual. After so many years of dealing with professional VR kit, HD displays, etc., I can easily tell the difference between 60 and 75 fps/Hz, but that's just me. I probably became more sensitive to this because it was the field I worked in. I remember old N64/PSX argument threads where some claimed not to be able to see the difference above 30fps, or even 25fps, while others would claim that would make them barf. And of course the use of fields rather than progressive frames in analogue TV standards made the issue even more complex (most people standing in a store showroom full of ordinary TVs will see all the units they're not directly looking at as 'flickery', but not all).

SGI had a saying, "60Hz, 30 hurts", i.e. they aimed for a minimum of 60Hz on their vis-sim solutions with IR technology, but that doesn't mean there aren't people who can tell the difference between 60 and 75. If you're happy with 60, or even 50, then that's great for you, but it doesn't really say much about tech in general because everyone's different. If the other guy needs 75+ to feel comfortable, then that's his issue to deal with. And of course using vsync (or not) can affect how your eyes perceive the data hitting them. It's a bit silly, really, for someone who prefers 50+ Hz to be arguing with someone else who needs 75+ to feel comfortable. One person prefers a paperback, the other prefers an e-reader... I'm more interested in the story being told. 😉

Be happy you're not a bird; they need more like 200+ fps to perceive smooth motion. No Crysis fun for them. 😀

Ian.

 
Ian,

I think where a lot of people have an issue is that FPS != Hz. Hz is pretty much static in comparison, whereas your game can have peaks of 150FPS and troughs of 35FPS. If you're gaming without VSYNC with such a differential, you're going to see all manner of tearing. Sure, VSYNC limits your framerate, but any framerate-based anomalies will be limited to those (hopefully rare) troughs.
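As a toy illustration of that peaks-and-troughs point (mine, not from the post; the sample frame rates are made up), vsync clamps the peaks to the refresh rate but leaves the troughs exactly where they were:

```python
# Illustrative only: a made-up sequence of per-second frame rates from a game,
# capped by vsync at a 60Hz refresh. The peaks get clamped; the dips remain.
refresh_hz = 60
samples = [150, 110, 62, 35, 90, 140, 58]

with_vsync = [min(fps, refresh_hz) for fps in samples]

print("raw     :", samples)       # peaks well above 60, one nasty trough at 35
print("vsync'd :", with_vsync)    # [60, 60, 60, 35, 60, 60, 58]
print("minimum :", min(with_vsync), "fps -- vsync can't fix the dips")
```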

I do like a minimum of 60FPS if at all possible. :)
 