Do AMD's Radeon HD 7000s Trade Image Quality For Performance?



Well, I thought that you summed it up nicely enough. However, seeing you insult someone's intelligence right after misspelling that very word and using improper grammar in the same sentence, I thought I had to say something, and it seems that nn agreed. Notice how you are not thumbed down. I'm pretty sure nn saying it was too tempting to correct someone who misspells "intelligence" was a clear hint of this exact feeling. Besides, I played grammar police, not spelling police. nn played spelling police.

You brought up a decent point. However, whether a 120Hz monitor or a higher picture quality monitor is better for gaming is a pretty subjective thing to argue about... What monitor someone games on depends on the gamer. For example, I prefer my CRTs for gaming over anything else in my house because they have the truest color and the highest refresh rate, and they don't have scaling issues when the resolution in use isn't their native one. Since I can't afford a several-hundred-dollar monitor, am I stupid for gaming on something almost as old as I am?
 


I've seen 120Hz panels at friends' houses and I think there is a noticeable difference between them and 60Hz panels if you get a good one (you can't get one with a response time higher than 8ms, because then each pixel transition takes longer than the 1/120th of a second between refreshes and it lags and looks like crap compared to a sub-8ms panel). Some people can tell. Whether or not most of them are just experiencing placebo, I can't say, but I can tell the difference in some games. Now, the game also needs to run at close to 120FPS for me to tell the difference, and I don't think it's worth going that high because it costs more than twice as much to get a graphics system that can handle 1080p or more at 120FPS than one that can handle 1080p or more at 60 to 75FPS. I've also tried it with UT 2004 (one of my favorite games; yeah, a low budget means slowly updating my games too... Oh well, it's still fun to load up a 6-8 player map with 32-33 players and go at it) and it's noticeable.

Still, it's not worth the extra money spent on a 120Hz panel or the compromise in picture quality (take your pick; it's usually more money or less quality), at least not in my opinion. I might do it with graphically light games like UT 2004 or WoW and a 120/240Hz CRT, but I can't afford a good LCD/LED monitor. CRTs still have excellent picture quality if you find the right ones, and it might be a lot cheaper that way. Going above 120Hz, now that's where I can't tell any difference.
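
For anyone wondering where that 8ms figure comes from, here's a quick back-of-the-envelope sketch (just the standard refresh-interval arithmetic, nothing measured):

[code]
# Rough sketch: how much time a panel has to finish a pixel transition
# before the next refresh arrives, at a few common refresh rates.
for hz in (60, 75, 120, 240):
    frame_time_ms = 1000.0 / hz  # one refresh interval in milliseconds
    print(f"{hz:>3} Hz -> {frame_time_ms:.2f} ms per refresh")

# At 120 Hz that's roughly 8.33 ms per refresh, so a panel with a
# response time much above 8 ms can't complete a transition before the
# next frame arrives, which is where the blur/lag complaint comes from.
[/code]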
 
[citation][nom]Cleeve[/nom]Yeah, no kidding, right? We totally could have published this weeks prior before there was a fix. Instead, we gave AMD the heads up and worked with them to remedy a problem that they appreciated being told about. Apparently, giving a shizz about image quality and looking into issues isn't OK with some commenters if AMD is involved because they'd rather we bury it. If it was Nvidia who had a problem, and we found it, none of those people would have complained. Pathetic. They know who they are, and they know it's true, too.[/citation]

Exactly. There are four ways it could've been done:

1) Run a headline as soon as you've found the problem informing everybody that there is a problem and that AMD are cheating
2) Run the same headline without the accusation
3) As 2, but promise to talk with AMD about it/say you're in contact with them
4) Hold the article back until AMD have gotten in touch and the issue has been, at the very least, acknowledged with a timeframe on a fix

Option 1 is definitely wrong; 2 is a name-and-shame exercise and only a step up; 3 is a big improvement, but it might not reduce the AMD bashing; and 4 is, or should have been, best practice. Instead, what we get is a whole ton of comments with more than the slightest hint of fanboyism.

I'm (apparently) one of the rare individuals on Toms who doesn't really experience AMD driver woes. That's okay, I am way off the bleeding edge (4830, rah!) and it wouldn't matter too much if things got a bit worse. However, historically, there have been plenty of complaints about driver quality, and what makes it worse is that the hardware is usually excellent all-round. I remember the 8500 and the almost endless lamentations about how good the hardware was compared to the NVIDIA competition, yet the drivers were killing it before it could truly compete. We all know AMD (and, in the past, ATi) can make impressive cards, but the situation at times would be akin to buying a sports car with lousy engine mapping. Conversely, the same can happen with NVIDIA - the 196.75 drivers, from what I've read, played havoc with fan settings, potentially causing a graphics card to fry itself. Apparently, this isn't the only instance of this happening, but it does show that neither AMD nor NVIDIA is infallible.

All too often, people are happier to bitch about a problem and the respective company than to actually sit down, diagnose, and discuss the issue. At the risk of sounding like a Toms apologist, they have at least looked at issues, what the causes might be, and what could be done to fix or at least work around the problem at hand. If Toms wanted to slate AMD, I'm sure AMD would stop sending them review hardware, but you still have to offer constructive criticism, otherwise you'd never have a bad word to say about anybody. Getting a bad review can be just as good as, if not better than, a good one, as you feel compelled to investigate why something did so badly. Toms, like other sites, readily heaps criticism on Intel's HD graphics (especially their drivers), yet Intel aren't suddenly going to stop sending review kits. Everybody slated Fermi for its power usage, yet NVIDIA didn't stop sending them cards. As it is, a single slightly misworded (or rather, should that be misread?) conclusion can send dozens of people into spouting accusations of favouritism.

So yes, sometimes there really is no way to avoid being slated for a thorough article which has actually benefitted an entire consumer group. 😛

I'd like to add that Toms occasionally goes above and beyond - I made full use of K10STAT thanks to the two-part Toms article on maximising efficiency with K8, K10 and K10.5 CPUs, and I'm sure a lot of people here use this or similar software to reduce power usage without reducing performance; after all, not everybody is an overclocker (certainly not here with a PII 710 on an AM2+ board).
 
I guess different systems have different issues. I very recently had both a single GTX580 and SLI'd GTX570s, and they had the same image quality issues in HAWX... a grid-like pattern that was very noticeable in dark scenes. There was also horrible tearing in Falcon 4.0 A.F. when using the 570s. The 2 x HD7970s I'm using now have noticeably better image quality... flawless to my eyes in both of these games.

...I'd always been more of an nVidia fan until this experience (and that was for years and years).

It seems that different systems (or driver versions) have different issues. I guess this is obvious to those of you who live and breathe graphics cards (and related news), but it was really surprising for me.
 
[citation][nom]blazorthon[/nom]I've seen 120Hz panels at friends' houses and I think there is a noticeable difference between them and 60Hz panels if you get a good one (you can't get one with a response time higher than 8ms, because then each pixel transition takes longer than the 1/120th of a second between refreshes and it lags and looks like crap compared to a sub-8ms panel). Some people can tell. Whether or not most of them are just experiencing placebo, I can't say, but I can tell the difference in some games. Now, the game also needs to run at close to 120FPS for me to tell the difference, and I don't think it's worth going that high because it costs more than twice as much to get a graphics system that can handle 1080p or more at 120FPS than one that can handle 1080p or more at 60 to 75FPS. I've also tried it with UT 2004 (one of my favorite games; yeah, a low budget means slowly updating my games too... Oh well, it's still fun to load up a 6-8 player map with 32-33 players and go at it) and it's noticeable. Still, it's not worth the extra money spent on a 120Hz panel or the compromise in picture quality (take your pick; it's usually more money or less quality), at least not in my opinion. I might do it with graphically light games like UT 2004 or WoW and a 120/240Hz CRT, but I can't afford a good LCD/LED monitor. CRTs still have excellent picture quality if you find the right ones, and it might be a lot cheaper that way. Going above 120Hz, now that's where I can't tell any difference.[/citation]
You bring up an interesting point and you are civil about it. Kudos.

But I still have a strong conviction that the smoothness is in the eye of the beholder. If I were to take two 60Hz displays and tell someone that one of them was 120Hz, more often than not they could "see the difference", I betcha.

In fact, that is doubly true after someone just foolishly shelled out a small fortune on one of those worthless monitors. In that case, they have to see the difference. The alternative is unthinkable (that they are gullible morons who are $700 poorer).
 
Thanks for the info, Tom!

Now I see what kind of crap I bought (7970)... Should have waited for Kepler.
 
Wow, differing perspectives. I'm happy so far with the 7970s. I had initially ordered two GTX680s, but my source said 3 to 5 weeks... he was able to get me the 7970s the next day. For my semi-casual gaming @ 2560x1600, the 7970s are fine and should hopefully last me for a year or two.
 
[citation][nom]Veirtimid[/nom]Thanks for the info, Tom! Now I see what kind of crap I bought (7970)... Should have waited for Kepler.[/citation]
Saying thanks for something you didn't read is somewhat amusing, but that's okay - it's your prerogative.
 

I did read it, btw. What I said was about the constant AMD driver problems and imperfections, and it's true, not just my perspective or prerogative.
 
[citation][nom]Veirtimid[/nom]I did read it, btw. What I said was about the constant AMD driver problems and imperfections, and it's true, not just my perspective or prerogative.[/citation]

Really? What problems have you been having with your drivers?
 

These are all the problems I've had so far with my Radeon 7970:

1. Game crashes (mostly in Witcher 2) with a tray error: "Catalyst driver was shut down and been recovered successfully"

2. Weird occasional lagging in some games: ME3, Heroes VI, Witcher 2.

3. Screen flickering plus colors changing to very dark and unrealistic over the HDMI 1.4a cable; I had to switch back to DVI

4. A squealing noise from the 7970 in some game menus and in some 3DMark 05/11 benchmarks. I did some research on it and apparently this is "normal" for AMD VGAs: http://forums.steampowered.com/forums/archive/index.php/t-2480627.html

Maybe some of those problems will go away eventually with newer driver updates, but I never had this kind of thing when I had Nvidia VGAs, just saying...
 


Both nVidia and AMD have these types of problems with various hardware and drivers, from what I'm learning. I don't know if this is related, but have you done any tweaking to your 7970, and are you using Catalyst 12.2 or 12.3? What kind of display are you driving with that?

 

I have the 12.3 version of the drivers. I did crank the frequency sliders in CCC up to max for the engine and memory clocks. Ran benchmarks - no artifacts or any other sort of errors, except that annoying squealing noise, but it's also present at stock frequencies. My display is a Samsung PX2370 23" running at 1920x1080.
 
[citation][nom]halcyon[/nom]Both nVidia and AMD have these types of problems with various hardware and drivers, from what I'm learning. I don't know if this is related, but have you done any tweaking to your 7970, and are you using Catalyst 12.2 or 12.3? What kind of display are you driving with that?[/citation]

Well that sucks.
 

I'm no expert (I just play one on TV :) but I would try things like uninstalling and reinstalling the drivers, and perhaps moving the card to a different slot (and if nothing else, trying a different pair of PCI-E 8- and 6-pin cables). I couldn't get my 7970s to CrossFire until I swapped the PCI-E power cables... and I'd had no problems with the GTX570s. ...but I had some really annoying artifacts with the 570s and a GTX580 I auditioned.

This has all taught me that it's still a work in progress. In HAWX I get a max of 500fps. In HAWX2 I get a max of 62fps. These are both @ 2560x1600. One is DX10, the other is DX11.
 
[citation][nom]PCgamer81[/nom]You bring up an interesting point and you are civil about it. Kudos. But I still have a strong conviction that the smoothness is in the eye of the beholder. If I were to take two 60Hz displays and tell someone that one of them was 120Hz, more often than not they could "see the difference", I betcha. In fact, that is doubly true after someone just foolishly shelled out a small fortune on one of those worthless monitors. In that case, they have to see the difference. The alternative is unthinkable (that they are gullible morons who are $700 poorer).[/citation]

Well, the comparison I made was between a u2711 2560 x 1600 monitor, which is MORE expensive than a 120Hz 2ms monitor of the same size. I own both, and there is a clear difference between the two. The 2ms 120Hz monitor has much less motion blur, and the input lag is 100% noticeable on the IPS panel.

So which monitor choice is better IS subjective to the gamer, but it comes down to the game types that person prefers. If you want the best possible first-person shooter experience, then the 120Hz 2ms monitor is the best choice. If you spend most of your time on strategy games, movies, picture/video editing, etc., then the IPS panel is the clear choice. Scrolling in RTSes is a LOT smoother on the 120Hz 2ms monitor vs. the u2711, so it's not like RTSes don't benefit from the low input lag/high FPS, but I'm not a competitive RTS player, so I won't comment on whether it actually improves your gaming like it does in FPSes, but it does look nicer *tested it on SC2, SupCom2 and CoH*. It looks smoother and less blurry when moving the camera around.

But if you think you're a "competitive FPS player" and you opt for a laggy 60Hz 12ms IPS panel, which is MORE expensive than the 120Hz 2ms monitor, then you just wasted your money on a lesser experience.

Also, if all you can afford is an old CRT monitor, then rock it. Gaming on old tech is still better than not gaming at all. But my point was directed at people spending nearly the same amount of cash on two top-end choices.
 
[citation][nom]airborne11b[/nom]Well, the comparison I made was between a u2711 2560 x 1600 monitor, which is MORE expensive than a 120Hz 2ms monitor of the same size.[/citation]

I'd never bothered to do the research to find out that a u2711 was native @ 2560 x 1600. I thought it was 2560 x 1440. That's pretty nice.
 


Excuse me, you are very correct. It's 2560 x 1440.

The u3011 is 2560 x 1600. My wife has been using my u2711 for a long time, I mixed my numbers up.
 
[citation][nom]airborne11b[/nom]Excuse me, you are very correct. It's 2560 x 1440. The u3011 is 2560 x 1600. My wife has been using my u2711 for a long time, I mixed my numbers up.[/citation]

In FPSes, a high refresh rate with low input lag is the win for sure; in RTSes like SC and the like, I prefer higher res and higher pixel density over high refresh rates (the U2711 and U3011 are my favorites there). I guess it boils down to taste (and having hardware that can support the high-res gameplay). Got 2x GTX680 under the hood now.
 
THG has been biased for many years. It magnifies minor flaws in AMD/ATI cards while neglecting explicit nVidia card problems.
For many years, when playing DVDs and HD movies, nVidia cards have rendered very unpleasant quality compared with AMD/ATI, and aren't even comparable to INTEL. But THG just ignores these facts. Why?

You know what? To the majority of people, the function of a display card is playing games and playing movies, so by a reasonable, common-person standard, I'd say that nVidia has been a failure starting from the GF3 generation.

Again, very few tech sites/magazines talk about it. Why? Simply because they are spoiled by nVidia's $$$.

I hate these crap writers.
 

You got some more info on Nvidia's flaws? Any proof links?
 
[citation][nom]eggfillet[/nom]I don't notice any difference to be honest, maybe it's just me.[/citation]
Of course you don't. You're not superhuman, or elitist.

The only thing I hate worse than elitism is wannabe elitism.

I play all manner of shooters on my 1080p monitor @ 75Hz, and I set the refresh rate down to 60 because there is no use in overworking my hardware when I can't notice a lick of difference. But these geniuses would have me believe they can still notice a difference at over double the point where I stop noticing.

*psst* between you and me, I think they were picked on in school.

Where was I? Oh yeah, I worship at the feet of those with superhuman eyes!
 
[citation][nom]The Lads[/nom]Tom's has always had a pro nvidia bias, just like the article, nothing to see here..[/citation]
I never understood why it's so bad to be biased, as long as you are biased on the side of quality.

I prefer AMD cards for personal reasons. But, Nvidia cards function better, perform better, and are overall better IMHO.
 