Mainstream Graphics Card Roundup

Yeah, I don't know about HAWX and The Last Remnant there. Well, hopefully those performance problems will be fixed soon. Anyway, it's nice to see that CrossFire performance has improved and is more consistent, as evidenced by the performance of the 3870 X2 and the 4870 X2. Looks like I still don't need to upgrade from my two 3850 512MB cards in CF just yet, but I would like to see another one of these articles when DX11 rolls around.
 
[citation][nom]proofhitter[/nom]Why do you keep on including the Last Remnant test when it's obvious that there is a problem with the ATI cards? The overall results are therefore biased, which is unfair to ATI and to the folks who jump directly to the conclusion. Also, when you say *quote* "DirectX 10 crashed at 8x AA and the game and screen went black. Switch to DirectX 9 instead, and the game works at 8x AA and offers frame rates up to 50% higher" *unquote* for HAWX, didn't you mean "ATI cards were a lot faster than Nvidia ones using DirectX 10 thanks to DirectX 10.1, and that was unacceptable. Hence the switch to DirectX 9 instead, and the game works at 8x AA and offers frame rates up to 50% higher for Nvidia, and ATI is fcked again, close one guys"? I am not an ATI fanboy, but I think TH has got its tongue stuck up a juicy green @ss.[/citation]

Just an FYI guys, The Last Remnant was added to the benchmark suite after we received enough reader feedback asking for it--same with GTA 4. You'll likely see the suite vary from time to time based on the email and comments we're getting, but it certainly has nothing to do with the hardware being tested. And kudos to whoever it was that said TLR shouldn't be pulled just because ATI doesn't perform well in it. If anything, fans of the game should be clamoring for better driver support in it from ATI!
 
Nice use of the torrented version of L4D... you can tell by the user name.
 
Am I missing something here? Is Crysis/Warhead no longer one of the most demanding games for GPUs? Why have games like L4D and FO3? Most people who own, or are thinking about buying, any of these cards will more than likely have frame rates well above their refresh rate in those games. I think people are more concerned with whether or not they'll be able to run the beastly games at over 30 FPS with these cards than with whether they can get 100+ FPS in the less demanding ones.

I also think it would be nice to see the frame rates of these GPUs when running games with Nvidia 3D Vision, as that will be a major determining factor in my next GPU purchase.
 
[citation][nom]rhys216[/nom]@ Ramar! DX10.1 was originally going to be DX10. Nvidia couldn't cut the mustard in time, and M$ watered down DX10 to help out Nvidia! It had nothing to do with Nvidia being smart; in fact it's the complete opposite, lol! Perhaps you haven't had enough sleep? How about you go back to bed and cosy on up with Tino some more?[/citation]

I thank the other guy that said it for me, but do you have a source on that? Any reason why the 8 series destroyed the competition for the better part of a year? Working in an IT shop all day, you run into quite a few people willing to spout their insane agendas while you just have to nod and smile. Fortunately, here, I can ask that you give some sort of proof for what you're saying.

[citation][nom]neiroatopelcc[/nom]I've got a 4870 and I can honestly say 10.1 makes NO DIFFERENCE in the real world. But then PhysX doesn't either (unless you play that Mirror's Edge thing - which is EA, thus you don't). Buy what is cheapest.[/citation]

There are quite a few titles that support PhysX; check the wiki. One of the best uses I can think of was in Brothers in Arms: Hell's Highway. It really does make an attractive difference, and I hope it's in Crysis 3 in some form. Crysis can have great physics; it just slows the CPU to a crawl.

[citation][nom]invisik[/nom]Cheapest GTX 260/4870: GTX 260: $145 http://www.newegg.com/Product/Prod [...] 6814127430 | 4870: $125 http://www.newegg.com/Product/Prod [...] 6814131140[/citation]

Ahem. The 1GB 4870. Not the 512. Thanks, though.

[citation][nom]zmanz[/nom]Wow, people can't seem to understand that a game can't hate a graphics card. ATI just can't seem to get any good drivers done lately... hmmm... Since the 9800 PRO...[/citation]

Thank you. I find it hilarious that ATI's honest answer for why the 4770 runs Far Cry 2 so well is "We don't know."

I guess if I don't want bad marks I should add that nvidia sucks and tom's sucks and...oh wait, I just realised that I'm reading a FREE article and since I can choose whether or not to read here, maybe I should stop whining!

I don't mean to be inflammatory, but the internet is spoiled. In all honesty, the prices were quoted strangely, and Zotac seems to have a special place in Tom's heart, but if you have half a brain you can read the benchmarks for yourself and do your own homework.
 
[citation][nom]steiner666[/nom]Am I missing something here? Is Crysis/Warhead no longer one of the most demanding games for GPUs? Why have games like L4D and FO3? Most people who own, or are thinking about buying, any of these cards will more than likely have frame rates well above their refresh rate in those games. I think people are more concerned with whether or not they'll be able to run the beastly games at over 30 FPS with these cards than with whether they can get 100+ FPS in the less demanding ones. I also think it would be nice to see the frame rates of these GPUs when running games with Nvidia 3D Vision, as that will be a major determining factor in my next GPU purchase.[/citation]

Left 4 Dead is a competitive game, and competitive gamers want every fps they can get. 100+ fps isn't an uncommon goal for TWL, CAL, and the other competitive leagues.
 
[citation][nom]Ramar[/nom]I don't mean to be inflammatory, but the internet is spoiled. In all honesty, the prices were quoted strangely, and Zotac seems to have a special place in Tom's heart, but if you have half a brain you can read the benchmarks for yourself and do your own homework.[/citation]

Well, that's fine if Tom's issues a disclaimer admitting it's biased, because otherwise people without half a brain should be able to trust that a review is an honest analysis. Of course mistakes happen, but I doubt the reporting of the price was a genuine mistake.

Also, I hadn't heard of 'The Last Remnant' until this article, so I am a little surprised that so many people would have emailed in asking to include it as a benchmark. Obviously, though, ATI needs to sort out better support for this game.

I used to put a lot of weight on Tom's reviews, but not any more, as they are just not trustworthy IMHO.

I will have to find a source later for the DX10.1 thing, as I remember reading an article about it before DX10 was released, which was a little while ago.
 
[citation][nom]cangelini[/nom]Just an FYI guys, The Last Remnant was added to the benchmark suite after we received enough reader feedback asking for it--same with GTA 4. You'll likely see the suite vary from time to time based on the email and comments we're getting, but it certainly has nothing to do with the hardware being tested. And kudos to whoever it was that said TLR shouldn't be pulled just because ATI doesn't perform well in it. If anything, fans of the game should be clamoring for better driver support in it from ATI![/citation]

Can you please explain why you don't benchmark HAWX using DX10.1 for the ATI cards?
 
[citation][nom]proofhitter[/nom]..."ATI cards were a lot faster than Nvidia ones using DirectX 10 thanks to DirectX 10.1 and that was unacceptable. ...[/citation]
If it's better, it's better.
Also, if a game gets higher frame rates from PhysX support, then the PhysX card is better.
 
According to the graphs, for my type of games I'll need at least a Radeon HD 4850. That's a bit surprising, since I thought the HD 4670 would do just fine, with its near-perfect power draw.
But even the HD 4770 seems to lag in certain games.

In fact, most of these titles only run well starting from an HD 4870 (among ATI's products) at maximum detail settings.
Then again, it's fairly easy to disable AA and decrease the resolution to get acceptable fps.

It's a pity... I'd love to see a card performing like an HD 4850 or HD 4870, but with the power consumption of the 4670.
 
Nice article on mainstream graphics cards with plenty of good data.

I would have liked to have seen the complete graphics temperatures for 2D and 3D, as well as the slot temperatures.

Also, I would really appreciate it if the 3DMark06 scores were in a table I could load into Excel so I can do my own analysis.

I am interested in the best performance per watt (after compensating for chassis load). Performance to me is more than raw frame rate; it is also frame rate per watt. Many of these cards are higher performance simply because they draw more power. If you normalize to 3D power consumption, the 9600 GT and 4770 really shine. I am interested in cards in the 7950 GT/9600 GT power consumption class -- no more than 65 W.
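For illustration, here's a minimal sketch of the normalization I mean. Every fps and wattage figure in it is a made-up placeholder, not a number from the article:

[code]
# Rough sketch of frame-rate-per-watt normalization.
# All numbers here are illustrative placeholders, not measurements.

cards = {
    # name: (average fps, measured 3D system draw in watts)
    "9600 GT": (45.0, 61.0),
    "HD 4770": (58.0, 60.0),
    "GTX 260": (72.0, 136.0),
}

CHASSIS_LOAD_W = 0.0  # set to your measured baseline system draw, if known

for name, (fps, watts) in cards.items():
    eff = fps / (watts - CHASSIS_LOAD_W)  # fps per watt attributable to 3D load
    print(f"{name}: {fps:.1f} fps / {watts:.1f} W = {eff:.2f} fps/W")
[/code]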
 
[citation][nom]marraco[/nom]If it's better, it's better. Also, if a game gets higher frame rates from PhysX support, then the PhysX card is better.[/citation]

Agreed on both points. I think we all agree on that, and wish Tom's would take note.
 
Sorry for the double post, but I messed up the quote: comparing HAWX in DX10.1 is fair, as is comparing games with PhysX support. Though I guess they just tend to stay away from both.

And to the guys worried about power draw, a 4770 runs just about everything comfortably if you're willing to sacrifice a few settings.
 
Funny thing about the Last Remnant... set shadows to low and it runs beautifully. I'm pretty sure TLR uses PhysX for the shadow calculations, so of course ATI cards will be at a disadvantage. On my modest rig (Phenom II X3 720, 4GB DDR2-1066 RAM, HD 4870 512MB with the latest drivers), TLR runs perfectly at 1440x900 with everything except shadows maxed. I haven't done a real test, but I would say the FPS never drops below 60 or 70, and is often above 100.
 
I am just happy that I am not the only one complaining about the lack of DX10.1 benchmarks, and that those comments are getting many thumbs up.

Frame rates were good overall, but AA reduces 3D performance by as much as 50%.

Well, all the more reason for DX10.1, as that is one of the things DX10.1 addresses: Nvidia GeForce GTX 275 896MB Review: Tom Clancy's HAWX (DX10 / DX10.1)

Especially since you said: "The graphics differences between DX9 and DX10 are huge, as sunlight effects are sharply reduced and the haze over landscapes and cities goes missing."

So you want DX10 if your card is fast enough, but if you want AA and have DX10.1, you're not going to use it?

I too would like to know the reason for the omission of DX10.1 here.
 
proofhitter, cinergy, NuclearShadow, scrumworks, Summer Leigh Castle, and da bahstid said it all.
But I have to say it again: NO MERCY FOR NVIDIA IF IT'S CLEAR THAT ATI WINS!! And ATI IS the winner today. I can't help thinking that the Last Remnant was added at the end of writing this article just to let Nvidia win again.

NOT COOL
 
It doesn't matter how cheap they are; I won't buy one unless it fits in my case. Why do they have to keep making them bigger and bigger?? I mean, manufacturers put lots of effort into coming up with the ATX specifications so cases would be made to accommodate motherboards and PSUs properly, and then the moronic graphics card makers think it's perfectly acceptable for a card to be longer than the mobo is?!?!? Computers are supposed to be getting SMALLER, NOT BIGGER. So in 10 years' time will we have the room-sized graphics card? And everybody will think that is awesome?

I can only hope 40nm brings with it some smaller, powerful cards.
 
[citation][nom]matt87_50[/nom]It doesn't matter how cheap they are; I won't buy one unless it fits in my case. Why do they have to keep making them bigger and bigger?? I mean, manufacturers put lots of effort into coming up with the ATX specifications so cases would be made to accommodate motherboards and PSUs properly, and then the moronic graphics card makers think it's perfectly acceptable for a card to be longer than the mobo is?!?!? Computers are supposed to be getting SMALLER, NOT BIGGER. So in 10 years' time will we have the room-sized graphics card? And everybody will think that is awesome? I can only hope 40nm brings with it some smaller, powerful cards.[/citation]

OK, imagine this: there were CRT monitors/screens and they were awesome (and still are), but people needed something new and better (not exactly, but w/e), so LCDs were invented. Do you know the biggest CRT size? I'm guessing less than 30". And how big are new LCDs? The average size is around 50". Hope you get my point.

If you want PERFORMANCE, the technology that would make it possible to put two RV770 or two GTX 275 chips on a PCB the size of a GeForce 3's doesn't exist yet.
 
@ambictus
and @cangelini

I don't object to the fact that you test TLR, but I think it is unfair to include the results in the overall performance graph without a mention, as it clearly misleads anyone who doesn't read all 19 pages of the article. It would be better if you just told readers that the game can be unplayable with some ATI cards but excluded the game from the overall results.
If you include TLR in the overall results, thus bringing all the ATI cards down (or bringing the Nvidia ones up), to inform people they can't play this particular game, without even mentioning it under the graph, I'm sorry, but I don't understand. And if you did this, why didn't you score 0 FPS for any card that couldn't run HAWX in DirectX 10/10.1 mode?

HAWX test on hardwarecanucks

I'm neither pro-ATI nor pro-Nvidia. The GTX 295 is simply the best single card as of today, but I think it's a completely different story in the mainstream market.

As for the final result, using your own graphs including TLR (which disadvantages the 4870):
ZOTAC GTX 260 at $175 on Newegg vs. HIS HD 4870 1GB at $160 on Newegg; that's a 9.4% price difference.
At 1600x1200 4xAA: +5.4% for the GTX 260 --> I'd take the 4870
At 1920x1200: +3.7% for the GTX 260 --> I'd take the 4870
At 1920x1200 4xAA: +7.4% for the HD 4870 --> I'd take the 4870
At 1920x1200 8xAA: +6.7% for the HD 4870 --> I'd definitely take the 4870.
So why do you pick the GTX 260? When you translate an article, be sure to translate the prices too, since price was the point of the whole article. (A quick sanity check of this math is sketched below.)
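To make the value comparison explicit, here's a rough sketch of the arithmetic. The prices and performance deltas are the ones quoted above; the break-even rule, "the GTX 260 only wins on value if it's faster by more than its price premium," is my own simplification:

[code]
# Sanity check of the price-vs-performance comparison above.
gtx260_price, hd4870_price = 175.0, 160.0

# Positive = GTX 260 faster by that percentage; negative = HD 4870 faster.
perf_delta_pct = {
    "1600x1200 4xAA": +5.4,
    "1920x1200":      +3.7,
    "1920x1200 4xAA": -7.4,
    "1920x1200 8xAA": -6.7,
}

premium = (gtx260_price / hd4870_price - 1) * 100
print(f"GTX 260 price premium: {premium:.1f}%")  # -> 9.4%

for setting, delta in perf_delta_pct.items():
    better = "GTX 260" if delta > premium else "HD 4870"
    print(f"{setting}: GTX 260 {delta:+.1f}% perf -> better value: {better}")
[/code]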

For all these reasons, and despite what you've said, I still think this article is biased and that TH uses its *cough* credibility to mislead the average guy seeking information on which graphics card to buy.
 
@proofhitter

Very well said, and what I wanted to do, had I had enough time, back on page 1. To be honest, though, the cheapest GTX 260-216 on Newegg is ten dollars less than the Zotac, bringing the price difference to five dollars. Since pricing is that close, you really have to examine the games you're going to play and decide which are more important. Or simply, which company you like best.

Also remember [as someone else was nice enough to point out] that these benches are run on a Core i7, which effectively gimps the GeForce scores. Anyone buying a graphics card in this range is probably running a Phenom or an LGA 775 chip, in which case the GeForce would almost definitely outperform the 4870.

And to all of you screaming about an "ongoing" Nvidia bias, read http://www.tomshardware.com/reviews/radeon-hd-4870,1964.html where Tom's trumpeted the great coming of the 4870, even with TLR. Those tests were on an LGA 775 chip, so pretend the 260's numbers are around 5-10% higher and you've got the 216, proving my point that they're near-identical cards at near-identical prices.



 
Oh and, @matt,

Most "gamers" run full tower cases for better airflow and other reasons, and most mid towers are being shaped to better suit long cards. Where metal beams used to block a card that was too long, they've started pulling those beams back toward the front of the case.

I still doubt we'll see cards much bigger than this; it's getting to the point where they barely fit in a full tower.
 