Radeon HD 4870 X2: Four Cards Compared

[citation][nom]BlakHart[/nom]my 1gb 4870 is always left out lol[/citation]

I wish they'd included the 3870x2. While it's generally equivalent to the 4850 in many cases, it might do worse (or better) in particular resolutions. It's one of the cards listed on their charts as being so close to the new generation that upgrading's not usually worth it.
 
[citation][nom]rreeewww[/nom]Why benchmark all the same pack of outdated games (except Mass Effect)? I'd like to see up to date games like Fallout 3, FarCry 2, Dead Space, Left 4 Dead, etc.[/citation]

I wish they'd include some MMORPGs too. Wrath of the Lich King and Mines of Moria are out right now. Quite a while back, either Anand's or Tom's did an article on graphics performance in WoW. It would be nice to see that updated with the expansions. Yes, we know Mass Effect and Crysis are more graphics-intensive than any MMORPG, but performance there counts for those of us who play them. I play LOTRO on an 8750 Toliman with an MSI 3870x2.
 
[citation][nom]rreeewww[/nom]Why benchmark all the same pack of outdated games (except Mass Effect)? I'd like to see up to date games like Fallout 3, FarCry 2, Dead Space, Left 4 Dead, etc.[/citation]
Same reason they won't be rerunning these tests on an i7 platform any time soon: if they used the most recent games, they'd have to retest every card, not just the four "new" ones. All the other cards in the charts were tested at some earlier time, probably for other articles.
 
Nice review, folks, but I have two points:

1- Where are the Catalyst 8.11 drivers? If 8.11 was too new for the tests, then why is 180.43 in there when it's newer than 8.11?
2- Is it just me, or is there no HD 4870 1GB?
 
[citation][nom]Pei-chen[/nom]Unless you are a fat as a cow you don't need a power house SUV to move your ass.It seems that people don’t care their 4870 X2 cost $100 more per year to run over GTX 280/260.[/citation]

No, and who said you need massive power? Who needs a 4870 X2? The same people who NEED an SUV: people who want the most power for their dollar. But you're right, I wouldn't care; $100 a year is pennies. Hell, I don't really care about the $100 a month I blow on stupid crap. Now, it's debatable whether a video card would cost you a hundred a year; just doing a little math in my head, that comment made me laugh.
 
Nice effort by Tom's, an article done by enthusiasts for enthusiasts. 😀
It seems you have completed all the useful benchmarks possible; we know roughly what the results would look like using these cards on Core i7.
Maybe we'll see benchmarks of new games before DirectX 11 cards are sampled.
 
[citation][nom]dagger[/nom]Unless you game 24/7, you'll be lucky to get $10 per year savings in electricity bill.[/citation]

[citation][nom]EnFoRceR22[/nom]No and who said you need massive power. who needs a 4870x2. Same people who NEED a suv just people who want the most power for thier dollor. But your right i wouldnt care $100 a year is pennys hell i really dont care about $100 a month that i blow on stupid crap. now its debatable a video card would cost you a hundred a year. just doing a little math in my head that comment just made me laugh.[/citation]
The fate of democracy is in the hands of people who can't do basic math.
GTX 280 on X68@2.93 - 117W
4870 X2 on X68@2.93 - 202W
Difference = 85 W
Assuming you turn off your PC for 8 hours of sleep, it's on 16 hours a day.
85 W × 16 h × 365 = 496,400 Wh = 496.4 kWh
Con Edison charges $0.20 per kWh in NYC, so 496.4 kWh = $99.28.
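The arithmetic above is easy to check with a short script (a sketch using only the figures from the post: the 85 W load delta, 16 hours a day of uptime, and the quoted $0.20/kWh rate):

```python
# Annual cost of the extra power draw, using the numbers from the post above.
DELTA_WATTS = 202 - 117   # 4870 X2 vs. GTX 280 under load: 85 W
HOURS_PER_DAY = 16        # PC off for 8 hours of sleep
RATE_PER_KWH = 0.20       # Con Edison, NYC

kwh_per_year = DELTA_WATTS * HOURS_PER_DAY * 365 / 1000
cost_per_year = kwh_per_year * RATE_PER_KWH
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}")
# -> 496.4 kWh/year -> $99.28
```

The big assumption, as later replies point out, is that the card sits at full gaming load for all 16 of those hours.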
 
[citation][nom]jerreece[/nom]LOL $100 more per year? What kind of math did you do? Seriously, the power use between video cards is not that dramatic. Even if the card used 150W more, that'd be like running two extra 75W light bulbs in your house. Your Kilowatt hour usage isn't going to jump that dramatically....If you had a 100W light bulb running 24/7 for a month you're talking about 731 hours (give or take). That's 73.1 kWh use. Here in Montana, we're paying about $0.10 per kWh which is $7.31 per month to operate a device 24/7 at 100W for a month. That's a total of $87.72 a year, to run a 100W device 24/7.So there's no way the small difference in power usage between two video cards can cost you $100 extra per year.[/citation]
[citation][nom]dagger[/nom]Unless you game 24/7, you'll be lucky to get $10 per year savings in electricity bill.[/citation]
Most people pay more for electricity.
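How much the rate matters is easy to see by plugging the same 85 W / 16 h assumption into the rates mentioned in this thread (7.5 cents, Montana's ~10 cents, and NYC's 20 cents per kWh):

```python
# Same 85 W difference, 16 h/day, priced at the rates quoted in the thread.
kwh_per_year = 85 * 16 * 365 / 1000  # 496.4 kWh

for label, rate in [("7.5 c/kWh", 0.075),
                    ("Montana ~10 c/kWh", 0.10),
                    ("NYC 20 c/kWh", 0.20)]:
    print(f"{label}: ${kwh_per_year * rate:.2f}/year")
# -> roughly $37, $50, and $99 per year respectively
```

So the "$100 a year" claim holds at NYC rates but roughly halves in cheaper markets.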

[citation][nom]ravenware[/nom]Prius is a BS answer to a real problem. There were cars getting similar mileage before it's release. This is probably the most overhyped vehicle/product of the decade.A real solution would be to abandon gasoline as a fuel source all together or use it at an absolute minimum; not to make a care that gets 40MPG.[/citation]
Find me a conventional car that can stop using gas during a traffic jam.
 
One thing that continues to confuse the hell out of me is why on FSX non-SLI GTXs outperform the SLI and why the 260 outperforms the 280 in the "1920x1200, GameAA, Game AF, Ultra Quality" bench. Did the polarity of the planet change?

Regarding fanboi comments about AMD-ATI beating nVidia here, it depends on the app/game. Every non-biased tech site out there says to look at all the game benchmarks before making a decision on going SLI or CrossFire. For me, nVidia dominates AMD-ATI in Crysis and FSX, and that's all I care about at the moment.
 
[citation][nom]Jhawke[/nom]One thing that continues to confuse the hell out of me is why on FSX non-SLI GTXs outperform the SLI and why the 260 outperforms the 280 in the "1920x1200, GameAA, Game AF, Ultra Quality" bench. Did the polarity of the planet change?[/citation]
Nah, but it looks like coding practices have gone down the drain.
 
Why no 8800GT/SLI? It's an older card, but still very popular.

And yes, the results do look rather weird in some cases. Why does Crysis show GTX 260 SLI being faster than GTX 280 SLI, but both slower than a single GTX 280? Looks like the drivers are way off in some cases for SLI/CF setups.

Ian.

 
[citation][nom]Pei-chen[/nom]Find me a conventional car that can stop using gas during traffic jam.[/citation]

You asked for one car; well, here's one that doesn't use ANY gas, traffic jam or not: the 2009 Civic GX NGV.

or we can go the electric route: www.teslamotors.com

Any "solution" that still involves oil is not a real alternative-energy solution.
 
May I suggest a better way of organizing the data, such as tables rather than long lists? Otherwise, a good article with loads of information.
 
You do realize that ASUS Smart Doctor needs to be UPDATED when the graphics card driver gets updated! Did you guys even check whether there was an update?

Without this update, the graphics card will not work properly with Smart Doctor if you install a driver that Smart Doctor can't communicate with.

ASUS Smart Doctor version 5.286, released 2008/11/18.
Smart Doctor provides 3 major features:
1. VGA card information and running status: GPU temperature, GPU fan speed (optional), and so on.
2. Overclocking.
3. GPU temperature detection.
If your VGA card supports Smart Doctor, then after installing it (don't forget to install the VGA driver and ASUS GamerOSD) you will have all 3 features.
 
[citation][nom]Pei-chen[/nom]The fate of democracy in the hands of people that can’t do basic math.GTX 280 on X68@2.93 - 117W4870 X2 on X68@2.93 - 202WDifference = 85w/hrAssuming you turnoff your PC for 8 hours during sleep = PC on 16hr.85*16*365 = 496,400 Wh = 496.4 kWhCon Edison charge in NYC $0.20 per kWh so 496.4 kWh = $99.28[/citation]

We pay 7.5 cents here. But are you assuming your card is pulling full gaming load 16 hours a day? If I install an outlet with a max load of 1000 W but never plug anything into it, should I count on my bill going up 2.1 kWh a day? If you mean the whole system, then yes, that seems a lot more likely, and I would agree, assuming idle power, which is where most computers sit for most of the time they're on during the day.
 
Is the GTX 280 in tri-SLI or dual SLI? The comparison of an X2 pair to GTX SLI won't work: it's 4 GPUs vs. 2 GPUs. If you compared tri-SLI to the two X2s, maybe it would be a fair competition.
 
4v2 is unfair but 3v2 is fair?

Triple SLI would be like 6 GPUs, and dual SLI would be 4 GPUs. Though since I'm sure you meant dual- and triple-card SLI, I'll let that slide.
 
So....

What I read was that even though the expensive 4870 X2 was good, the 260 and 280 were better, lol. GTX 260s in SLI cost only $239 per card now. Is this what I was supposed to take from the article? LOL.

Cheers,
 
Thanks for adding the excellent FPS "Enemy Territory: Quake Wars" to your test.

It's my favorite game at the moment, a game that is underestimated by many players because it's not conventional like other FPSes.

I regret not seeing a test with the RTS "Supreme Commander". Maybe next time.

Thanks for Crysis, Call of Duty, and World in Conflict too.

I have a BFG GeForce 9800 GX2 with an Intel Core 2 E8400 and 8 GB of DDR2-1333 memory, running on Windows Vista 64-bit, all updated.

Vista needs at least two cores; otherwise it sucks.

I use Fraps 2.9.6.7637 and run ETQW at 1680x1050 (strange that they don't test the standard 1600x1200 resolution) with 2x AA, I think.
And I get 60 fps all the time.

But I also have 4 hard drives in RAID 0, which helps speed things up too.
That's something that's always underestimated in benchmarks.

Thanks again and have a nice day!
 
I stopped using ATI because of their terrible drivers a long time ago. Was considering trying again due to price vs. performance, but after seeing that Catalyst is STILL a mess, I'm glad I stuck with Nvidia.
 
[citation][nom]Pei-chen[/nom]The fate of democracy in the hands of people that can’t do basic math.GTX 280 on X68@2.93 - 117W4870 X2 on X68@2.93 - 202WDifference = 85w/hrAssuming you turnoff your PC for 8 hours during sleep = PC on 16hr.85*16*365 = 496,400 Wh = 496.4 kWhCon Edison charge in NYC $0.20 per kWh so 496.4 kWh = $99.28[/citation]
Thank you, Pei-chen. When just the single cards were being reviewed and the 4870 came in a mere 1-3 watts under the GTX 260 in full 3D, the red team started screaming efficiency and blabbering about electric bills. They tried to make certain no one noticed the GTX 260 drew 30 watts LESS than the 4870 at idle / in 2D mode...
Now the tables are turned completely, and of course, I LAUGHED when I saw you got 20 minuses for the post where you merely pointed out what a fat power hog the 4870 X2 is, because "strangely" the power chart was the first one NOT sorted by winners/losers, so that ATI's power-hungry problem could be hidden down the line...
LOL
Very observant. The ATI reds are now professional liars, so expect blood everywhere. Oh boy, wait till the 295 actually gets used... lol... it's going to be more bloviating.
(It's too bad ATI can't, even to this day, match the dominance of the NVIDIA GTX 280. ATI panicked and had to combine two cards, or use CF, or both. Why can't ATI make a single-GPU card as good as the GTX 280? Because they suck, and need this much help from lying fans who get mad when the truth is pointed out? That's sure how it seems.)
 