AMD's Radeon HD 4870 X2: R700 First-Look

Status
Not open for further replies.

tipmen

Distinguished
Jun 30, 2008
244
0
18,680
0
Good job, ATI/AMD! Really good card you have here; let's hope it's cheaper than the GTX 280. I'm glad I waited it out, now I can replace my old setup. I'll get a new X48 board and two of these puppies, and I'm sure I'll need a better power supply; I don't think a 750-watt unit will do it.
 
G

Guest

Guest
Whoa, what happened to the heat and power consumption charts? Were they so far off the scale you dared not show them? The card performs nicely, but this is ATI/AMD we're talking about; they love heat and power. So what are the numbers?
 

Cmhone

Distinguished
Feb 15, 2005
21
0
18,510
0
It says at the end of the article that their power meter went kaput, and also not to expect much in the power consumption department anyway.
 

neiroatopelcc

Distinguished
Oct 3, 2006
3,079
0
20,810
9
Can someone explain to me what this means in English?
"Back when we tested the Radeon HD 3870 X2, we found that the biggest impact on performance was attributable to the board’s memory capacity, since memory on a bi-GPU card has to be divided in two. And in order to assure adequate performance, you often have to multiply the quantity of memory by four, which is what AMD does with the 4870 X2." from page 1
 
G

Guest

Guest
It's nice to see AMD/ATI challenge nVidia again; it's been nVidia's market for too long. My take is that this situation can only benefit consumers: a push for faster GPUs at more affordable prices. It will be interesting to see nVidia's run to retake the performance crown as well as the midrange price/performance crown.
 

randomizer

Champion
Moderator
[citation][nom]neiroatopelcc[/nom]Someone explain to me what this means in english? "Back when we tested the Radeon HD 3870 X2, we found that the biggest impact on performance was attributable to the board’s memory capacity, since memory on a bi-GPU card has to be divided in two. And in order to assure adequate performance, you often have to multiply the quantity of memory by four, which is what AMD does with the 4870 X2." from page 1[/citation]The HD3870X2 had only 2x512MB memory, and each GPU could only access 512MB, which held its performance back. Although I think it should say "doubled", not "quadrupled", since the HD4870X2 has 2x1GB.
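The memory mirroring randomizer describes can be sketched in a few lines of Python. The card capacities are taken from this thread; the helper function itself is purely illustrative:

```python
# With alternate-frame rendering (AFR), each GPU on a dual-GPU board keeps a
# full copy of the frame data, so memory is mirrored rather than pooled:
# the usable capacity is the per-GPU amount, not the total printed on the box.

def effective_memory_mb(total_mb: int, num_gpus: int) -> int:
    """Usable memory per GPU on a board whose memory is split evenly."""
    return total_mb // num_gpus

# HD 3870 X2: 2 x 512 MB on the box, but each GPU can address only 512 MB.
print(effective_memory_mb(1024, 2))   # 512
# HD 4870 X2: 2 x 1 GB, so each GPU gets a full 1 GB working set.
print(effective_memory_mb(2048, 2))   # 1024
```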
 

Thurin

Distinguished
Apr 8, 2008
70
0
18,630
0
.............................
Point 1:
It's great that ATI is putting up a fight. With all the fanboyism and trash talking/flaming aside, it's obvious that this rapid progression, both in raw processing power and on the overall technological level, is made possible only through competition between the titans of the graphics processing world.
.............................

BUT!

According to other valued/trustworthy sources, the 4870 X2 manages only a 1.09x increase in performance over a 1.00 baseline.

see link:
http://tweakers.net/reviews/957/4/his-radeon-hd-4870-x2-versus-asus-geforce-gtx-280-pagina-4.html (Copy paste the link if needed)
.............................

As for the significant increase stated in this article: it's great, but only applicable to games with built-in support for CrossFire technology. (Further and extensive testing would be desirable, and posting actual facts to support claims would also be duly noted. I know some testing has been done, but a full review of exactly what value the new card adds still remains to be seen.)
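The scaling figure Thurin quotes is just a frame-rate ratio. Here is a minimal sketch; the frame rates below are made-up numbers for illustration, not taken from either review:

```python
def scaling_factor(dual_gpu_fps: float, single_gpu_fps: float) -> float:
    """Speedup of a dual-GPU card over a single-GPU baseline.

    2.00 would be perfect scaling; values near 1.00 mean the second
    GPU is mostly idle (e.g. no CrossFire profile for the game).
    """
    return dual_gpu_fps / single_gpu_fps

# A title without CrossFire support barely benefits from the second GPU:
print(round(scaling_factor(65.4, 60.0), 2))   # 1.09
# A title with a good CrossFire profile gets much closer to ideal:
print(round(scaling_factor(108.0, 60.0), 2))  # 1.8
```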

.............................
Point 2:
NVidia's reply to the 4870 X2 dual-GPU card is the introduction of a 65 nm to 55 nm GPU die shrink.

Link:
http://tweakers.net/nieuws/55050/introductie-55nm-versie-geforce-gtx-280-eind-augustus.html

According to the news, this is to take place at the end of August, meaning the reply will be ready and launched by the end of THIS month.
.............................

I would like to state at this point that I'm not a fanboy for either side. I myself have switched from Nvidia to ATI and back to Nvidia, and I will keep doing so as either side dominates the market with high-quality merchandise, depending on which one is the market leader at the points where I update my machines.
.............................

My apologies to those of you who cannot read or understand Dutch.

Here is a substitute link for the interested English parties:
http://www.stokedgamer.com/2008/07/nvidia-geforce-gtx-290-launching-at.html

The reason I responded with such diligence and ferocity is not because I hate Tom's Hardware, think that ATI is doing a bad job, or believe NVidia rocks beyond compare... The reason for my reply is that I wish for consumers to be fully informed and not end up buying a more expensive product with less bang per buck than originally expected.

Stay informed, folks! (For those of you with a no-limit budget, the 4870 X2 might be the way to go; otherwise, if you are looking for a more affordable card, wait for the 55nm NVidia GTX 280 revision (the name is yet to be determined; release is expected at the end of the month).)

That concludes my rant for the day. I hope you are now fully informed, or that I have at least sparked your interest in gathering information before drawing a conclusion from a single review ;)

Look around and stay informed! ATI and Nvidia both rock, and the best of luck to both companies... may the struggle be eternal and the progress fiery and extreme!

.............................

-= Michael Out =-
 

Thurin

Distinguished
Apr 8, 2008
70
0
18,630
0
[citation][nom]Thurin[/nom] The reason I responded with such diligence and ferocity is NOT because I hate Tomshardware [/citation]

I'd like to stress that fact, as I read Tom's Hardware every day and have always enjoyed the news articles. However, I would like the news to be as unbiased as possible, and lately it seems Tom's has been slipping a little.

.............................
-= Michael Out =-
 

thepinkpanther

Distinguished
Nov 24, 2004
289
0
18,780
0
$500+ for 2 GPUs with 1GB of memory each? I guess it's worth it. I usually never spend more than $300 on a card, so I guess I'll still wait until the 1GB 4870 comes out. I think the GTX 260 would have been worth it if it had DX10.1... even though I don't know what the .1 does.
 

Thurin

Distinguished
Apr 8, 2008
70
0
18,630
0
The .1 is just an extended graphics capability suite, including all the features and extras that were planned for DX10 in the first place but couldn't make it into the initial release.

(Hence the relatively small difference between DX9 and DX10, and DX10 games being cracked to work on DX9 machines with all the same effects, save for the few bits of eye candy that make up the difference between DX9 and DX10.)
 

ZootyGray

Distinguished
Jun 19, 2008
188
0
18,680
0
So... nvidiot will continue to offer second-rate, overpriced, underperforming stuff that's obviously been price-inflated in the first place, and so... so it's easy to cut the ripoff price in the face of the genuine article? Is that about it, Thurin "Michael out"? And you are protecting me how? By baffling BS? How stupid are you? Or do you think we are?

It'll blow your doors off at really high resolution. And the drivers will evolve.

Best card. You really are out, Michael. Do they pay you for that?

BTW, how's the massive recall? Thanks for looking after our interests; now go to the corner and soil your nvidiot undeez.

ATI wins. There will be lots of reviews.
 

Thurin

Distinguished
Apr 8, 2008
70
0
18,630
0
Reading is an art too, Zooty *winks*

It's not a flame post toward ATI, and ATI does win, for now... at high cost.

Before the month is over, Nvidia will win again, at high cost... This is the cost of progress.

I'm not on either side; whichever is best, I'll go for. I'm promoting and supporting the battle between the two industry giants.

Take one out of the race and progress will become lethargic, dead slow, and prices will be insane... even more insane than they already are...

But if you choose to see me or my posts in the narrow-minded way you just exhibited, then by all means go right ahead. You are entitled to your opinion.

However, you may want to try to keep it a bit more decent.

(Side note: using some mutated version of hackz0r 1337-speak, or whatever that bit of slang is, will not improve your standing with me or anyone else, I wager... Keep it clean and I have no problem discussing any of the matters at hand with you.)
 

deadliest

Distinguished
Jul 26, 2008
11
0
18,510
0
Yeah, it's all nice, these graphics cards and stuff, but am I to play Crysis and Call of Duty 4 for the rest of my life? There are no games that need this!
 

pulasky

Distinguished
May 7, 2008
74
0
18,630
0
MEGA CRAP "REVIEW"
This crap site should be renamed NOOBIDIA'S HARDWARE; you all suck, noobs.
 

Vorador_21

Distinguished
Aug 12, 2008
3
0
18,510
0
Dear Sir,
I admire your efforts to show as much as possible, both in description and in tests, and I would like to ask a couple of questions:
1. Is it possible to run the same tests on an AMD configuration?
2. Is it possible to run the same tests with at least 8xAA/16xAF, soft shadows, and dynamic lights, plus a description of the other visual options?
I shall await your answer at your convenience.
The reason to spend so much time, research, and money, in my humble opinion, is to offer clients and gamers the best possible, most realistic images. Please forgive my words, but to have such video cards and test them at only 4xAA/4xAF is... a shame. I save as much as possible, and when I upgrade something I choose carefully the best I can afford.
Thank you in advance for your co-operation.
Yours faithfully
Vorador_21
 

Thurin

Distinguished
Apr 8, 2008
70
0
18,630
0
Keep it decent, folks... Constructive criticism is one thing, but groundless and shameless flaming is quite another. Please stay on topic and discuss the review rather than stabbing each other in the face.

Thanks
 

NeoData

Distinguished
Aug 12, 2008
13
0
18,510
0
Thurin, before you start accusing Tom's of being biased, at least get your facts right.
In the review done by tweakers.net, lower resolutions were used.
If you had bothered to do some research, you would have noticed that these high-end cards are intended for use at resolutions starting at 1680x1050; anything lower is just a waste of money, as a mid-level card will do just fine.
And at these higher resolutions the HD4870X2 is clearly the winner, making your comments here so far pointless.

 

NeoData

Distinguished
Aug 12, 2008
13
0
18,510
0
*Something went wrong while saving my post.*
Unless your point was to point out that the performance margins at lower resolutions are very similar.
 

giovanni86

Distinguished
May 10, 2007
466
0
18,790
4
[citation][nom]deadliest[/nom]there aren't any new pc games at all[/citation] Then you're obviously not really looking. [citation][nom]demonhorde665[/nom]Well, if Nvidia's last two product cycles give any clue of what to expect (the 8800 GT and 9600 GT, to be specific), I expect that by the beginning of next year they will have lower-power-consumption budget versions of their GTX 280 that will sell at a fourth of the big boy's cost yet still give somewhere around 80% of its performance, if not more. I'll wait till then before I start looking at upgrading from my 9600 GT. Here's hoping Nvidia AND ATI continue the cycle of building less power-hungry cards that perform better (for the buck) than last year's big hitter, and pray they never go back to the old days where budget cards only gave you 20% of the upper cards' ability. I just love the fact that I only paid 130 bucks for my 9600 GT and it performs only slightly behind a full 9800 GTX (or 8800 GTX, for that matter).[/citation] A 9600GT performs behind an 8800GT, as stated right here on Tom's Hardware. Only when you pair two 9600GTs in SLI do they perform better than one 8800GT. Wherever you got your information, you're wrong. Read; it helps to know things before you write them down as fact. And you're shooting for the moon with a firecracker if you think a 9600GT performs only slightly behind a 9800GTX. My 2 cents. Oh, and great article; it's too bad I'm not an ATI/CrossFire fan. Can't wait to see what Nvidia will be releasing in the coming months. Happy gaming.


 
