AMD Radeon HD 4890 X2's Coming

[citation][nom]neiroatopelcc[/nom]it's an e6600 @ 380x9 on a ga-p35-ds4 board (see my profiles pc conf)[/citation]
It MIGHT bottleneck at lower resolutions, but at 1920x1200 with the CPU at 3.5GHz it shouldn't be much of a problem.
 
Just wow. A 4890 X2, plus a screen capable of displaying the signal that card will pump out, will probably cost more than my entire new PC build plus a few hundred. Anyone who builds this, call me: I would like to grill steaks on your GPU cooler. It should be large enough, and hot enough, to blacken a T-bone.

Nvidia's GPUs seem to have hit a wall when it comes to dual-GPU solutions. Of course, I think that after the 4890 X2, ATI's next model will have to include 40nm parts or they will be just as stuck. I think we are about to see the limits of what the current generation of GPUs and the manufacturing process are capable of. 40nm and under will have to be achieved, but at least ATI is already on that track with the RV740.
 
newsflash: when running two or three 4890x2 cards in crossfire, Radeon suggests forgoing the power supply and simply running a power cable to your nearest nuclear power plant or electrical substation.
 
[citation][nom]rkaye[/nom]newsflash: when running two or three 4890x2 cards in crossfire, Radeon suggests forgoing the power supply and simply running a power cable to your nearest nuclear power plant or electrical substation.[/citation]
You expect computer electronics to be solar powered or something? They use some current - and it has to come from somewhere. More hardware requires more power. It's really simple.
 
[citation][nom]zipzoomflyhigh[/nom]Nvidia just figured out how to put 2 gpu's on a single pcb and needed a die shrink to do it. ATI is about to roll out 40nm which will put an end to Nvidia's dominance. Nvidia needs a platform, or they cant compete with Intel and AMD.[/citation]

Um, no. Nvidia joined two of their new GTX 275s at the hip with a cooler sandwiched between them. ATI is the one that designs a special PCB for its dual-GPU cards (going all the way back to the 7900 GX2 and the X1950 Gemini).
 
lol, I'm running GTX 295s in 3-way SLI. Crysis on maximum at 200fps; I had to limit the fps to 60 because it ran way too fast. Makes this card seem like a joke XP. So you guys know, that's 1792MB x 3.
 
Really, angee? I suggest looking into selling that 3rd card, as it makes no sense when you can only use a max of 4 GPUs. I really hope you're lying; otherwise you just wasted a whole lot of money.

I really hope the 4890 X2 comes out soon, especially since I can't find any manufacturers on Newegg that I like at the moment. The rest went out of stock, and the GTX 295s were taken off the page...
 
I love the comments like "4GB? Who needs that??" - yes, it's insane now, but in 10 years' time it will be the standard minimum. lol, I remember my dad back in 1996 saying "a 1GB hard drive? Who needs that??"
 
Imagine this - a Radeon HD 4890 X2 in triple CrossFire... OMG, 12GB
 
[citation][nom]fonzy[/nom]Besides Crysis is there any game out there that needs that much power? Not being sarcastic just curious.[/citation]

All the games coming out next year, and also DirectX 11, if you've heard.

 
lol, until now I was just about to buy a 4850; I can't wait for a 4890.

I've been using an Nvidia GeForce MX440 @ 64MB for 8 years, and now an upgrade to 4GB of VRAM... I can't believe it.
 
The GTX 295 is far from being overkill. There are a number of games that bring it to its knees, even if you run at 1680x1050. I like my games to run at over 60fps with everything maxed. I would trade my GTX 295 in a heartbeat for an ATI 4890 X2.
 
This will probably be the first card that needs 2x 8-pin PCIe connectors (due to PCI-SIG's habit of forcing designers to massively overrate their TDP figures)... or it'll be the first to drop PCIe 1.x backward compatibility 😱 Probably 280W+ real-world consumption (an HD4890 is 120W a pop, but this is two HD4895s on one PCB), so expect a listed TDP around 375W!
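A rough sanity check on those connector limits (assuming the standard PCIe power-budget figures: 75 W from the slot, 75 W per 6-pin connector, 150 W per 8-pin connector):

$$75\,\text{W (slot)} + 150\,\text{W (8-pin)} + 150\,\text{W (8-pin)} = 375\,\text{W}$$

while the common slot + 6-pin + 8-pin layout tops out at $75 + 75 + 150 = 300\,\text{W}$, which leaves little headroom over a 280 W+ real-world draw.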

But if this thing really IS an HD4895 X2, it'd take back the single-card crown, no sweat. Hell, the opposition is two GTX275s sandwiched together, and a single HD4895 will probably trade blows with the GTX285...
 
Graphics card wars are getting ridiculous! Can I get an Amen to that?
 
[citation][nom]marokero[/nom]I wish these GPUs were less power hogs to achieve such levels of performance. More efficiency using the same amount of power would've been better, from both AMD and Nvidia.[/citation]
I don't think ATI has produced "power-hogging" cards; if anyone has, it's been Nvidia. One GTX 295 (dual GPU) needs a PSU of around 680 watts. Two 4890s (1GB) are rated for around 600 watts. One 4890 needs only 500 watts. 500 watts is still pretty big... but in comparison, ATI is way more efficient. (The same goes for the price of ATI products as well.)
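Putting those quoted PSU recommendations side by side (taking the figures above at face value):

$$680\,\text{W (one GTX 295)} > 600\,\text{W (two HD 4890s in CrossFire)}$$

i.e., two ATI cards are rated for a smaller supply than a single dual-GPU Nvidia card, by roughly 80 W.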
 