AMD Radeon HD 4890 X2's Coming


marokero

Distinguished
Feb 12, 2009
47
0
18,530
I wish these GPUs didn't have to be such power hogs to achieve this level of performance. Getting more performance out of the same amount of power would've been better, from both AMD and Nvidia.
 

deltatux

Distinguished
Jul 29, 2008
335
0
18,780
I agree on power efficiency and heat efficiency; to me that's more important than speed boosts. Just the thought of a dual-slot card makes me cringe, since the next GPU upgrade I get will most likely take up two slots, and that leaves me feeling uneasy because it means sacrificing a PCI slot. I already lost my only PCIe slot because my heatsink controller needs the shortest path to the outside of my case.
 

hellwig

Distinguished
May 29, 2008
1,743
0
19,860
A follow-up to the GTX 295 after ATI releases the 4890 X2? Unlikely. It took Nvidia a year to make a dual-GPU card out of their GT200 series (they needed a die shrink first), and even then they had to underclock the chips. I'm not sure Nvidia has anything else up their sleeve. They blew their wad with the GTX 280. I think the GT200 is as fast as it's going to get; we'll have to wait for the next generation.
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
"both GPUs running at least at 1 GHz core clock speed"
"customers should be seeing 2 GB and 4 GB cards"

HOLY JESUS CHRIST! A single 4890 @ 1 GHz beats anything the GTX 285 can do, even OC'd! This might not only be the fastest card; two of them might be the fastest setup!

The heat these things will produce will be huge. I guess it's time to get another GTX 480 radiator...
 

rodney_ws

Splendid
Dec 29, 2005
3,819
0
22,810
[citation][nom]hellwig[/nom]A follow-up to the GTX 295 after ATI releases the 4890 X2? Unlikely. It took Nvidia a year to make a dual-GPU card out of their GT200 series (they needed a die shrink first), and even then they had to underclock the chips. I'm not sure Nvidia has anything else up their sleeve. They blew their wad with the GTX 280. I think the GT200 is as fast as it's going to get; we'll have to wait for the next generation.[/citation]

Though not from a reliable source, I did read that Nvidia is about to tape out their GT300 processor line.
 

FlayerSlayer

Distinguished
Jan 21, 2009
181
0
18,680
[citation][nom]mlcloud[/nom]Probably not. But it's an easy futureproof.[/citation]The problem with futureproofing your machine is that you're spending $600 now on a card that will cost $300 by the time you actually need it. Personally, I'm happy with my GTX 285 1 GB, which does everything I need it to do rather well. No, it's not the best out there, but the diminishing returns of performance versus price at the bleeding edge are too rich for my blood.

4 GIGS OF RAM though?! Holy frak! What needs that?
 

HTDuro

Distinguished
Feb 15, 2007
77
0
18,640
A real joke to get more money... 4 GB of VRAM... kidding me... useless.

Both Nvidia and ATI are funny... they just OC their last-generation cards to get something better than the previous one... and put two of them on one card... God, I miss the time when ATI and NV had a real fight... GF4 Ti, ATI 9800, 7800 GTX, 8800 GTX, etc. That was a real fight... now it's a kids' one.
 

Kari

Splendid
This beast would still operate under 300 W, per the PCI-SIG's PCIe power specs. The 4870 X2 is already at 260 W, so this new card should be around the same...

Could this be a 40nm device?? It seems doubtful that they could squeeze such high clocks out of both cores and remain in that power envelope with a 55nm chip.
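For a quick sanity check on that 300 W ceiling (a sketch, assuming the usual PCIe limits of 75 W from the slot, 75 W per 6-pin connector, and 150 W per 8-pin connector):

[code]
# Rough PCIe add-in-card power budget (assumed connector limits)
slot_w = 75        # power drawn through the x16 slot itself
six_pin_w = 75     # one 6-pin auxiliary connector
eight_pin_w = 150  # one 8-pin auxiliary connector

budget_w = slot_w + six_pin_w + eight_pin_w  # 75 + 75 + 150
print(budget_w)    # 300 -- the ceiling referred to above
[/code]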
 

marokero

Distinguished
Feb 12, 2009
47
0
18,530
[citation][nom]FlayerSlayer[/nom]4 GIGS OF RAM though?! Holy frak! What needs that?[/citation]

If AMD or Nvidia made drivers available to use these cards for 3D content creation, many programs like Maya, 3D Studio, Lightwave, etc., might be able to take advantage of the memory. But that would cannibalize their FirePro and Quadro sales, so who knows...
 

blppt

Distinguished
Jun 6, 2008
569
89
19,060
[citation][nom]HTDuro[/nom]A real joke to get more money... 4 GB of VRAM... kidding me... useless.[/citation]

It's really closer to 2 GB effective (two GPUs in CrossFire, remember)... which CAN be used by GTA IV, if nothing else. Maxing out the draw distance uses about 1.5 GB of VRAM at the highest settings.
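To illustrate the mirrored-memory point (a sketch, assuming alternate-frame rendering, where each GPU keeps a full copy of every texture and buffer):

[code]
# In AFR CrossFire/SLI each GPU renders complete frames, so assets are
# duplicated in each GPU's local memory rather than pooled.
advertised_gb = 4              # marketing figure: 2 GB per GPU x 2 GPUs
gpu_count = 2
effective_gb = advertised_gb / gpu_count
print(effective_gb)            # 2.0 -- what a game can actually use
[/code]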
 

wira020

Distinguished
Apr 9, 2009
63
0
18,630
Whatever happened to the 32nm die-shrink process??? The RV740?... or was it 40nm?... I thought that was their direction?

I also bumped into a report online that the GTX 295 will be shrunk to a single slot... don't know if it was just a rumour... think of the possibility of a GTX 295 X2 in a dual-slot design... now that's a beast...

I wonder why AMD/ATI still can't lower their power usage, since Nvidia's card can do a dual-GPU design with just the power draw of one GPU...

Whatever it is, it's good for us... since we're going to see a lot more price drops... I know that's the trend... but now prices are dropping at speeds we never imagined before...
 

eklipz330

Distinguished
Jul 7, 2008
3,034
19
20,795
You know, I kinda saw this coming... if you go to Newegg, you can see only two 4870 X2s... so they're pretty much killing off their stock before bringing in the new guns, probably gonna be at the same price.
 

wira020

Distinguished
Apr 9, 2009
63
0
18,630
[citation][nom]TheViper[/nom]Couple two together in CrossFire and we might finally see Crysis on Very High, 8XAF, 4XAA, 2560 x 1600 @ 40+ FPS.[/citation]

Haha... for the mainstream, that sure won't happen...
 

mlcloud

Distinguished
Mar 16, 2009
356
0
18,790
[citation][nom]FlayerSlayer[/nom]The problem with futureproofing your machine is that you're spending $600 now on a card that will cost $300 by the time you actually need it. Personally, I'm happy with my GTX 285 1 GB, which does everything I need it to do rather well. No, it's not the best out there, but the diminishing returns of performance versus price at the bleeding edge are too rich for my blood. 4 GIGS OF RAM though?! Holy frak! What needs that?[/citation]

Futureproofing was always a luxury reserved for those who could afford it. I'm sure we can all agree, though, that a pair of 4890s clocked at 1 GHz with 2 GB of GDDR5 each (what!?) would leave your computer some room for Crysis-level graphics, provided that game developers continue to disregard the performance capabilities of real-world computers and graphics cards ._.;;
 

yoda8232

Distinguished
Jan 7, 2009
66
0
18,630
[citation][nom]wira020[/nom]Whatever happened to the 32nm die-shrink process??? The RV740?... or was it 40nm?... I thought that was their direction? I also bumped into a report online that the GTX 295 will be shrunk to a single slot... don't know if it was just a rumour... think of the possibility of a GTX 295 X2 in a dual-slot design... now that's a beast... I wonder why AMD/ATI still can't lower their power usage, since Nvidia's card can do a dual-GPU design with just the power draw of one GPU... Whatever it is, it's good for us... since we're going to see a lot more price drops... I know that's the trend... but now prices are dropping at speeds we never imagined before...[/citation]

Those were 40nm cards, and they're coming out soon, early May at the latest. Max TDP of 80 W, but with more performance than an HD 4830 and almost the performance of an HD 4850. GDDR5 for $100?
I'm waiting for that card.
 