The AMD Radeon HD 6990 Dual GPU Card is Huge

I really hope it's at least $30-60 cheaper than buying two 6970s separately. Will it use GDDR5 memory or something newer? 4GB of video memory on one card will be awesome; I'm just wondering whether both cores will have access to all 4GB or if it will be a 2GB-per-core affair. Each approach has its drawbacks.

The best part about the dual-GPU design is that some games, such as EVE Online, don't take advantage of dual GPUs, so having two in one card will increase performance for those games and scale better. Also, if I buy this card in March/April, there won't be a single card that beats it for at least a year, as opposed to buying something and watching it become obsolete within a few months. I regret not buying the dual 3870 card when it was first available, and this time around I won't be left behind!
 
Somewhere, there's an oblivious idiot who will Crossfire and overclock two or even three of these. He'll have to buy a 1500W PSU. At 85% efficiency, that pretty much tops out a common 15A circuit (never mind the 30" monitors he's probably using); maybe he can fit in a desk lamp. He'll really need to add a dedicated 20A line.
Just to play games.

What a waste...a colossal waste...
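
For anyone curious, the circuit math here checks out. A quick back-of-the-envelope sketch (assuming 120V North American mains and the 85% efficiency figure above; both are assumptions for illustration):

[code]
# Rough wall-draw estimate for a fully loaded 1500W PSU.
# Assumed: 120V mains and 85% PSU efficiency, per the post above.
psu_output_w = 1500                 # rated DC output
efficiency = 0.85                   # assumed efficiency at this load
mains_voltage = 120                 # assumed wall voltage (V)

wall_draw_w = psu_output_w / efficiency        # ~1765W pulled from the wall
circuit_limit_w = 15 * mains_voltage           # 15A circuit -> 1800W
headroom_w = circuit_limit_w - wall_draw_w     # ~35W to spare

print(f"Wall draw: {wall_draw_w:.0f}W of {circuit_limit_w}W "
      f"({headroom_w:.0f}W headroom)")
[/code]

About 35W of headroom: just enough for that desk lamp.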
 
It's impressive that they're able to keep the power requirements to a 6 + 8 pin configuration. So at max that could potentially supply this card with about 300W, and I'm guessing actual TDP will probably be pretty close to that. I've heard rumors indicating that the GTX590 will require an 8 + 8 pin configuration, although I haven't seen any leaked images of the card yet. This thing will probably be quite a beast and I can't wait to see the benchmarks once the NDA lifts.
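
For reference, that ~300W ceiling just comes from summing the PCIe power budgets. A quick sketch (the 75W/75W/150W figures are the standard PCIe spec limits):

[code]
# In-spec power budget for a 6+8 pin PCIe card.
slot_w = 75        # PCIe x16 slot
six_pin_w = 75     # 6-pin PEG connector
eight_pin_w = 150  # 8-pin PEG connector

total_w = slot_w + six_pin_w + eight_pin_w
print(f"Max in-spec board power: {total_w}W")  # 300W
# The rumored 8+8 pin GTX 590 would be 75 + 150 + 150 = 375W.
[/code]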
 
[citation][nom]hunter315[/nom]I think they may need to move to triple-slot rather than keep making these things longer. How long is that? I don't think it will fit in anything but an open-air case. @mindless, yeah, I hate you for that too.[/citation]

I believe I read somewhere that the length is actually slightly shorter than the 5970's, so I don't think we'll be breaking any size records with this. Anyone know the die size comparison between Cayman and Cypress offhand?
 
[citation][nom]Emergancy exit[/nom]I really hope it's at least $30-60 cheaper than buying two 6970s separately. Will it use GDDR5 memory or something newer? 4GB of video memory on one card will be awesome; I'm just wondering whether both cores will have access to all 4GB or if it will be a 2GB-per-core affair. Each approach has its drawbacks.[/citation]
Unless AMD changes the way Crossfire works, each GPU will have its own separate 2GB of GDDR5; the 4GB will not be shared between the GPUs. This can be a potential drawback at lower memory capacities, such as the GTX295 with 896MB per GPU, but I honestly don't think it's in any way an issue here. When gaming at the highest resolutions today, 2GB of memory will not be the limiting factor for an HD 6970/6950.
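
To illustrate the mirroring point with a toy sketch (the function below is hypothetical, purely for illustration; real allocation is handled by the driver): under alternate frame rendering, each GPU keeps a full copy of the working set, so the usable capacity is the per-GPU amount, not the advertised total.

[code]
# Hypothetical illustration of AFR memory mirroring on a dual-GPU card.
def effective_vram_mb(per_gpu_mb, gpu_count):
    """Under AFR every GPU mirrors the same working set, so usable
    capacity stays at the per-GPU amount, not the advertised sum."""
    advertised_mb = per_gpu_mb * gpu_count  # what the box says
    usable_mb = per_gpu_mb                  # what games can actually use
    return advertised_mb, usable_mb

print(effective_vram_mb(2048, 2))  # (4096, 2048): a "4GB" 6990 acts like 2GB
print(effective_vram_mb(896, 2))   # (1792, 896): the GTX295 case above
[/code]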
 
[citation][nom]jtt283[/nom]Somewhere, there's an oblivious idiot who will Crossfire and overclock two or even three of these. He'll have to buy a 1500W PSU. At 85% efficiency, that pretty much tops out a common 15A circuit (never mind the 30" monitors he's probably using); maybe he can fit in a desk lamp. He'll really need to add a dedicated 20A line. Just to play games. What a waste...a colossal waste...[/citation]
I think the max is two, since that would already be Quadfire (four GPUs), which is the most Crossfire supports. The same went for the 5970 and the older X2 cards.
 
That card makes you feel tiny!! ATI will keep the crown for the most powerful graphics card on earth. Just guessing, it might hit a 3DMark Vantage score of 25,500 points in Performance mode. JUST GUESSING. Nvidia still 2nd...
 
Man, that's a huge card! Personally I don't think the size will be a huge issue (no pun intended) for most people, as long as it doesn't run hot and suck power. And yes, benchmarks vs. the GTX 580 will be interesting.
 