Temperature issues while CrossFiring XFX HD 6950s

hotlanta22

Distinguished
Nov 27, 2010
I recently purchased two XFX 6950s from Newegg to CrossFire: http://www.newegg.com/Product/Product.aspx?Item=N82E16814150550

Once both were installed, the primary card would idle around 65C and get up around 90C while playing games. Both of those values seemed really high to me.

Last night I removed one of the cards to see what the temps would be. The single card would idle around 42C and get up around 70C while playing games.
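In case anyone wants to reproduce the comparison, a quick script like the sketch below can log the core temp once a minute while you idle and then game. It assumes the Linux Catalyst driver's aticonfig tool and its usual "Temperature - 65.00 C" output; on Windows, GPU-Z's or MSI Afterburner's log-to-file feature does the same job.

```python
import re
import subprocess
import time

# Matches the temperature line aticonfig prints, e.g. "Temperature - 65.00 C"
TEMP_RE = re.compile(r"Temperature\s*-\s*([\d.]+)\s*C")

def gpu_temp(adapter=0):
    """Read the core temp of one adapter via the Catalyst aticonfig tool."""
    out = subprocess.check_output(
        ["aticonfig", f"--adapter={adapter}", "--od-gettemperature"],
        text=True,
    )
    match = TEMP_RE.search(out)
    return float(match.group(1)) if match else None

if __name__ == "__main__":
    # Log once a minute; run it while idle for a while, then during a game.
    while True:
        stamp = time.strftime("%H:%M:%S")
        print(f"{stamp}  GPU0: {gpu_temp(0)} C")
        time.sleep(60)
```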

When the cards are in CrossFire they sit basically right up against each other, with maybe a sliver of space between them, which no doubt contributes to the increased heat.

So my question: is 65C idle / 90C load too high? If so, what are my options?

Specs:
-CM Storm Scout case w/ 750W Silencer Mk II PSU
-i7-950 w/ CM V8 cooler
-GIGABYTE GA-X58A-UD3R
-CORSAIR XMS3 6GB
-2x XFX 6950
-60GB Agility II SSD (boot drive)
-Spinpoint F3 500GB
-ViewSonic X Series VX2260wm Black 21.5"

 

psyxix

Distinguished
Sep 4, 2011
0_0 65 at idle? Wow, that's kinda high. The 90 is quite high too, although not that alarming. The two cards are too close together; if your motherboard has a third x16 slot, I'd suggest using the two slots that are farthest apart. You'll also want to buy yourself 1-2 more 140mm fans for the side panel to bring those temperatures down. It's pretty obvious this is an airflow problem; increase the airflow and you'll see the temperatures improve drastically.
 

arson94

Distinguished
Apr 18, 2008
90C is high, but not too high for video cards. Current-generation video cards are rated for 100C and above. I found this info somewhere before, but I just can't remember where. I'll say the temps are fine as long as those GPU heaters aren't raising your ambient case temps and/or CPU temps to the point of surpassing their limits.

On another note, I think it should be mandatory to list max operating temps on the manufacturer's product page. I just spent 30 f*ckin minutes trying to Google it, and that sh*t pisses me off. I mean, AMD does it for their CPUs, why not their GPUs?

Sorry, I've been pissed off about this for years lol.