Overclocking Wattage Increase

Slayer697

Distinguished
May 13, 2010
204
0
18,710
I posted this in the Overclocking forum here at THG, but that forum seems to be dead, so I thought it better to bring the question over to this forum, which was so incredibly helpful in leading me to my choice of GPU:

I've been researching components for the last week as I work ever vigilantly toward building my first box in almost a decade. I've decided that my new system is getting an ASUS DirectCU 5850, and I've been reading a lot more reviews just to solidify my resolve. In learning about how voltage works with OCing, I've become very curious about what kind of increase in wattage I can expect as I overclock the card. What kind of increase in heat can I expect as well?

Keeping my system relatively quiet (a purr or very low hum, not a roar or a bonfire) is important to me, but I might change my mind as I OC my GPU and CPU. I would definitely like to see what I can get out of this model of the 5850.

I have a Corsair HX650 and I'm trying to keep my system open to future upgrades, should I want to run two 5850s in CrossFire. I have yet to decide on a motherboard/CPU combo, but the CPU will be an i5-750 or a 1055T. Any insight on the questions I've asked is appreciated.
 

borisof007

Distinguished
Mar 16, 2010
1,449
0
19,460
Hello Slayer,

I'm not huge into voltage increases on the GPU to OC the core and RAM beyond what they normally handle, but I can say that if you plan on getting that Core i5, you'll be looking at a P55 (LGA1156) motherboard; X58 boards are for the LGA1366 i7-9xx chips. Grab a P55 board that's certified for both SLI and CrossFire and you can run either without compatibility issues.

Hopefully this bump will get you some attention too : )
 

Slayer697

Distinguished
May 13, 2010
204
0
18,710
Thank you, boris. I'm actually unlikely to CrossFire in the future, on the advice of people in this forum; it makes more sense for me to sell my card (if I can) and buy a newer one when that time comes. That said, I did pose a question about x8/x8 vs x16/x16 because there were so many conflicting comments about the necessity of x16/x16. From the massive amount of reading I've done over the last week, I'd feel confident saying that a 5850 or lower will perform almost as well in an x8/x8 configuration as it will in x16/x16. The performance gain may not justify the increased cost of the motherboard, especially if I don't end up going CrossFire.
 

borisof007

Distinguished
Mar 16, 2010
1,449
0
19,460
The x8/x8 vs x16/x16 question really depends on the card's bandwidth demand (i.e., how much data it's moving to and from the CPU over the PCI-E bus).

If the card pushes more data than half the bandwidth that 16 PCI-E lanes provide (say 9-10 lanes' worth, for simplicity), then a dual x16 config makes sense. If each card is using 10 lanes' worth of data and you only have 8 lanes per card, you're going to get ... congestion! Just like real-life traffic : ) If you have 20,000 cars on a road that can only handle 16,000 at a time, you get major traffic issues.

However, if you're SLI'ing or CrossFiring two lower-end cards that wouldn't make full use of those 16 lanes in the first place (say each one uses 6 lanes' worth of data), then you'd theoretically notice no difference between an x8/x8 setup and an x16/x16 setup, because each card only needs 6 of the 8 lanes it gets.

TLDR Version: If your card uses 8 or fewer "lanes" worth of the available PCI-E bandwidth (16 "lanes" total), then an x8/x8 setup will be just fine. Keep in mind, though, that future video cards will only get faster, so an x8/x8 board will be harder to grow into down the road.
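To put rough numbers on that lane analogy, here's a quick back-of-the-envelope sketch in Python. The 500 MB/s figure is PCIe 2.0's per-lane, per-direction rate; the per-card traffic numbers are made up purely to illustrate the x8 vs x16 comparison, not measured from any real card.

```python
# Back-of-the-envelope PCI-E "lane usage" check, per the traffic analogy.
# PCIe 2.0 moves ~500 MB/s per lane, per direction (after 8b/10b
# encoding); real throughput is a bit lower due to protocol overhead.
PCIE2_MBS_PER_LANE = 500

def lanes_needed(traffic_mbs: float) -> float:
    """How many PCIe 2.0 lanes a given traffic level would occupy."""
    return traffic_mbs / PCIE2_MBS_PER_LANE

# Hypothetical per-card traffic figures, chosen only to illustrate:
for traffic in (3_000, 5_000):  # MB/s over the bus
    need = lanes_needed(traffic)
    verdict = "x8 is fine" if need <= 8 else "wants x16"
    print(f"{traffic} MB/s ~ {need:.0f} lanes -> {verdict}")
```

With those made-up numbers, the 3,000 MB/s card occupies about 6 lanes (fits in x8 with room to spare), while the 5,000 MB/s card wants about 10 (congested at x8, happy at x16), which is exactly the two cases described above.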
 

Slayer697

Distinguished
May 13, 2010
204
0
18,710
That would've been a great "best answer" for my x8/x8 vs x16/x16 thread. For now, one card is going to be great anyway.

Anyone got any insight into expected wattage increase for overclocked GPUs?
 
Hmmm, I always go with this approach: first, look up the max VRM spec of the card. A few examples:

3870: 105 W
9800 GT: 105 W
8800 GTX: 189 W
2900 XT: 215 W

These are max VRM values, not normal load values at stock clocks. Always factor them in when doing any overclocking to figure your power and cooling needs. Judging by the looks of the card and the load charts, the VRM probably tops out in the 160-200 W range.
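If you want to sanity-check an overclock against a VRM ceiling like that, the usual rule of thumb is that dynamic power scales roughly with frequency times voltage squared. Here's a minimal sketch: the 151 W / 725 MHz stock figures are the 5850's commonly cited numbers, but the stock voltage, the overclock targets, and the 180 W cap are all assumptions for illustration.

```python
# Rough headroom check against a card's max VRM spec, using the
# textbook approximation P ~ P_stock * (f_oc/f_stock) * (V_oc/V_stock)^2.
def oc_power(stock_w, f_stock, f_oc, v_stock, v_oc):
    """Estimate overclocked board power from stock draw, clocks, volts."""
    return stock_w * (f_oc / f_stock) * (v_oc / v_stock) ** 2

VRM_MAX_W = 180  # assumed ceiling, picked from the 160-200 W guess above

est = oc_power(stock_w=151, f_stock=725, f_oc=850,
               v_stock=1.088, v_oc=1.20)  # OC clock/voltage are hypothetical
print(f"estimated load ~ {est:.0f} W "
      f"({'inside' if est <= VRM_MAX_W else 'past'} the {VRM_MAX_W} W cap)")
```

With those assumed numbers the estimate lands around 215 W, past the assumed cap, which is why a big voltage bump demands serious cooling and VRM headroom.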
 

borisof007

Distinguished
Mar 16, 2010
1,449
0
19,460
I'm going to sound stupid here, but when you say VRM, do you mean the maximum wattage the card uses at stock, the maximum voltage the GPU can take for overclocking, or the maximum wattage the card can draw overall, even when overclocked?

If it's the total wattage usage for the 5850, it's approximately 151 watts under load. Overclocked (without voltage increases) it can jump a fair bit higher; I'd say a regular overclock should take it to 170-180 watts max.

Source: http://guru3d.com/article/radeon-hd-5850-review-crossfire/9
Even though it's a crossfire review, it still has the TDP for a single card on there.
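For what it's worth, that 170-180 W guess lines up with simple frequency scaling: with the voltage untouched, dynamic power grows roughly linearly with core clock. A one-liner to check, where 725 MHz is the 5850's stock core clock and the 850 MHz target is hypothetical:

```python
# Clock-only scaling: no voltage bump, so power is ~proportional to clock.
stock_w, f_stock, f_oc = 151, 725, 850  # watts, stock MHz, assumed OC MHz
print(f"clock-only estimate ~ {stock_w * f_oc / f_stock:.0f} W")  # ~177 W
```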
 

Correct, some can be modded to survive higher loads, but cooling is an issue. Plus, one has to have the tools and the skills to perform such mods.
 

On some cards you can raise the voltage slightly, but for most cards you need to do extreme mods.

[Attached image: 2900xtmod2jo7.jpg, a modded 2900 XT]