If a CPU bottlenecks a GPU by a lot, does that mean the GPU will draw a lot less power(electricity) than normal?

SeriousGaming101

I currently have an AMD 1100T CPU + GTX 1080 Ti GPU.

My CPU bottlenecks the GPU a lot. Instead of the 200 FPS on max settings I'd get in-game with a high-end CPU like an i9, I only get half of that, 100 FPS.

So my question is: if a high-end CPU paired with the GTX 1080 Ti makes the GPU draw 250 watts of power, will my 8-year-old CPU cause the GTX 1080 Ti to draw less power (like 125 watts) because of the lower FPS from the bottleneck?
 
Less GPU usage does typically mean roughly proportionally less GPU power draw: the GPU doesn't clock as high, which means its core voltage also doesn't need to ramp up as much, and more of its resources sit idle most of the time, using little to no power.
 
I don't think that bottlenecking by the CPU is going to cause the GPU to use 50% less power, though.

If you want to check it, invest in an outlet power meter and measure before and after upgrading the system. Unfortunately, unless you use the same motherboard, you won't have a clean comparison.
 

The lumped approximation of dynamic power in CMOS circuitry is P = C · V² · f. If you run the chip at 50% load, you roughly halve some combination of C (the lumped capacitance of the active circuitry) and f (frequency), along with some degree of voltage reduction, since less voltage is required for stable operation at lower frequencies.
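To see why the squared voltage term matters, here is a minimal Python sketch of that formula. All of the numbers (normalized capacitance, voltages, frequencies) are made-up illustrative values, not measurements from any real GPU:

```python
# Lumped dynamic CMOS power: P = C * V^2 * f.
# Values below are illustrative assumptions, not real GPU figures.

def dynamic_power(c, v, f):
    """Dynamic power of a CMOS circuit (arbitrary units)."""
    return c * v**2 * f

# Full load: normalized capacitance, 1.00 V core, 1.80 GHz.
P_full = dynamic_power(c=1.0, v=1.00, f=1.80)

# Light load: clocks drop to ~60% and the voltage can come down too.
P_half = dynamic_power(c=1.0, v=0.85, f=1.10)

print(f"full load: {P_full:.2f}")
print(f"light load: {P_half:.2f}")
print(f"ratio: {P_half / P_full:.0%}")  # under 50%, since V enters squared
```

The point of the sketch: because voltage enters squared, even a modest voltage drop alongside a frequency drop pushes power down faster than load drops.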

From 100% load to 50% load, the power draw can drop by significantly more than 50%, especially on higher-end GPUs, where quiescent (idle) power is a much smaller chunk of board power than on entry-level ones.
 
But the problem is the processor. The AMD 1100T (6 cores, 3.3GHz base, 3.7GHz turbo) isn't going to bottleneck that much. Plus it can be overclocked.

It is on par with the 6-core i5-8400 in performance. Not my choice of the best processor, but it gets a lot of recommendations at Tom's. If this processor is so bad, then so is the i5-8400.
What? The 1100T is nowhere near the 8400! The 8400 has about double the performance!
 


Not from the stated operating frequencies. The base frequency of the i5-8400 is 2.8GHz; the base frequency of the 1100T is 3.3GHz. The turbo frequency of the i5-8400 is 4.0GHz; the turbo frequency of the 1100T is 3.7GHz. And the 1100T is overclockable: an overclock of only 0.3GHz would match the turbo frequency of the locked i5-8400.


And they are both 6 core processors.
 
The speed of the CPU is only a small part of the performance; it just says how many clock cycles per second the chip can do. How much work gets done each clock cycle is more important, and the 8400 performs a lot more work per cycle than the 1100T, by a long shot.
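That argument can be captured in a toy model: performance is roughly clock speed times work per clock. The IPC values below are invented purely for illustration, not measured figures for either chip:

```python
# Toy model: performance ~ frequency * work-per-clock (IPC).
# Both IPC values are assumptions for illustration only.

def perf(freq_ghz, ipc):
    """Relative throughput: clock rate times work done per cycle."""
    return freq_ghz * ipc

old_cpu = perf(freq_ghz=3.7, ipc=1.0)  # higher clock, assumed low IPC
new_cpu = perf(freq_ghz=2.8, ipc=2.0)  # lower clock, assumed double IPC

# The lower-clocked chip still comes out ahead.
print(f"relative performance: {new_cpu / old_cpu:.2f}x")
```

Under these made-up numbers the lower-clocked chip is roughly 1.5× faster, which is why comparing GHz across architectures is misleading.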

Same reason the old AMD FX series overclocked to 5GHz+ still couldn't touch what Intel had out at much lower frequencies and power draw: the work per cycle was easily double what the FX CPU had going for it, and that gap only got worse as it aged and Intel dropped new generations of CPUs.
 


Are the new processors more efficient? Sure they are.
 

Try it, you may be surprised. Use a hardware monitoring tool to see if your motherboard's VRM reports CPU power draw. You will most likely find that CPU power usage increases considerably with load. On my PC, 10% CPU load draws about 10W, 50% draws around 25W, and 100% CPU load is around 50W. That's 2.5× the power for 5× the load from idle to 50%, and double the power for double the load from 50% to full.
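A few lines of Python make the shape of that curve explicit. The wattages are the example figures quoted above from one machine; yours will differ:

```python
# Load fraction -> watts, using the example figures from the post above.
points = {0.10: 10, 0.50: 25, 1.00: 50}

# Marginal cost: watts per unit of load between successive points.
loads = sorted(points)
for lo, hi in zip(loads, loads[1:]):
    slope = (points[hi] - points[lo]) / (hi - lo)
    print(f"{lo:.0%} -> {hi:.0%} load: {slope:.1f} W per unit of load")
```

The marginal watts-per-load climbs from the low range to the high range, i.e. the last chunk of performance is the most expensive in power, which is the point being made.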

GPUs are fundamentally graphics-oriented processors, so their power scaling will be similar: much lower incremental power per incremental load at light to medium load, then increasingly steep extra power per extra bit of performance beyond that.
 


Frequency and core count do not tell you anything when comparing two different CPU architectures. Do you really think we haven't moved on in performance in the last 10 years? Look up some benchmarks.

With respect, if you don't understand this and have no knowledge of relative CPU performance, you shouldn't be commenting on it. You'll just be spreading misinformation, which is not the idea of these forums.
 


Thanks for your opinion. I've read quite a few reviews of CPUs, and they do comparisons with different games. But there is another thing they report every time: the operating frequencies. And yes, they do compare operating frequencies.
 


Believe me, the operating frequency between two CPUs of different architectures is not a measure of performance.

You could force-clock an 8400 to 2.0GHz and it would still outperform the old 1100T.
 


And I'm sick of the argument.
 

Benchmarks don't compare frequencies; they compare the overall throughput of the CPU (or GPU) on a given workload under a given set of operating conditions. Reviews include clock frequencies simply because they are among the operating-condition variables that need to be known for reference. If a review achieves 1500 points in bogoMark with an i5-8600K and you only get 1200 points on your own i5-8600K, you want to compare the review's operating conditions with yours to figure out where the 300-point difference comes from.

As Rob wrote, clock frequency is only part of the equation; instruction-level parallelism is the other major part. AMD's CPUs prior to Ryzen are at a substantial IPC (instructions per clock) disadvantage against Intel's Core i generations, which is why the 5GHz FX-9590 gets destroyed by Intel's 3.xGHz CPUs in most games. Intel still has the IPC lead over AMD's Ryzen in most cases, but it's close enough that it doesn't matter much to most people.