News PCIe 5.0 Power Connector Delivers Up To 600W For Next-Gen AMD, Nvidia GPUs

Status
Not open for further replies.
That, and Nvidia's software-based GPU scheduler (which they've been using since the GTX 600 series), as opposed to AMD, which still uses a hardware-based one.
That software scheduler has been shown to make CPUs work a little harder and thus increase the power they use, as well as increasing the probability of CPU core/thread-bound scenarios; many games are still bound by a single thread.
Users and reviewers would be deceiving themselves if they only looked at GPU power consumption.
This is partially true. NVIDIA only moved how instructions are packed into a warp (the basic unit of scheduling) to the CPU; the rest is still handled by the GPU. Considering how long they've been doing it and how deterministic executing those instructions is (at least according to NVIDIA's research), optimizing how instructions are packed into a warp likely has a minor impact at worst, and only the most potato of CPUs will see a benefit from going from NVIDIA to AMD.

Although there's the caveat that NVIDIA taxes the CPU more because, at some point, NVIDIA enabled multithreaded rendering driver-wide for DX11. So depending on when you're looking, you may be seeing the effects of that. DX12 is up in the air, because there you're supposed to do multithreaded rendering yourself to get the best performance.
 
"Well liquid cooling makes the heat go away from your PC to the ambient environment "

As does an air cooler.

The CPU produces the same wattage and heat. Liquid cooling just moves the transfer to a slightly different place, 6" away from the CPU, and may be slightly more efficient in doing so.
"The CPU produces the same wattage and heat" — you need to test this to claim such a thing, mostly because of how modern processors work: unless you lock the frequency, the frequency/voltage fluctuate depending on thermal headroom. If the processor is at a lower temperature, it can boost higher and thus consume more power. Plus, temperature in general affects the resistance of any electrical component, and on a circuit as complex as a processor you can't simply predict this; you need to test it.
Now, of course an air cooler also dissipates the heat to the environment (less efficiently), but as I said, you can't compare them so simply.
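The frequency/voltage interaction described above can be roughly illustrated with the classic dynamic-power relation P ≈ C·V²·f (a simplification — real CPUs also have temperature-dependent leakage current). The capacitance and operating points below are made-up illustrative numbers, not measurements of any real chip:

```python
# Rough sketch of why boosting higher costs disproportionate power.
# Dynamic CMOS power scales roughly as P = C * V^2 * f.
# C_EFF, the voltages, and the frequencies are hypothetical values
# chosen only to show the shape of the relationship.

def dynamic_power(c_eff, vcore, freq_hz):
    """Approximate dynamic power in watts: P = C * V^2 * f."""
    return c_eff * vcore ** 2 * freq_hz

C_EFF = 2e-8  # farads, hypothetical effective switched capacitance

# A cooler CPU with thermal headroom boosts higher AND needs more voltage.
p_base = dynamic_power(C_EFF, 1.20, 4.0e9)   # 4.0 GHz @ 1.20 V
p_boost = dynamic_power(C_EFF, 1.30, 4.5e9)  # 4.5 GHz @ 1.30 V

print(f"base:  {p_base:.1f} W")    # base:  115.2 W
print(f"boost: {p_boost:.1f} W")   # boost: 152.1 W
print(f"increase: {100 * (p_boost / p_base - 1):.0f}%")  # increase: 32%
```

A ~12% frequency bump needing ~8% more voltage costs about a third more power in this toy model, which is why a chip with more thermal headroom can end up drawing noticeably more.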
 
"The CPU produces the same wattage and heat" — you need to test this to claim such a thing, mostly because of how modern processors work: unless you lock the frequency, the frequency/voltage fluctuate depending on thermal headroom. If the processor is at a lower temperature, it can boost higher and thus consume more power.
As long as you use anything meaningfully better than the stock HSF, most CPUs will hit their stock multiplier and power limits before hitting maximum temperature, and won't put out significantly more or less power/heat beyond that. The CPU or GPU might call for 10 mV less Vcore while operating at 10°C lower core temperature, but at 100-250 A under full load, that is only a 1-2.5 W (~1%) difference.
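The back-of-the-envelope math in that last sentence is just ΔP = ΔV × I: a 10 mV Vcore reduction at the full-load currents stated above works out to roughly a watt or two. A quick check:

```python
# Power saved by a small Vcore drop at constant current: delta_P = delta_V * I.
delta_v = 0.010  # 10 mV lower Vcore

for current in (100, 250):  # amps, the full-load range from the post
    delta_p = delta_v * current
    print(f"{current} A: {delta_p:.1f} W saved")
# 100 A -> 1.0 W, 250 A -> 2.5 W: ~1% of a 100-250 W package.
```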
 