[citation][nom]jay236[/nom]Wouldn't this shorten the lifespan of the GPU because of the added cooling/heating cycles?[/citation]
Depends on how hot the chip gets; if the temperature variance is small, it wouldn't matter too much.
Isn't dynamic clocking almost the same (at least for the energy-saving part)? I think it will delay for half a second or a second before it starts rendering something.
This would be great if you had four separate GPUs and three of them weren't in use, though. I'm not sure how stable it will be for overclockers...
[citation][nom]maxsp33d[/nom]I wonder if we'll ever reach a point when we can do this with any part of the system (i know SATA hdds are hot-swappable)[/citation]
Some server boards can hot-swap CPUs and RAM.
[citation][nom]Judguh[/nom]That's cool that you could take out the GPU with the system running, but what practical use could that be used for?[/citation]
Imo, you DON'T need to remove the GPU; I think Nvidia is just making a point by showing off the difference between idle mode and completely-off mode.
[citation][nom]Judguh[/nom]That's cool that you could take out the GPU with the system running, but what practical use could that be used for?[/citation]
It was just an example of how the technology completely shuts off the GPU, rendering it "disabled" in the system's eyes.
What's great about this technology is that it's mostly software-based. As long as you have an Nvidia GPU and an Intel IGP, it could even be implemented on the desktop platform to achieve lower idle power usage.
[citation][nom]Judguh[/nom]That's cool that you could take out the GPU with the system running, but what practical use could that be used for?[/citation]They are showing that it's fully off, not just clocked down to save power. So you get onboard-graphics power levels at idle and dedicated performance (and power draw) when needed.