Nvidia Optimus Allows 'Hot' Removal of GPU


rizky_pratama

Distinguished
Jun 24, 2008
19
0
18,510
[citation]That's cool that you could take out the GPU with the system running, but what practical use could that be used for?[/citation]

I guess now you can swap your GPU with your Grandma's GPU
 

FloKid

Distinguished
Aug 2, 2006
416
0
18,780
Heh cool. Now laptops should have a little latch that lets you open it and pull out the GPU without the need to open the whole case. Wish I had $1 for every laptop sold like that :)
 
G

Guest

Guest
Really awesome. Perhaps one day...they'll have a GPU "interface" on the laptop where one can unplug and upgrade the GPU. I have a laptop right now; it was great a year ago when I bought it...now...because of the graphics card, this laptop is just...mediocre.
 

mkrijt

Distinguished
Oct 28, 2009
79
0
18,630
They should bring this to the desktop. Let's say the NV Fermi has a TDP of 280 watts and 1 kWh costs about 15 cents. At best that would save you 0.28 kW x 24 hours a day x 365 days a year = roughly 2,453 kWh per year, which works out to about $368 per year. Even if the real savings were only half that, it would still be nice. On the energy savings alone you could buy a new video card every 3 years :)
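Here's a quick back-of-the-envelope check of that math, as a minimal sketch in Python. The 280 W TDP and 15 cents/kWh are just the assumptions above, and it takes the best case where the card would otherwise sit at full TDP around the clock, not a measured figure:

[code]
# Back-of-the-envelope check of the savings claim above.
# Assumptions (from the comment, not measurements): the GPU would otherwise
# draw its full 280 W TDP 24/7, and electricity costs $0.15 per kWh.
tdp_kw = 0.280             # 280 W expressed in kilowatts
price_per_kwh = 0.15       # dollars per kWh
hours_per_year = 24 * 365  # 8,760 hours

energy_saved_kwh = tdp_kw * hours_per_year        # ~2,453 kWh per year
dollars_saved = energy_saved_kwh * price_per_kwh  # ~$368 per year

print(f"{energy_saved_kwh:.0f} kWh/year saved, about ${dollars_saved:.0f}/year")
[/code]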
 

rizky_pratama

Distinguished
Jun 24, 2008
19
0
18,510
By the look of it, Fermi will be hot, noisy, and expensive (I hope I am wrong), but I know it will be fast. How long do you guys think it will take for a "mainstream" Fermi to hit the market?
 

ewood

Distinguished
Mar 6, 2009
463
0
18,810
[citation][nom]ohim[/nom]another BS to put as a sticker on your video card when you buy it, i see no reason at all to remove my video card when my PC is running.[/citation]

Seriously? You think that is the point of the demo? They are showing that the GPU is 100% powered down when not in use. This is important because a part that is turned off draws no power at all. Most GPUs just go into a low-power state; they never really turn off.
 

leo2kp

Distinguished
Ummm, how about unplugging it, running a graphical application (to let it fail), then plugging it back in? To me that's just like shutting off an external drive and unplugging it. The computer may still have that device initialized (or whatever), so unplugging it is no different from turning it off. I'd also like to see the adapter unplugged, the machine rebooted, and then the adapter plugged back in. Decent technology either way. Very nice.
 

830hobbes

Distinguished
May 30, 2009
103
0
18,680
[citation][nom]jay236[/nom]Wouldn't this shorten the lifespan of the GPU because of the added cooling/heating cycles?[/citation]
Thermal cycling is less and less of a consideration as manufacturing techniques get better. The more solder there is, the more expansion. Thermal cycling just messes up connections, so the better the connections, the less thermal cycling matters.

Also, it's cool that it saves power by turning off the GPU, but I can't really see a situation where I'd need to hot-swap my graphics card.
 
Now this is cool :eek: and would be great for small-form-factor PCs and HTPCs. One thing I would like, though, is an external enclosure that can power and operate a modern GPU of at least the 105 W class, say a G92, while still providing 8x PCIe. Why, one might ask? Well, the uses for that are plenty, even if it's just for CUDA and PhysX.
 


We have had this for a while, though it's up to the manufacturer to use the slot (MXM). Also keep in mind there are many versions of this slot that are not compatible with each other physically, electrically, or thermally (cooling).

So it's not like moving desktop cards around, though I would love it if laptop manufacturers settled on a common laptop design, thermally and structurally, so we could do this more easily.
 
G

Guest

Guest
Probably Nvidia's response to Intel's Atom N450 CPU with its integrated GPU.
Seems like Nvidia is investing in software now that it sees it's been outbid by two other companies.
 