Nvidia Primes Optimus Mobile Graphics Tech


anamaniac

Distinguished
Jan 7, 2009
2,447
0
19,790
Would love to have both a GMA 4500HD and a Mobility Radeon 5870, with the system auto-detecting which one to use.
Now we just need affordable laptops with dual 5870Ms. =D
Also, this would be great with a ULV dual-core i7.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
This has been the goal since Hybrid SLI came out. It was quite obvious at the time that this sort of GPU switching was incomplete without the driver's ability to perform the switch automatically. I definitely agree that the manual implementation was ridiculous, especially when you had to do it on a regular basis (i.e. when playing games).

I just can't believe how long it took to come out. I guess it's a lot more difficult to implement than it appears (especially since no one else has done it yet).
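For what it's worth, the "automatic" part is conceptually simple: the driver watches which applications launch, matches them against a profile list, and routes rendering to the discrete GPU only on a match. Here's a minimal Python sketch of that policy - the profile list and the process-scanning approach are entirely hypothetical stand-ins, since the real Optimus logic lives inside the driver and also inspects DirectX/DXVA/CUDA calls:

[code]
# Illustrative sketch only -- not Nvidia's actual driver code.
# Assumes a hypothetical profile list; process names come from psutil.
import psutil

# Hypothetical profiles: apps that should get the discrete GPU.
DISCRETE_GPU_PROFILES = {"game.exe", "cad.exe", "encoder.exe"}

def pick_gpu() -> str:
    """Return which GPU should render, based on running processes."""
    for proc in psutil.process_iter(attrs=["name"]):
        name = (proc.info.get("name") or "").lower()
        if name in DISCRETE_GPU_PROFILES:
            return "discrete"    # wake the GeForce and hand it the frames
    return "integrated"          # otherwise stay on the power-sipping IGP

if __name__ == "__main__":
    print(pick_gpu())
[/code]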
 
Look at this notebook, please: the Asus K40AB.

I tested it for my nephew, and it switched graphics on the fly with 3D apps running in the background, with no problems and no restart. I installed his MSDNAA copy of Win7 and it worked flawlessly.

That's a friggin' sweet notebook for the money it costs.

Cheers!
 
Oh yeah... and where's Fermi? lol

Anyway, switchable on-motherboard graphics don't attract me that much. I'd rather they developed attachable graphics (docking stations at least), so you can have a "gaming laptop" when you want or need it. Give us that tech, not hot-swapping video cards inside the case!

Cheers!
 

necronic

Distinguished
May 22, 2009
109
0
18,680
[citation][nom]r3t4rd[/nom]Explain Nvidia's "rebranding" then. All Nvidia did was slap a new driver name (software) and updated firmware on their old hardware. Or, what's the difference between ATI's/Nvidia's top consumer video cards and their top professional workstation video cards? Nothing but firmware and software on the same piece of hardware. So to answer your question: it's being done all the time.[/citation]

I can't speak to the newest lines of cards, but for older stuff (roughly two years and older) that is completely untrue, and another example of having a limited understanding of these systems while having read a lot of what other people have said.

And here are the major hardware differences (so this isn't an OPINION):

1) QC - Pro cards undergo far more rigorous QC to withstand far more intense operating conditions. That may not sound like a big deal, but it is, and this in and of itself could arguably triple the cost (see the back-of-envelope sketch at the end of this post). For instance, with a gaming card it's OK if 1 out of 10 fail: people get angry, you send them a new card, no biggie. Now, if that failure rate happened at a geological imaging company, or an architecture firm right near a deadline... they would never buy your product again.

2) Features - there are too many, and they are far too broad to list, but things like hardware/architecture-accelerated anti-aliasing, clip planes, color depth, vertex shaders, etc.

3) Board layout - the board itself is laid out completely differently in the two cards. That ties into 2 (and probably 1), but since it's flat-out visible I thought it was a point worth making.

Anyways, you're wrong. I don't care what 'teh 1337' hackers say: these are not the same card, just "rebranded".
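To put rough numbers on the QC point - these figures are mine and purely illustrative, not necronic's: if every failed unit has to be replaced (and replacements can fail too), the expected number of units shipped per sale is 1/(1 - failure rate), and each failure burns RMA overhead on top:

[code]
# Back-of-envelope arithmetic with made-up numbers, just to show the shape
# of the argument: failure rate and QC spend both feed the sale price.

def expected_cost_per_sale(build_cost, failure_rate, rma_cost):
    # Replacements can fail too, so expected units per sale = 1 / (1 - f).
    expected_units = 1.0 / (1.0 - failure_rate)
    expected_failures = expected_units - 1.0
    return build_cost * expected_units + rma_cost * expected_failures

# Consumer card: cheaper QC, 1-in-10 failures tolerated.
print(expected_cost_per_sale(build_cost=150.0, failure_rate=0.10, rma_cost=50.0))   # ~172

# Pro card: burn-in and validation roughly triple the build cost,
# but failures (and the contracts they'd torpedo) become rare.
print(expected_cost_per_sale(build_cost=450.0, failure_rate=0.005, rma_cost=50.0))  # ~453
[/code]

The replacement math alone doesn't triple anything; the tripling comes from the QC spend itself, which pro customers pay for precisely because a missed deadline costs them far more than the card.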
 

r3t4rd

Distinguished
Aug 13, 2009
274
0
18,780
[citation][nom]necronic[/nom]I can't speak to the newest lines of cards... they are not the same card and just "rebranded".[/citation]

Okay, I am no GPU genius and couldn't care less about how it all works to make them. I understand enough about GPUs to know what is at stake. That said, let's move on.

First, I did not say the older generations of GPUs; I was referring to the newer ones. The architectures of ATI, Nvidia, and the others are different enough that getting Optimus to work on any GPU would require a lot of work - so, agreed. But I don't doubt it could be made to work on any GPU. Secondly, my sarcasm just seems to go right over people's heads sometimes.

My whole point: Nvidia just seems to be beating around the bush with Fermi, talking up a storm about new things to come, and this again is similar to PhysX - not by design, but by marketing implementation.

That is what was meant.
 

alextheblue

"Unfortunately, it'll likely be difficult to gauge the benefits of Optimus on the UL50Vf, as the system includes a 1.3 GHz Core 2 Duo SU7300 processor and GeForce G210M GPU--not quite what we'd consider a platform in need of much more 3D muscle than integrated graphics would otherwise deliver."

No joke! A 210M is roughly on par with a 9500M, which itself is considerably slower than the desktop 9500 GT. This would have been better paired with a GT 240M or a Mobility Radeon 5165 (a slightly overclocked Mobility 4650). Then again, the CPU may be too slow, depending on the type of game, especially if you plan on keeping the laptop for a couple of years.

It is still a good idea, if the automatic switching works as well as advertised.
 

masterjaw

Distinguished
Jun 4, 2009
1,159
0
19,360
This is a good idea. But where portable gaming systems are concerned, I would still prefer the docking-station concept, since it gives you the choice of which GPU to use and lets you easily replace it with a more powerful one without having to overhaul or buy a new machine.
 

razorblaze42

Distinguished
Jun 2, 2009
150
0
18,680
[citation][nom]brett1042002[/nom]Where is fermi...[/citation]
Believe it or not, this is bigger and better news for Nvidia than Fermi. Intel leads both Nvidia and ATI by selling the most popular form of computer graphics: IGPs. Nvidia plans to use Optimus to piggyback off Intel's huge success with IGPs - it significantly improves Intel integrated graphics relative to what ATI offers, and it's clearly a move to gain favor with Intel and replace the lost chipset revenue. Brilliant!! Oh, and for those thinking of this as a traditional form of SLI, you're a little off: these work together with IGPs like Clarkdale's.
 

r3t4rd

Distinguished
Aug 13, 2009
274
0
18,780
[citation][nom]masterjaw[/nom]This is a good idea. But I would still prefer the concept of dock stations...[/citation]
If I remember correctly, 3dfx back in the day had something to this effect. That technology... was then, guess what, bought up by Nvidia. And it took them this long to refine it? That is why I think Optimus could still be made to work with other IGP chipsets, not just Nvidia's. Again... it's another Nvidia marketing ploy, since they are losing in this department. And no, I don't hate Nvidia; rather, Nvidia is becoming the new Blizzard that irks me.
 