Nvidia Granted Patent For Hybrid Graphics Systems

How about just making a decent integrated GPU that has intelligent power profiles via drivers? :)

Anyway, having dealt with Optimus hybrid drivers and related headaches, they can have all the patents on it they want--I'll never buy another computer that uses it.
 
[citation][nom]mhAMDy[/nom]Optimus is not supported on Linux. Companies like Nvidia, SiS and the like are effectively suffocating the Linux platform, releasing hardware technologies without proper support. I myself have given up on Linux for now because of this Optimus thing. This technology exists to save power, which matters most on laptops running on batteries. And because laptops don't run that much gaming, that's the strong point for Linux; I know a lot of people who run Windows machines now only to play games. So hardware support for this specific technology should be available for Linux... pffff![/citation]
Still can't watch Netflix on Linux 🙁
 
It would be cool if they could combine integrated and discrete graphics power, like a hybrid car that uses both electric and gas. I guess that's not possible... yet. Granted, integrated graphics wouldn't amount to much, but it would still be nice to be able to utilize it.
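It's at least imaginable at the driver level. Here's a rough sketch of what I mean (Python, purely illustrative; the class and function names are mine, not anything a real driver exposes), splitting a frame's rows between the two GPUs in proportion to their throughput:

[code]
# Toy illustration of splitting one frame between an IGPU and a DGPU.
# Everything here is hypothetical; real drivers do nothing this simple.

class Gpu:
    def __init__(self, name, relative_throughput):
        self.name = name
        self.relative_throughput = relative_throughput  # e.g. IGPU=1.0, DGPU=6.0

    def render_rows(self, first_row, last_row):
        # Stand-in for submitting a slice of the frame to this GPU.
        print(f"{self.name}: rendering rows {first_row}-{last_row}")

def render_frame(igpu, dgpu, frame_height):
    total = igpu.relative_throughput + dgpu.relative_throughput
    split = int(frame_height * igpu.relative_throughput / total)
    igpu.render_rows(0, split - 1)              # small slice for the weak GPU
    dgpu.render_rows(split, frame_height - 1)   # the rest goes to the fast one

render_frame(Gpu("IGPU", 1.0), Gpu("DGPU", 6.0), frame_height=1080)
[/code]

The hard part, of course, is compositing the two slices and keeping them in sync every frame, which is presumably why the small gain from a weak IGPU rarely seems worth it.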
 
I didn't read the actual patent text, but in the quote from the article, it mentions that
the graphics driver transmits the rendered images from the DGPU to the IGPU local memory and, then, to the IGPU DAC.

If that's how the actual patent is written, then they've already boxed themselves out of applying this patent to any non-analog display output. It couldn't be applied to a system using solely digital outputs (DVI-D, HDMI, or DisplayPort), only to systems where the display is driven through a VGA output from the IGPU. I'd also raise the question of whether an APU is considered to contain an IGPU or whether it counts as a different component; that depends on how the patent defines IGPU. So in the end, this patent may be essentially worthless before it ever sees a courtroom.
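For what it's worth, the copy path that quoted sentence describes reads roughly like this (Python-flavoured pseudocode; the classes and method names are my own invention, the patent only describes the flow):

[code]
# Rough pseudocode of the copy path described in the quoted patent text.
# All names are invented for illustration.

class Dgpu:
    def render(self, scene):
        return f"frame({scene})"          # image rendered in DGPU local memory

class Igpu:
    def __init__(self):
        self.local_memory = None

    def receive_frame(self, frame):
        self.local_memory = frame         # copy into IGPU local memory

    def scan_out_via_dac(self):
        # The claim sends the frame to the IGPU's DAC, i.e. an analog output.
        # A purely digital output (DVI-D/HDMI/DP) never touches a DAC,
        # which is the loophole discussed above.
        return f"analog signal for {self.local_memory}"

dgpu, igpu = Dgpu(), Igpu()
igpu.receive_frame(dgpu.render("scene"))   # driver copies the DGPU frame over
print(igpu.scan_out_via_dac())             # IGPU drives the VGA connector
[/code]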
 
[citation][nom]altriss[/nom]Yes, only on gamer computers... Have you ever heard of GPU acceleration? Like in hospital imaging systems? Or for research in high-energy particle physics? Or for simulation of physical systems in the automobile and aeronautical industries? Or for data acquisition in various areas? And so on... Maybe for people who know nothing about computers beyond COD and Angry Birds, DGPUs are rare products, but there is a whole world that needs computation power and thus DGPUs...[/citation]

But those things won't use Optimus; they aren't mobile.
 
[citation][nom]cuecuemore[/nom]Another patent granted, another blow to technological progress.[/citation]

Spoken like a good Communist 🙂

Patents allow companies to protect intellectual property and make money - it's a part of basic Capitalism.

Without patents, people would have less incentive to create new things, because someone would come along and steal their hard work...

I don't know what kind of utopian world you want to live in, but everyone patents things... For example, Cisco patented a routing algorithm, and a Chinese company decided to disregard the patent and copied it.

Source: http://www.todaysengineer.org/2007/Aug/protectingIP.asp

You would blame Cisco?
 
[citation][nom]verbalizer[/nom]it's not the number of patents, it's the quality of them that counts..[/citation]

I do agree with you, my friend. I just wanted to point out that when we talk about one more patent being granted, we should look at the bigger picture: Nvidia has 1,474 patents, AMD 9,820, Intel 21,713. My opinion is that one more patent granted to Nvidia isn't that much of a "blow to technological progress". Everybody needs to protect its intellectual property.
 
[citation][nom]madooo12[/nom]But those things won't use Optimus; they aren't mobile.[/citation]

In fact, yes, they will use similar scheduling.
I work on these systems, and I can assure you that power consumption is a huge challenge.
A heavy computation cluster is not at full load all the time, so saving a few MW by dynamically shutting down some DGPUs is a solution that is already being considered. But I don't think this patent will help or hinder that research, as the platforms are really different.
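To make it concrete, the policy is conceptually no more complicated than this (toy Python sketch; the power figures and thresholds are made-up round numbers, nothing vendor-specific):

[code]
# Toy policy: power down DGPUs that have been idle for a while.
# Power figures and thresholds are invented round numbers for illustration.

IDLE_SECONDS_BEFORE_OFF = 300
WATTS_PER_DGPU = 250

def plan_power_state(gpus):
    """gpus: list of (gpu_id, seconds_idle). Returns GPUs to switch off."""
    to_power_off = [gpu_id for gpu_id, idle in gpus
                    if idle >= IDLE_SECONDS_BEFORE_OFF]
    saved_watts = len(to_power_off) * WATTS_PER_DGPU
    return to_power_off, saved_watts

cluster = [(0, 10), (1, 900), (2, 3600), (3, 0)]
off, watts = plan_power_state(cluster)
print(f"Power off {off}, saving roughly {watts} W")
[/code]

Multiply that across thousands of boards and you quickly get into megawatt territory.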

[citation][nom]madjimms[/nom]Why do they call it "discreet" graphics? Nothing discreet about a standalone card. They should simply call it standalone.[/citation]
Because in electronics, "discrete" (note the spelling) is the antonym of integrated. It comes from mathematical language, where discrete means something whose limits you can clearly define. In an integrated system you cannot truly say "here the GPU ends and the CPU begins", as the borders are quite blurred. With a DGPU, however, you definitely can.
 
No wonder all these tech companies keep suing each other. They don't get patent grants until years after the technology has been on the market and everyone else has had a chance to copy it. Way to go broken patent system.
 
[citation][nom]cuecuemore[/nom]Another patent granted, another blow to technological progress.[/citation]

I don't see how, especially considering Nvidia was the first to implement it in a commercial product and has every right in the world to file it as their own invention.

Just because you don't agree doesn't mean they lose that right.
 