
Lucid Demonstrates XLR8 Frame Rate Boosting Technology

"dynamically scaling back quality"...didn't Carmack fail at this recently (Rage)?
 
Lucidlogix is making me love them even more now! I remember that, way back, it was just the concept of using different GPUs in tandem that intrigued me, but these practical efficiency technologies are really wowing me!
 
[citation][nom]fb39ca4[/nom]Oh look a lot of fancy marketing terms! So exciting!!![/citation]

Well, you gotta name your "children" something cool (though how cool it actually is would be up for discussion). 🙂
 
What's the news here? This is already being done in laptops. My wife's i5 has an Intel and an Nvidia GPU. Under light load it uses the Intel one, but when I load a 3D-intensive game the Nvidia one takes over.
 
[citation][nom]geminireaper[/nom]whats the news here? This is already being done in laptops. my wifes i5 has an intel and an nvidia gpu. Under light load it uses the intel but when I load a 3d intensive game the nvidia one takes over.[/citation]

You're thinking in a small box...

Yes, it's the same concept as what Nvidia has done, but a few things:

1. It's in a desktop.

2. In theory, this software could be used with any Intel motherboard (maybe AMD in the future) and ANY GPU combination once graphics card makers get on board with it (rough sketch below).
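Just to illustrate that point: on these boards the IGP and the discrete card both stay visible to the OS at the same time, and all the switching happens in software on top of that. Here's a rough Python sketch (my own illustration, nothing to do with Lucid's actual code, and assuming a Windows box where wmic is available) that simply lists the adapters Windows sees:

[code]
# Rough illustration only: list the display adapters Windows reports.
# On a Virtu-style setup both the Intel IGP and the discrete card should
# show up here, which is what lets software route work between them.
import subprocess

def list_video_controllers():
    # wmic ships with Windows; every non-empty line after the header is an adapter name
    out = subprocess.check_output(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        text=True,
    )
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

for name in list_video_controllers():
    print(name)  # e.g. "Intel(R) HD Graphics 4000" and "NVIDIA GeForce GTX 460"
[/code]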
 
[citation][nom]warmon6[/nom]Your in a small box.......Yes it's the same concept as what nvidia has done but a few things:1. it's in a desktop2. In theory, this software could be used with any intel motherboard (maybe AMD in the future) and ANY gpu combination once graphic card makers get on board with this.[/citation]

I fail to see why anyone would ever really care about this in a desktop setting. The only advantage I see is a small power savings in 2D mode where most discrete cards already have a fairly small consumption rate anyway. If it were even $50 a year I would be surprised. I didn't even see any difference in my already small electricity bill when I added SLI GTX 580's to my system. On battery power? sure that's great, plugged in to the wall? /shrug who cares.
 
Err... what's the news here?? I am currently using a Lucid-capable Z77 board running an i3 + GTX 460 with the monitor attached to the mobo HDMI output (driven by the i3). It's supposed to save power in i-mode... but according to my tests it does nothing. My system still sits at 70W+ whether in i-mode or d-mode (connected to the GTX 460) when idle or under low load.

But at least Virtual Vsync and HyperFormance seem to work. The only news I see here is some 'modification' done by Gigabyte on the mobo... does this mean all the current claims are false, since it won't save power and turn down the discrete GPU completely?? Does that mean only boards with whatever 'modifications' can shut down the discrete GPU?? I e-mailed Lucid but they never replied.
 
Just to add: yes, it can kick into d-mode or HyperFormance as long as I define the apps in the Virtu software to tell it which apps should use i-mode/d-mode/HyperFormance. Does this mean this new edition is intelligent enough to kick in by itself, without me specifying the specific apps?
 
[citation][nom]energy96[/nom]I fail to see why anyone would ever really care about this in a desktop setting. The only advantage I see is a small power savings in 2D mode where most discrete cards already have a fairly small consumption rate anyway. If it were even $50 a year I would be surprised. I didn't even see any difference in my already small electricity bill when I added SLI GTX 580's to my system. On battery power? sure that's great, plugged in to the wall? /shrug who cares.[/citation]


You don't get it. Normally you can't use QuickSync while a video card is in the PCIe slot; with this software you can switch between Ivy Bridge and any add-on card, and if you're using the add-on video card in a game, the software can also tap the onboard GPU for more FPS.
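To put that concretely: once the IGP is active alongside the add-on card, anything QuickSync-capable can do the transcoding while the discrete card does its own thing. Rough sketch (assuming an FFmpeg build with QSV support; the file names are just placeholders):

[code]
# Rough illustration: hand a transcode to Intel QuickSync via FFmpeg's h264_qsv encoder.
# Only works if ffmpeg was built with QSV support and the Intel IGP is enabled;
# otherwise the encoder simply isn't available.
import subprocess

def quicksync_transcode(src, dst):
    subprocess.run(
        ["ffmpeg", "-i", src,      # input file (placeholder name)
         "-c:v", "h264_qsv",       # H.264 encode on the IGP's fixed-function hardware
         dst],
        check=True,
    )

quicksync_transcode("input.mp4", "output.mp4")
[/code]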
 
[citation][nom]bucknutty[/nom]All that computer equipment and they cant afford a propper desk? Could you imagine trying to game at that computer station?[/citation]

It's called budget constraints. Lucid doesn't have Intel's financial health; otherwise they would be involved in much more ambitious projects.
 
[citation][nom]energy96[/nom]I fail to see why anyone would ever really care about this in a desktop setting. The only advantage I see is a small power savings in 2D mode where most discrete cards already have a fairly small consumption rate anyway. If it were even $50 a year I would be surprised. I didn't even see any difference in my already small electricity bill when I added SLI GTX 580's to my system. On battery power? sure that's great, plugged in to the wall? /shrug who cares.[/citation]

Switching off the discrete card seems to offer multiple benefits:
1. Less electricity used, which is the most obvious.
2. Lower system idle temperatures, I would think.
3. Possibly a longer lifespan for the video card and its fans. This is taking into account how many people use their PCs for plenty of non-graphically-intensive tasks, meaning a lot of time with electricity passing through the video card and its fan motors spinning.
4. Actually getting to use that, in many cases, unused IGP on Sandy and Ivy Bridge. This could be up for debate in terms of affecting CPU performance, though, as it could in some cases impair Turbo Boost due to temperature constraints, as I've read before.
5. Also, like techguy911 said above, being able to use QuickSync alongside a discrete GPU if you ever need it. I believe this was one of the original objectives (or advantages).
 