MSI Big Bang Fuzion: Pulling The Covers Off Of Lucid’s Hydra Tech

Nice article. It's very good for users who are upgrading, because with current SLI/CF you need two identical cards, but with Lucid you can use different cards as well. It still needs to be better optimized and has a long way ahead of it, but it looks very promising.
 

Von_Matrices

Distinguished
Jun 1, 2006
I'm highly doubtful of the Steam hardware survey. I think it underestimates the number of multi-GPU systems. I, for one, am running 4850 CrossFire, and Steam has never detected a multi-GPU system when I was asked to take the hardware survey. The 90% NVIDIA SLI figure also seems a little too high to me.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
[citation][nom]Bluescreendeath[/nom]The CPU scores for the 3DMark Vantage tests are way off. You need to turn off PhysX when benchmarking the CPU or it will skew the results...[/citation]

It's explained in the analysis ;-)
 

kravmaga

Distinguished
Dec 10, 2009
"But when you spend $350 on a motherboard, you’re using graphics cards that cost more than that. If you’re not, you aren’t doing it right"

Quoted from the last page; I disagree with that statement.
There are plenty of people in situations where using this board is the better investment in performance per dollar. This is all the more relevant since this technology will undoubtedly find its way into cheaper boards and budget-oriented setups, where it will make all the sense in the world to bench it with mid-range value parts.

I, for one, would have liked to see what GTX 260s and 5770s would look like in this same setup. As is, this review leaves many questions unanswered.
 

SpadeM

Distinguished
Apr 13, 2009
Well, the review does give an answer, in the form of: it's better to run an ATI card for rendering and an NVIDIA card for physics and CUDA (if you're into transcoding/accelerating with CoreAVC, etc.), with Windows 7 installed.
Or at least that's the conclusion I'm comfortable with at the moment.
 

HalfHuman

Distinguished
Feb 13, 2006
I also take issue with the assumption that a person who buys this board will necessarily go for the highest-priced video cards. Maybe some will, but not all; there will be more who will try to keep using their older cards.

I also understand why you paired the 5870 with NVIDIA's greatest. There is a catch, however... the Lucid guys did not get much chance to play with the 5xxx series, so you may be evaluating something that is not quite ripe. I guess the 4xxx series would have been a better way to see how well the technology works. Couple that with games that are not yet certified for Lucid, and with how much complexity this technology has to overcome... I think this is a magnificent accomplishment on the Lucid team's part.

I also think that for this technology to become viable, it will have to come down in price and appear on much cheaper boards. For the moment, the "experimenting phase" is being done at the expensive end of the spectrum. I saw some early comparisons and the scaling was beautiful. I know the system was put together by Lucid... but that is fine, since it was only a demo to show that the tech works. Judging by how fast these guys are evolving, I guess they will go mainstream this year.
 

Andraxxus

Distinguished
Jun 1, 2009
I hope that the guys at Lucid get the chance to continue with this wonderful technology. Not long ago, mixing ATI with NVIDIA was unthinkable, and many people asked on forums whether they could CrossFire or SLI mixed boards. So I think this is something that should have the support of the people who buy GPUs, so we can end this proprietary-technology farce (see PhysX). I'm not saying PhysX is bad, but the restrictions are.

In the end, I just hope they won't be bought out by a rich so-called "competitor" that kills the product so it can keep sucking money from buyers for minor improvements or rebranding.
 

juanc

Distinguished
Nov 18, 2009
I think this will really pay off if someone develops a driver that can "get the most out of each card" by rendering with each card's best features: for example, render the 3D scene with the GeForce, then apply the AA and the colouring with the ATI card. Balancing using what's best on each card.

Then I'd get one middle-of-the-pack ATI card and one middle-of-the-pack NVIDIA card, run what runs best on each, and combine the best features of both.
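
Something like this toy dispatcher is what I have in mind. It's just a sketch in Python; the cards, task names, and relative scores are made-up numbers, not anything Lucid's driver actually does:

[code]
# Toy "best card per task" dispatcher. Illustrative only: the score
# table is invented, and a real driver would have to measure this.
TASK_SCORES = {
    "geometry":     {"GeForce GTX 285": 1.0, "Radeon HD 5870": 0.9},
    "antialiasing": {"GeForce GTX 285": 0.8, "Radeon HD 5870": 1.1},
    "post_fx":      {"GeForce GTX 285": 0.9, "Radeon HD 5870": 1.0},
}

def assign_tasks(tasks, cards):
    """Send each render task to whichever installed card scores best at it."""
    return {
        task: max(cards, key=lambda card: TASK_SCORES[task].get(card, 0.0))
        for task in tasks
    }

cards = ["GeForce GTX 285", "Radeon HD 5870"]
print(assign_tasks(["geometry", "antialiasing", "post_fx"], cards))
# geometry goes to the GeForce; antialiasing and post_fx go to the Radeon
[/code]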
 
Nice review, Mr. Chris, sharp as usual.

I agree with zipzoomflyhigh, but this chip has a lot of potential. It needs some polishing, or help from ATI and NVIDIA, to make it better. If ATI and NVIDIA somehow got behind Hydra, it could boost their sales by freeing them from being "platform bound" and by licensing their multi-GPU tech to third parties. I can dream a little, right? lol.

Anyway, very good news, and hopefully neither NVIDIA nor ATI will bully this tech.

Cheers!
 

thackstonns

Distinguished
Dec 4, 2008
Here is why I like this technology: I can keep my 4870 and upgrade to a multi-card system without having to buy two more graphics cards. So I could add a 5870 and, instead of moving the 4870 to a different computer, keep it. Here is where I have a problem, though: physics will suck because of NVIDIA's restrictions. I'll have a hell of a system that runs Crysis and looks good, but since the rest of the games are console ports, I'll be wasting money to play crap-quality games.
 

noob2222

Distinguished
Nov 19, 2007
Nice read, but I question the actual usable titles with this Hydra. Testing with games that aren't supported doesn't show what the board can do; it only shows what it can't.
Using 5/6 titles that aren't officially supported makes this board and technology look like an epic failure. It would be nice to know what it does when a game is actually supposed to work, or what happens when the drivers allow these games to work in the future.
 

TeraMedia

Distinguished
Jan 26, 2006
The problem I have with the product is that they are essentially replacing the GPU obsolescence schedule with the chipset obsolescence schedule. And their platform choice makes this particularly bad, because while AMD makes an effort to keep its sockets backward-compatible, Intel seems to do the opposite. In fact, Intel now seems hell-bent on segmenting the platform space as much as possible while constraining the product lifecycle as much as possible. Want to reuse your C2Q or upgrade to a six-core (Gulftown, is it?) CPU on this motherboard? Good luck with that. With socket 1156, Intel has effectively forced you to buy a new mid-range CPU and constrained you to the mid-range market. If past behavior is any predictor of future behavior, I fully expect the next major generation of Intel CPUs (e.g., 3+ years out) not to be compatible with 1156. How long do you think Intel will keep making advancements on 1156-compatible CPUs?

So, yes, you can mix GPUs from different generations and even from different vendors. But by the time it even makes sense to do that twice, you'll need to upgrade your whole motherboard to keep a balanced CPU-GPU system. If the X-mode, A-mode, and N-mode scaling were more seamless and effective on the latest hardware, and the cost were more in line with other socket 1156 motherboards, I could see this board making some sense. But given that you need to spend an extra $150+ for this board, I'd rather put that $150 toward the second card, or toward an upgraded card with a longer lifespan before obsolescence.
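
To put rough numbers on that trade-off (all prices here are illustrative guesses, not quotes):

[code]
# Back-of-the-envelope math with made-up round numbers.
fuzion_board = 350  # hypothetical price of the Big Bang Fuzion
plain_board  = 200  # hypothetical ordinary P55 board
premium      = fuzion_board - plain_board  # the ~$150 in question

gpu_budget   = 400
with_fuzion  = fuzion_board + gpu_budget            # 750 total
with_plain   = plain_board + gpu_budget + premium   # 750 total

print(with_fuzion, with_plain)  # same outlay either way, but the
# second build puts the $150 into the part that actually renders frames.
[/code]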
 

memeroot

Distinguished
Jan 7, 2010
Big fan of the concept, and $150 isn't too much for something a bit of fun...
However, it needs to be X58. And what is the overclocking ability of the board?
Also, does it have the same audio advantages?
 

xer0

Distinguished
Jan 7, 2010
So what happens when NVIDIA (which already has, with PhysX) or ATI decides to ship drivers (or even firmware) that look for a competitor's cards (or lower-end, same-manufacturer cards) and say, "Sorry, we're being douchebags and turning off functionality and performance features"?
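
In code terms, the worry looks something like this hypothetical sketch (not real NVIDIA/ATI driver logic, just the shape of the complaint):

[code]
# Hypothetical vendor lockout: disable a feature whenever a card
# from the other vendor is detected in the system.
def vendor(card):
    return "NVIDIA" if "GeForce" in card else "ATI"

def physx_allowed(installed_cards):
    # Refuse to run if any non-NVIDIA card is present.
    return all(vendor(c) == "NVIDIA" for c in installed_cards)

print(physx_allowed(["GeForce GTX 285"]))                    # True
print(physx_allowed(["Radeon HD 5870", "GeForce GTX 285"]))  # False
[/code]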
 

applejack

Distinguished
Jan 7, 2010
"There are also one or two situations where a pair of GeForce GTX 285s in N-mode outperforms conventional SLI. Surprisingly, DiRT2 is one of them."

False conclusion: DiRT 2 doesn't support SLI yet, so it's only reasonable that N-mode would outperform a SINGLE GTX 285.
 