Super-Size Me: Gigabyte's G1.Assassin In An XL-ATX Monster

Intel is in a weird place right now with 1366 vs 1155. It's not always 100% clear which is better, but Sandy Bridge is obviously the better value. My hopeful prediction is a new LGA 1365
 
[citation][nom]jednx01[/nom]Personally, I've never seen the point of 4 video cards... That seems like total overkill to me. Still, nice looking board.[/citation]
I've never seen a point of even just 2 video cards...but that's just me. My GTX 460 SE 1GB plays anything so far.
 
[citation][nom]falchard[/nom]That bottom PCI-e x16 slot is like an appendix on this board. There aren't enough lanes for it to be of any use. However, the amount of motherboards with proper configuration for the use of all 4 is a small pool.The thing I like about this board is no Realtek. There is a more sophisticated network adapter and a better sound chip. I don't like Sound Blaster, but I like them more then Realtek.[/citation]

I dunno, at least Realtek supports their chips with proper drivers that don't cause BSODs on random systems or every other issue you can think of.
 
[citation][nom]jednx01[/nom]Personally, I've never seen the point of 4 video cards... That seems like total overkill to me. Still, nice looking board.[/citation]

Not quite in this league, but I used to have 4 graphics processors in my rig!
A 4870 X2 and a regular 4870 in CrossFire, and a 5450 separately for projector output. I ditched the extra 4870 later in favor of better cooling for the X2, but still - that's four GPUs in one rig. Just not quite the performance of this, and not all in CrossFire.
 
[citation][nom]JOSHSKORN[/nom]I've never seen a point of even just 2 video cards...but that's just me. My GTX 460 SE 1GB plays anything so far.[/citation]

Try a multi-monitor setup and you'll see the point. Until the day CrossFire works in windowed mode, though, the use of multiple cards is somewhat limited in a real-world scenario, but it boosts fullscreen performance enough to warrant the extra cost.
 
[citation][nom]barmaley[/nom]Is it me or this case really does not have enough openings for all the expansion slots on the motherboard? Look for yourself pic#2 - there is no opening if you were to install a video card in the bottom green slot. The next one up (black one) also looks like it's not lining up with any card opening either.[/citation]

That's because they state that the case they originally intended to use couldn't fit the motherboard. Pics #2 and #18 show different cases:

"...we planned to showcase the premium board in Cooler Master's Cosmos S enclosure. I admit my surprise when the motherboard didn’t fit in this large EATX case—the G1.Assassin requires the XL-ATX form factor. Note how the board bleeds into space required by the power supply!"
 
"So much else about this build is ludicrously overpowered that we thought Kingston’s HyperX T1 Black memory modules would be a perfect fit. Three 3 GB sticks of the stuff results in a a total 12 GB of DDR3 RAM running at 1600 MT/s. Note the colossal heat sinks (like you didn’t already)."

Unless you smuggled an extra 3 GB of memory in somewhere, I think your maths has a problem with the above statement :)
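
For reference, the arithmetic (taking the quoted figures at face value):

3 sticks × 3 GB = 9 GB, not 12 GB
3 sticks × 4 GB = 12 GB

So the article presumably meant 4 GB modules, which would also fit X58's triple-channel memory layout.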
 
Quick Boost front panel eh? Better not be like the Turbo buttons of old...

Honestly, the only reason you'd run more than two cards is if you need GPGPU support or better performance with 3D and/or Eyefinity/multi-display setups. I don't see the reason for the dedicated 1 GB of DDR2 RAM for the "NPU." Still, that is a hawt looking mobo. Toss me the mag heatsink and I'd be happy enough.
 
[citation][nom]Maximus_Delta[/nom]Hard core gamers out their will love this stacked with 4 graphics cards to show off at the local LAN party. Imagine the electricity bills tho... 600-700 watts at the wall? 4-5 hours gaming sessions each night?[/citation]
And then when they plug it in and blow a fuse, everyone will let out a huge sigh when the power goes out, and people will hate them.
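
To put a rough number on those electricity bills (assuming ~700 W at the wall, 4-5 hours a night, and an illustrative rate of $0.12 per kWh - your local tariff will vary):

0.7 kW × 4.5 h/night × 30 nights ≈ 95 kWh per month
95 kWh × $0.12/kWh ≈ $11 per month

Annoying, but the blown fuse is probably the bigger problem.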
 
If those are 1 GB video cards, you need to replace them with 2 GB models, because I know this system will be used with Eyefinity, and the cards need the extra RAM to show their full powah!
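
As a rough sanity check on the numbers (assuming a 5760x1080 Eyefinity surface at 32-bit colour):

5760 × 1080 pixels × 4 bytes ≈ 24 MB per frame

The frame buffer itself is tiny; it's the high-resolution textures and anti-aliasing buffers you'd run at that resolution that push 1 GB cards over the edge.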
 
[citation][nom]The Greater Good[/nom]For the love of GOD, MAN!!! That's a magazine, NOT A CLIP. A clip is for something like the SKS or the M1 Garand.[/citation]
call it a clip or mag whatever you want....it's ugly
 
[citation][nom]nebun[/nom]call it a clip or mag whatever you want....it's ugly[/citation]Beauty is in the eye of the beholder and all that :)

 
[citation][nom]rohitbaran[/nom]+1[/citation]

Not +1 but +100000000000

[citation][nom]Article[/nom]The quartet of Radeon HD 5870s we had on hand for testing doesn't scale as well as the newer models (when they're running correctly)[/citation]

Of course they're not stable. Did you notice that all these cards throw their hot air inside the case? So basically, even if a single 5870 doesn't produce a lot of heat, the backs of three of these cards will be heated by the air coming from the card above them.
 
[citation][nom]red1776[/nom]Not if you are running multi-screen resolutions. That is a common misconception. You might do some reading on PCIE scaling (it can be found right here at Tom's). I have proof of this sitting on my desk, and have built them for customers as well. The 4th slot on my Asus C4F is an x4, occupied by a 4th Asus EAH DirectCu 5850 (before you say it, I use a PCIE riser pass-thru), and it adds frames nicely to shader-heavy games running at 5760 x 1080. Have a look at the scaling down to a single lane here: http://www.techpowerup.com/reviews [...] ng/25.html From Tom's: http://www.tomshardware.co.uk/pcie [...] 964-4.html Far from an "appendix": http://www.techspot.com/gallery/me [...] -high.html http://www.techspot.com/gallery/me [...] -high.html[/citation]

Guys, the four slots on this board will run in x8/x8/x8/x8 mode when they're all populated. Also, all four of them are PCI-E 2.0 capable. You want proof? Here it is:

http://www.gigabyte.com/products/product-page.aspx?pid=3752#sp

If you think they're wrong, you can download the manual and see for yourself
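
For anyone wondering what x8 actually buys you, the per-slot bandwidth works out as follows (PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding):

5 GT/s × 8/10 = 4 Gbit/s = 500 MB/s per lane, each direction
8 lanes × 500 MB/s = 4 GB/s per x8 slot, each direction

That's plenty for a single GPU of this era, which is why the scaling articles linked above show only small losses at x8.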
 
Replace that trying-too-hard-to-be-cool magazine heatsink with a Hello Kitty one or something equally silly and I'd buy it just for the WTF factor.
 
This board is for people with a lot more money than sense. The disclaimer on the heat sink shows that Gigabyte is aware of that fact.
All that just to play games is a phenomenal waste. It doesn't need the bling, but a board like this only begins to make sense for those using CUDA/GPGPU apps to do real work.
 
Why such a large board? My motherboard supports four cards too, and it's a normal size. I really don't see the point in making it so big.
 