Gigabyte Shows Off An X99 Motherboard, Too

Status
Not open for further replies.

Shneiky

Distinguished
It is indeed catchy, but I don't like the gaming logo. I don't like the ROG logo either. I hope ASRock switches to pure black (instead of the dark brownish) like everyone else. Then it would be such a sexy board with those golden accents.
 

Treynolds416

Honorable
Jul 5, 2013
30
0
10,530
Those green caps on the audio turn me off. It's still a nice-looking board, but so far MSI's solution is the nicest looking, in my opinion.
 

red77star

Honorable
Oct 16, 2013
230
0
10,680
X99 + Haswell-E is the only combo to go with... everything else is blah. You pay a bit more upfront, but the upgrade path is well set, just like it was with X79, and you won't have to upgrade for years to come... it saves money in the long run if you are an enthusiast.
 

utengineer

Distinguished
Feb 11, 2010
169
0
18,680
Intel needs to open up the partner NDAs on X99 for this release. AMD has no competitive product to worry about, so there's no need to keep the technical specifications under wraps and prevent motherboard partners from leaking specifics. At this point, the first boards coming to market are already finalized for production and manufacturing.
 

ap3x

Distinguished
May 17, 2009
596
0
18,980
Where is my Asus Gene? Hurry up! Nice-looking board from Gigabyte, though. I am long overdue for a new machine, so I am super excited about X99 and Haswell-E.
 

Heironious

Honorable
Oct 18, 2012
687
0
11,360
Red is for gaming, yellow is for extreme overclocking, and blue is for everyday normal computers. That's the color scheme motherboard makers are now using to reduce confusion for consumers. So says Maximum PC.
 

mapesdhs

Distinguished
If the cap is that low, a lot of potential high-end users will not buy X99. 64 GB is not enough to cope with 4K/AE, etc. Expecting solo professionals to handle 300% higher workloads while at the same time reducing the max RAM per thread by 30% is a bad idea.

Ian.

 

Shneiky

Distinguished
Well, 4K in AE is possible within 64 GB; you just need to precomp from time to time. But anyway, who uses AE for 4K? Nuke and NukeX are the way to go. (J.J. Abrams, I am looking at you with your Video Copilot lens flares.)

Back to the topic - I am expecting the boards to support 128 GB when paired with a Xeon. I don't think Intel will allow the memory controller of Haswell-E to go beyond 64 GB; otherwise, they would cannibalize their Xeon sales. I hope I am wrong.
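The 64 GB vs. 128 GB figures being debated here come down to simple slot arithmetic. A back-of-the-envelope sketch, assuming an 8-DIMM-slot board (the slot count and module sizes below are illustrative assumptions, not confirmed specs):

```python
# Platform RAM ceiling: total capacity is just per-DIMM size
# multiplied by the number of populated slots.
DIMM_SLOTS = 8  # assumed slot count for a full-size X99 board

def max_ram_gb(gb_per_dimm, slots=DIMM_SLOTS):
    """Total RAM if every slot holds an identical DIMM."""
    return gb_per_dimm * slots

for dimm in (4, 8, 16):
    print(f"{dimm} GB DIMMs x {DIMM_SLOTS} slots = {max_ram_gb(dimm)} GB")
```

With 8 GB modules the platform tops out at 64 GB; 16 GB modules would be needed to reach the 128 GB people here are hoping for.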
 

DrBackwater

Honorable
Jun 10, 2013
362
0
10,810
Poor man's entry level to the Xeon market.

Haswell-E: 8 cores / 16 threads vs. Xeon: 16 cores / 32 threads
X99: DDR4, 16 GB per stick, 128 GB total vs. EVGA SR-2: 128 GB

Seriously, if you're a gamer, avoid these specs (it's a waste of money - well, the X99 memory setup is, not the Haswell CPU).

If you buy Ferraris and enjoy driving to the shops for milk, then enjoy the cost of DDR4.
 

mapesdhs

Distinguished


I've already seen normal HD work use up 64 GB. It's not enough for 4K, period, especially not if the number of cores is moving from 6 to 8.

Nuke is fine for those who can afford it. There are a lot of solo pros out there on a tight budget.


> Back to the topic - I am expecting the boards to be able to support 128 GBs when paired
> with a Xeon. I don't think Intel will allow the memory controller of Haswell-E to go more than
> 64 Gbs. Else, they would canibalize their Xeon sales. I hope I am wrong.

Then Intel would be really stupid to do such a thing. Besides, demands scale with industry and sector; I doubt it would harm Xeon sales, since those needing a lot more than 256 GB will not be buying consumer setups anyway, e.g. ANSYS, GIS, defense imaging and photogrammetry, etc.

Ian.

 

jenojaxx

Honorable
Jun 7, 2013
156
0
10,710

A waste of money? Yes, but not for the rich.
That extreme Haswell is like a Bugatti, as you said, and for some (not for me) it's worth it. The 4-channel memory too.
BUT those Xeon E5-xxxx v2 and v3 chips are like buses or trucks: they get the job done, but they are waaaayyyy overkill (and damn expensive, from 1.5k to 3k).
 

mapesdhs

Distinguished
Saying Xeons are overkill depends entirely on the user and the intended task. For those at whom Xeons are
aimed, just one of them is almost certainly never enough, except perhaps for ANSYS. For GIS and many other
tasks, dozens at least are preferable. They're expensive because they do cost a lot to make (lots of cache
RAM on those things, plus other logic for multi-socket links, etc.), and the target market can afford such pricing.
3K for a CPU is nothing when a complete system can easily be $1.5M. I have an old SGI in my garage which
cost about that much new in 1993 (24 CPUs, 2 GB RAM - enormous for 1993); it was used by various
car companies, oil/gas corps, etc. over the years, until I bought it used for a snip.

Ian.

 

DrBackwater

Honorable
Jun 10, 2013
362
0
10,810


It's crazy; games are nowhere near demanding the specs these systems offer. And even if you were building a render farm or a cloud solution, these one-stop-shop specs are poor entry-level components.

PC components remind me of a gym session: oh boys, check those abs out, gotta bench some more, can't let the competition catch up. WHAT competition? The market is saturated with stuff no one needs, to put it bluntly.

Ivy Bridge and Sandy Bridge 6-cores are perfect for anything, and the latency and stability issues of X99 defeat any reason to want it for higher frame rates in games. It's like maintaining a rally car: nothing but problems. If your DDR4 implodes and walks out the door, you must replace it - hope you like doing that. Replacing a broken channel, or trying to work out what's wrong with your system due to faulty memory sticks, is like babysitting a database: a pain in the arse, all for the same results.

X99, X79, AM3+. Damn, these hardware wars almost sound like a console war. They all fundamentally do the same thing: push 0s and 1s.



 

Shneiky

Distinguished
@DrBackwater,
Why do you base your argument on games? Sure, they are a big market, but they are not the one and only.

The E5-2650 v2 - 8 cores at 2.6 GHz - costs 1,200 Euros. The Xeons that run 8 cores at 2.9 and 3.0 GHz are 2,000. The new 5960X will be half the price of those.

That i7 is a perfect replacement: cheaper and overclockable. When you put your CPU to productive work, you appreciate it more and more. You might find anything new in the enthusiast CPU segment redundant - that is your position, given your opinion and your needs. Well, I find any graphics card above a GTX 660 overkill and completely unnecessary - but that is my opinion and my needs.

The X*9 chipsets were and will remain the poor man's Xeon. I want one, I need one, but I don't have the cash for one. There is a passenger for every train, and if enough passengers want the same route, someone will build a train for them. Anything and everything is only worth as much as someone is willing to pay for it.
 

mapesdhs

Distinguished


Consider yourself awarded several gazillion points for saying that. 8) It's something I've repeated in
numerous discussions, but so many people just don't get it. Numerous tech products are expensive because
of how much the customer is happy to pay - especially true in the movie industry, defense, etc. Years
ago I was unable to sell a particular model of old UNIX workstation because my prices were too low;
the commercial perception didn't match (the original value was very high, so a used price that was too
low prompted awkward questions within company purchasing departments). After I put the price up by about a
factor of four, they sold fine. Funny old world.

Ian.

 