Xotic PC's New Exodus PC Costs $6,700



Seriously - how does having a ridiculously expensive SSD like an Intel 750 make for a better PC than one without it? Same thing with a 1500W power supply - even a single Titan X doesn't need that.
 
Yes, this machine is stupidly overpriced / underpowered.

BUT, all the people saying striping a bunch of SSDs together is useless have never done it right. You need a RAID controller, not that silly software nonsense in the BIOS. For about 2.5 years I've been running 4x OCZ Vector 240GB in a stripe behind an LSI MegaRAID 9280-4i4e. With the battery installed, you get 512MB of cache that can return success on uncommitted writes, as long as the controller has the data in memory. Also, since it's hardware offload, Windows stops trying to optimize I/O and just lets the controller handle queuing and caching by itself.

So, for transfers of a couple hundred megs, which is OS stuff, it's pretty much instantaneous. For larger transfers, I get just under 2GB/s with 4K blocks and larger, both directions, all day long, sequential or not, any queue depth. About 300K IOPS, give or take, either direction. Keep in mind, this is an older controller running older SATA devices; a modern controller would be even faster. Add SAS SSDs, and now you are getting into serious high performance (and cash burn, granted).
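As a back-of-envelope check on those numbers, here is a rough scaling model. The per-drive figures below are assumptions typical of SATA-era SSDs, not measurements of those exact drives or of that controller:

```python
# Back-of-envelope model of RAID-0 scaling across four SATA SSDs behind a
# hardware controller. Per-drive figures are assumptions roughly typical of
# SATA 6Gb/s SSDs of that era, not measurements of the drives named above.

drives = 4
seq_mb_s_per_drive = 520      # assumed sequential throughput per drive (MB/s)
iops_per_drive = 85_000       # assumed random 4K IOPS per drive
controller_efficiency = 0.95  # assumed striping/controller overhead factor

seq_total = drives * seq_mb_s_per_drive * controller_efficiency
iops_total = drives * iops_per_drive * controller_efficiency

print(f"Estimated sequential throughput: {seq_total / 1000:.1f} GB/s")  # ~2.0 GB/s
print(f"Estimated random IOPS: {iops_total:,.0f}")                      # ~323,000
```

Near-linear scaling like that lines up with the ~2GB/s and ~300K IOPS quoted above.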

Yes, I know that I'm gonna kill a drive someday, which is why it backs up to more robust storage daily. I've restored before and I'll restore again.

The bulk array is all spinners and it is much less impressive, but I can still move sequential data in and out at roughly 500MB/s on a four-drive RAID-5 using Toshiba 6TB disks, which are definitely not the fastest.
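A similar rough estimate works for the spinner array: in RAID-5, roughly N-1 drives' worth of user data moves per stripe. The per-drive rate below is an assumption for a 7200 rpm 6TB disk, not a measured figure:

```python
# Rough RAID-5 sequential estimate: with N drives, about N-1 drives' worth of
# user data moves per stripe (one drive's worth of each stripe is parity).

drives = 4
seq_mb_s_per_drive = 175   # assumed sustained sequential rate per disk (MB/s)

print(f"Estimated RAID-5 sequential rate: ~{(drives - 1) * seq_mb_s_per_drive} MB/s")  # ~525
```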

So, if you want real performance, give money and a PCIe slot to LSI. Life is good here.

Also, I actually did build a $7K machine; this is part of it. This is older stuff, but my box could stomp the crap out of this one in anything other than GPU performance when it was built, and I've upgraded the GPUs since then. The important thing is: buy really expensive Xeons. They don't overclock because there is no headroom at all; you are already living on the edge, it's already as fast as it's gonna get. FWIW I've never seen mine drop below 3.4 even when all 16 threads are cranking; these are the best-binned, most flawless chips Intel makes. The LGA2011-3 i7 chips are Xeons with too many defects to be Xeons, remember that. The Haswell Xeons top out at 18 cores, the i7 line tops out at 8, and the 5960X, the only LGA2011-3 i7 without disabled cores, only runs at 3GHz. That trade-off was OK for me, but that part is Haswell and mine is Sandy Bridge.

Intel Xeon E5-2687W (8C/16T 3.1/3.8 20M)
Asus Rampage IV Extreme
4x 4GB Hynix DDR3-1600 ECC RDIMM
3x EVGA GeForce GTX 680 2GB
LSI MegaRAID 9280-4i4e plus LSIBBU-07
4x OCZ Vertex 240GB
4x Hitachi Deskstar 7200 4TB
Thermaltake Toughpower Platinum XT 1275W
A Corsair water block, forget the model but it's 120mm
2x Noctua 120mm fans for that, and one 80mm Noctua case fan
...and the chassis is a Sun Ultra 24 just for fun.

The display is an Apple 27" Cinema Display, which is counted as part of the $7K total, so the parts themselves were $6K or so. The CPU was $2000 by itself, which was the most expensive component or subsystem. Board was $500, GPUs were $500 a throw, RAID card with the battery was about $500 as well, PSU was a couple hundred, I think all the RAM was about $600, all the disks were probably $1000 combined, another $200 for cooling / fans / random things.
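For what it's worth, the rough figures quoted sum out close to the stated totals; a quick check using nothing but the recollected prices above:

```python
# Summing the rough prices quoted above (recollected figures, not current prices).

parts = {
    "Xeon E5-2687W": 2000,
    "Rampage IV Extreme": 500,
    "3x GTX 680": 3 * 500,
    "RAID card + BBU": 500,
    "PSU": 200,
    "RAM": 600,
    "SSDs + HDDs": 1000,
    "Cooling / fans / misc": 200,
}

print(f"Itemized parts total: ${sum(parts.values()):,}")   # about $6,500, in the
                                                            # same ballpark as "$6K or so"
```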

What can I say, I really like having all the drives able to hotswap, and the backplane is keyed for SAS. Granted, everything BARELY fits, and cooling is a challenge, but it runs within reasonable limits. Also, this was the very last machine Sun produced before they got borged... no such thing as an Oracle Ultra 24, but they did put their filthy badge on the Ultra 27. Eww. Mine has the Sun Microsystems logo in glorious purple on the side. What can I say, I had a soft spot for those guys. Also, they gave me the machine for free long ago, and it is genuinely a well built chassis. Modern improvements are the 6TB Toshibas (which are larger, of course, but noticeably slower) and 3x GTX 970 4GB cards to replace the 680s. BTW, buy EVGA. I had one 680 fail, they replaced all three so as not to break SLI for me. For free. Love that.

So, when you are spending $7000 on a PC, I expect to see signs of serious mental illness, not Intel's weaksauce consumer socket. I want to see *every possible thing* crammed into a chassis with no room left over for anything else. And, I mean, 18 cores! you can have 18 GODDAMN CORES now! If you are only doing single threaded stuff, sorry, you're wrong. Start encoding H.265 or something.
 
I dunno why that was brought in; the 1300 G2 that is in the Exodus was already way, way oversized, and it's only $159. For $400, neither the 1500 watts, the Titanium rating, nor full modularity brings anything to the table for the money spent in a pre-built system. Twin 980 Tis run just fine on a 1050.

Personally, I would have used a 1050 Snow Silent from Seasonic ($219), but if ya wanna charge $6,700, I can see why they went with a "bigger number" at a lower build quality rating when only 1050 watts was needed. Advertising 1300 watts implies ya are getting more, when in reality it's something you get no benefit from.
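A rough power budget makes the point; the figures below are nominal TDP-style assumptions, not measurements of this build:

```python
# Rough power budget for a dual 980 Ti system, to illustrate why 1300-1500 W
# buys nothing here. Nominal TDP-style assumptions; real gaming load is usually
# lower, while heavy overclocking pushes it up somewhat.

gpu_tdp = 250          # per GTX 980 Ti (W)
gpu_count = 2
cpu_tdp = 140          # hex-core X99-class CPU (W), assumed
platform_misc = 100    # board, RAM, drives, fans, pump (W), assumed
oc_margin = 1.2        # assumed 20% headroom for overclocks and transients

peak_w = (gpu_tdp * gpu_count + cpu_tdp + platform_misc) * oc_margin
print(f"Estimated peak draw: ~{peak_w:.0f} W")   # ~890 W; a quality 1000-1050 W
                                                 # unit already leaves real margin
```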
 


The XoticPC build uses components that seem intended to convey an image of performance rather than to deliver measured performance. Listing high-end options that the user won't actually be able to take advantage of, given the intended usage, creates the appearance that the investment will pay off in some way. Meanwhile, it's the steep premium charged for upgrades (an order of magnitude above the actual component cost difference) that helps pay the labor cost of installing a very customized water cooling system. As for the "doing SSDs right" part, no doubt that doing it that way gives you very impressive transfer rates... the question is whether you can benefit from them, and that depends on how the system is used.

I have an old Porsche Turbo from the late 70s that is in storage. It can hit 145 mph, so would it be worthwhile to drive it to work each day to reduce my commuting time? During rush hour, traffic keeps everyone averaging about 25 mph on the three-lane highway, so considering outside limitations, the answer is no.

I just copied a 1.69 GB file from one partition on a laptop drive to another, so the HD was doing double duty reading and writing and had to jump back and forth, which is the worst condition possible. It only took 31 seconds and took place in the background. Copying a 140MB file took less than 3 seconds, with a good part of that, I think, being my reaction time in stopping the stopwatch :).... and this is on a 2.5" HD!

On the lappie, loading AutoCAD the first time after a fresh boot takes a bit; it has to check in with Autodesk to make sure the license is legit and sync with various other apps and plug-ins. If I close it and open it an hour later, it will open in 7.5 seconds. It still needs to check the server and do a few bits of housekeeping to make sure files stored on the server are available. On the workstation / server this obviously proceeds much faster, as no wireless networking is involved. So, Question 1: outside of benchmarks, in a production environment, will I see an increase in productivity from my CAD operators if I use the hardware RAID system?

On my workstation, I set it up with multiple SSDs, SSHDs and an HD. The SSD boot takes 15.6 seconds, the SSHD 16.5 and the very old Seagate XT HD takes 21.2. The question is not whether one option provides a speed advantage over another... it's whether one option provides a speed advantage and a return on investment for things you actually do. The workstation is used as a CAD workstation by numerous users, as well as by my oldest son and myself for gaming and his flight sims (he's a pilot). We can boot or run any program / game off any drive. I have had several users boot from each of the drives, load AutoCAD files off each of the drives and play games off each of the drives. Without telling anyone when I switched locations, no one noticed the switches except in one instance: booting off the HD, one person (my son) thought it was taking longer to boot. When booting off the SSHD versus the SSD, no one noticed. Question 2: will I see an increase in productivity from my employees by shortening their boot time using hardware RAID?

So while 2 GB/second is impressive, how noticeable will it be when the largest file you load is 20 MB? Will anyone observe the difference between 0.04 seconds at 500 MB/s and 0.01 seconds at 2,000 MB/s? Yes, there are uses for those speeds, but how does the user's experience change between one and the other on a gaming or even CAD workstation PC? Question 3: will my employees realize an increase in productivity from loading these files faster via a hardware RAID solution?
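The arithmetic, for whatever it's worth:

```python
# How long a 20 MB file takes to read at the two storage speeds being compared.

file_mb = 20
for mb_per_s in (500, 2000):
    print(f"{file_mb} MB at {mb_per_s} MB/s: {file_mb / mb_per_s * 1000:.0f} ms")
# 40 ms vs 10 ms: a 4x difference on paper, but both are far below anything a
# person sitting at the keyboard will ever notice.
```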

Now the one time that our storage systems surely will benefit from faster speeds is when we backup all the server files to a mirrored drive ... not mirrored as in RAID 1, but mirrored using backup software. Question 4, can we gain some advantage here ?

The monkey wrench in this equation is how the user interacts with the machine, so let's see how that impacts those 4 questions.

1. If I review a set of, say, 4 drawings, mark them up in red ink and drop it on a CAD Operator's desk, the typical post-lunch timeline will look something like this:

13:00:00 - Operator clicks icon to open the program
13:00:01 - Operator begins reviewing markups on 1st sheet
13:00:08 - Program finishes loading
13:00:26 - Operator clicks on appropriate file on recent file list
13:00:27 - Operator peruses changes on sheets 2 thru 4
13:00:36 - File finishes opening including referenced / linked files stored at various locations
13:00:55 - Operator turns back to 1st paper sheet, goes into model space tab and begins edits
13:08:15 - Operator completes Sheet 1 edits
13:08:19 - Operator prints Sheet 1
13:08:23 - Operator begins editing Sheet 2
yada yada yada
13:31:21 - Operator completes Sheet 4 edits
13:31:25 - Operator prints Sheet 4
13:31:29 - Operator saves file
13:31:33 - While waiting for the last sheet to plot, Operator fills out time sheet, calls home and tells the wife he's bringing home milk and bread
13:34:00 - Operator grabs new prints, puts old markups on top and drops them on my desk.

So while there are some instances where the user would sit and wait if all he / she had to do was stare at the screen, various other tasks must be performed away from the screen, and they make up for that "loading time", often many times over. If I'm boiling potatoes, I could certainly fill the pot faster and bring it to a boil faster with a higher-flow faucet and a higher-BTU burner. But if the pot takes 30 seconds to fill and 4:30 to come to a boil, spending on the fancy burner and faucet won't have an impact when it takes me 8 minutes to peel and slice the taters. The water will be boiling 3 minutes before the taters are ready.
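Putting rough numbers on that timeline (an illustration of the point, not a benchmark):

```python
# Share of the workflow above that is spent waiting on program / file loads,
# using the timestamps in the timeline.

total_task_s = 34 * 60            # roughly 13:00:00 to 13:34:00
load_wait_s = 8 + 10              # program load plus drawing open, in seconds

print(f"Upper bound on time spent storage-bound: {load_wait_s / total_task_s:.1%}")
# Under 1 percent, and in the timeline the operator is reading markups during
# both loads anyway, so the effective wait is closer to zero.
```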

2. Boot time....

08:30:00 - Operators A and B arrive together (carpool) and start their PCs, then take off their jackets; one heads to the sink to clean the grunge outta his coffee cup, the other checks her phone messages.
08:30:15.6 - Operator A's machine with SSD is at Login screen
08:30:16.5 - Operator B's machine with SSHD is at Login screen
08:30:05 - Let's say if they had hardware RAID, they'd boot in 5 seconds
08:31:30 - Both get back to their KBs and log in... he heads to the coffee machine, she returns telephone calls
08:35:00 - Both operators get back to their KBs, long after login process completes.

3. File loading..

Basically covered in 1. When working on the workstation, no single file takes more than a second or perhaps two to load, and that time is used by whoever is modifying it to review what changes will be made. Because of the SSHD, all recently used files are likely to be found on the SSD portion. Dropping that to 1/100th of a second won't impact the user's timeline, as he / she is thinking, getting their thoughts in order on how the file will be modified.

4. Backups....

Backing up the 4 server partitions, which hold all the firm's business records back to 1985, takes a while. Since only files modified each day need to be mirrored, each partition takes about 7-10 minutes. These are scheduled after hours, with each partition scheduled an hour apart starting at 7, 8, 9 and 10 pm, when no one (except me or my son, perhaps) is in the office. It all occurs in the background regardless of what I am doing, as at that hour I am usually working on my lappie or gaming; the game files are on another drive.

So the discussion really isn't an argument, in the sense that both positions are correct. Where file usage depends more on user interaction, the user will always be the logjam... the storage system will always be ahead of the user. On a gaming box, which is the subject here, in the few seconds it takes the game to load, I am looking at my quest notes, material needs, recipes and trying to wrap my tired-ass brain cells around where I was at last time I played. The game is always ready before I am :)

If you are consistently making large data transfers (not a common activity on a gaming box), editing / encoding large video files in a production environment, or providing a streaming service, then an extremely fast storage subsystem will be a top priority. But then we'd be talking about something other than a gaming box.
 


No, that RAM requirement was lifted in Windows 8.1. Windows 8.1 and later have a RAM limit of 128GB. That's mainly because Microsoft no longer sells an "Ultimate" edition. You still need Pro for running VMs, language packs and that sort of thing.
 


There are users here (not necessarily in this thread) that will swear up and down you are 100% incorrect.
"RAID 0 + SSD is always faster and worth any possible hassle or fail potential." (or words to that effect)
 
Again, I can only go by my own experience and all the test sites / reviews I have read. I'd welcome a reliable source that said otherwise based upon something other than anecdotal evidence. Outside of special applications and settings, I have just never read of or experienced a benefit. Here's what HardForum had to say on the subject:

"However, many have tried to justify/overlook those shortcomings by simply saying "It's faster." Anyone who does this is wrong, wasting their money, and buying into hype. Nothing more." There's more quotes below.

I have twin Samsung 256 GB pros.... ran them in RAID for 3 months:

1. After getting nothing out of it but benchmark bragging rights, I called Samsung and rec'd the following:

a) We do not recommend RAID
b) Our utilities are non-functional in RAID
c) We do not provide technical support for RAID installations.

2. UEFI Boot time improved with single SSD.

3. Program performance / file loading showed no observable difference overall, but some hitches / difficulties disappeared.

4. I don't play a lot of games (3-4 a year, maybe) but my son is generally on there after I hit the sack, running flight sims or gaming. We saw no benefit and dealt with a few glitches while the array was in place.

You were probably around when I posted this collection of test / review quotes, more years ago than I care to admit, with regard to RAID 0 on HDs:


http://en.wikipedia.org/wiki/RAID_0#RAID_0

RAID 0 is useful for setups such as large read-only NFS servers where mounting many disks is time-consuming or impossible and redundancy is irrelevant.

RAID 0 is also used in some gaming systems where performance is desired and data integrity is not very important. However, real-world tests with games have shown that RAID-0 performance gains are minimal, although some desktop applications will benefit.[1][2]


http://www.anandtech.com/printarticle.aspx?i=2101
"We were hoping to see some sort of performance increase in the game loading tests, but the RAID array didn't give us that. While the scores put the RAID-0 array slightly slower than the single drive Raptor II, you should also remember that these scores are timed by hand and thus, we're dealing within normal variations in the "benchmark".

Our Unreal Tournament 2004 test uses the full version of the game and leaves all settings on defaults. After launching the game, we select Instant Action from the menu, choose Assault mode and select the Robot Factory level. The stop watch timer is started right after the Play button is clicked, and stopped when the loading screen disappears. The test is repeated three times with the final score reported being an average of the three. In order to avoid the effects of caching, we reboot between runs. All times are reported in seconds; lower scores, obviously, being better. In Unreal Tournament, we're left with exactly no performance improvement, thanks to RAID-0

If you haven't gotten the hint by now, we'll spell it out for you: there is no place, and no need for a RAID-0 array on a desktop computer. The real world performance increases are negligible at best and the reduction in reliability, thanks to a halving of the mean time between failure, makes RAID-0 far from worth it on the desktop.

Bottom line: RAID-0 arrays will win you just about any benchmark, but they'll deliver virtually nothing more than that for real world desktop performance. That's just the cold hard truth."


http://www.techwarelabs.com/articles/hardware/raid-and-gaming/index_6.shtml
".....we did not see an increase in FPS through its use. Load times for levels and games was significantly reduced utilizing the Raid controller and array. As we stated we do not expect that the majority of gamers are willing to purchase greater than 4 drives and a controller for this kind of setup. While onboard Raid is an option available to many users you should be aware that using onboard Raid will mean the consumption of CPU time for this task and thus a reduction in performance that may actually lead to worse FPS. An add-on controller will always be the best option until they integrate discreet Raid controllers with their own memory into consumer level motherboards."

http://www.hardforum.com/showthread.php?t=1001325
"However, many have tried to justify/overlook those shortcomings by simply saying "It's faster." Anyone who does this is wrong, wasting their money, and buying into hype. Nothing more."

http://jeff-sue.suite101.com/how-raid-storage-improves-performance-a101975
"The real-world performance benefits possible in a single-user PC situation is not a given for most people, because the benefits rely on multiple independent, simultaneous requests. One person running most desktop applications may not see a big payback in performance because they are not written to do asynchronous I/O to disks. Understanding this can help avoid disappointment."

http://www.scs-myung.com/v2/index. [...] om_content
"What about performance? This, we suspect, is the primary reason why so many users doggedly pursue the RAID 0 "holy grail." This inevitably leads to dissapointment by those that notice little or no performance gain.....As stated above, first person shooters rarely benefit from RAID 0.__ Frame rates will almost certainly not improve, as they are determined by your video card and processor above all else. In fact, theoretically your FPS frame rate may decrease, since many low-cost RAID controllers (anything made by Highpoint at the tiem of this writing, and most cards from Promise) implement RAID in software, so the process of splitting and combining data across your drives is done by your CPU, which could better be utilized by your game. That said, the CPU overhead of RAID0 is minimal on high-performance processors."

Even the HD manufacturers limit RAID's advantages to very specific applications, and none of them involves gaming:

http://westerndigital.com/en/products/raid/


And here's a new one:

http://www.tomshardware.com/reviews/ssd-raid-benchmark,3485-13.html

"One SSD on its own scores again in the contrived tests we put together. The performance differences when we boot up and shut down Windows 8, then fire up different applications, are marginal at best and not noticeable in practice. Single drives actually manage to outperform the striped arrays some of the time, even.

If you're planning an upgrade and want to know whether to buy a couple of 128 GB drives and put them in RAID 0 or just grab a single 256 GB SSD, for example, the answer still seems clear enough to us: just grab the large drive and use one. Using Samsung's 840 Pros as an example, a pair of 128 GB drives will run you $300 on Newegg right now. The 256 GB model sells for $240 (maybe that's why it's out of stock currently). There's also the issue of reliability. If one drive in a RAID 0 configuration fails, the entire array is lost. At least for a primary system drive, one SSD on its own is safer."


I do see a lot of video production sites recommending RAID 0 arrays for production (read "time is money") usage... in other workstation apps I see potential applications, but you'd need some rather extremely expensive rendering setups to make it pay off there, for example. A server streaming movies to multiple users is another obvious application. But as for an increase in gaming performance... I'm gonna paraphrase Cuba Gooding (Jerry Maguire) and say "Show me the numbers" from a reputable review site (and by that I am excluding YouTube videos using specifically tailored situations where conditions are manufactured to create an instance where supporting data can be "created") under normal usage, as might be expected from any person sitting down and loading / playing a game.

https://www.youtube.com/watch?v=OaiSHcHM0PA


 
I would love to get in the head of any moron that thinks this is even remotely a deal. Fact is, I could build 3 computers that would all outperform this overpriced thing! What a joke!
 


I'm done responding to those who pay no heed to business practices and costs.
 
I know these setups are intended for those with too much money and not enough knowledge, but here they have made some very easily corrected design flaws.
Maybe they are just looking for a way to unload some extra hardware they had lying around and still tack on their "builder/supplier" charges.
 
My PC is basically identical except for using a 4790K (plenty for gaming) and EVGA GTX 980 Ti FTWs. The cost, including an Acer G-Sync monitor and a limited edition motherboard, was thousands less than this machine. Bottom line, their price is way over the top, at least in my opinion. My 4790K also runs at 4.8GHz, and cool at that. For this price you would expect to at least see a custom water cooled loop with nice rigid acrylic tubing, or, as has been mentioned, quad SLI, because let's face it, two-way SLI is really NOT going to give hard-core gamers the experience they have come to expect when going to 4K. That one is just my opinion, based on research when deciding between a 1440 and a 4K monitor. I personally chose 2560 x 1440, as low frame rates, given the price of the cards, would have pissed me off big time.
 

The problem with the In Win case is that it does not have good reviews. I have been using an NZXT Phantom 820 for building a custom computer; I had all the hardware installed, then the multicolor lighting of the case went out. The board they use is a small, flimsy thing, and I am wrestling with NZXT to provide another one. This is one computer that is sitting and I cannot sell. CaseLabs can provide better quality for around $450 with a Merlin case.
Also, Fractal Design has a new case, the Define S, with a big window and lots of internal space, including room for 3 HDDs and 2 SSDs in the back of the case.
Quality saves time and time is money.
 
How about, to start: a 1 or 1.5TB SSD, more 7200rpm storage, 8 cores, and not going over the top with the PSU or the RAM?
Pretty much the same things that have been stated over and over again here, which for some reason you keep arguing would entail another few thousand in cost.
No one, even in the US, gets paid $20+ per hour to do basic manufacturing/assembly, and no one should be charging 100% profit on anything.
 
Most stuff in this world is sold at 500% profit or higher. On any game utilizing 4 cores or fewer, the 6700K will theoretically perform better than the Haswell-architecture X99 CPUs because of its newer architecture, and it can overclock better. This is a gaming machine. Storage is your personal opinion; if someone wants to, they can take those solid state drives out of RAID. The PSU is what it is, and so is the RAM. If they downgraded either, it would sell for less money and they'd make less money. And to someone who's not tech-savvy, 32GB of RAM and a 1300W PSU sound quite beefy.

It doesn't matter what the workers are paid, the point is they have to make money, and the person who'd buy this at a $4500 price would probably still buy it at the $6700 price.
 
I'm guessing you have been ripped off by the pre-built scam and are just here trying to rationalize your own decision over and over and over and over and over...

 


I do not own a prebuilt; I only build my own. I sense frustration in you. Please do not get angry with me or take my response as a personal judgement.
 


Okay.

@Gobonzo: It is best to admit mistakes when they occur, and to seek to restore honor. Forgive me if I appeared less than humble in my tone, as I myself harbored anger.
 


You are so wrong it is not funny. I am an electronics engineer, with a BSEE and a Master's in Business Administration. You have to start from the other end:
1. Rental and maintenance of the building you are using to build your computers.
2. Hourly rate for the technicians building the computers, they have to know what they are doing so it would be $30/hr plus the Health Plan and Social Security.
3. Cost of furniture, desks, chairs, tables, burn-in shelves, parts storage.
4. Cost of software to maintain inventory, check sales, check warranties.
5. People in the help desk, answering questions from clients or prospective clients.
6. Many ancillary people, accounting, sales, engineering, etc.
7. Capital cost of goods, the stuff you use to build the computers.
8. Electricity and other facilities.
9. Cost of components that have to be eaten because they failed during burn-in.
10. Interest cost of loans you take in order to buy the components for the computers.
This is just a short list of the expenses. Your way would have any company going bankrupt in three months.
People who don't know about manufacturing should not make statements about the cost of a computer. If it is too expensive for you, buy a cheap computer and see what it gets you.
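To make that concrete, here is an illustration with entirely hypothetical numbers (the parts cost, hours, labor rate, overhead and margin below are all assumptions, not anyone's actual figures):

```python
# Entirely hypothetical numbers, just to illustrate why "parts cost plus a token
# assembly fee" doesn't cover the kinds of expenses listed above.

parts_cost = 4500          # component cost to the builder (assumed)
build_hours = 8            # assembly, burn-in, imaging, QA per unit (assumed)
loaded_labor_rate = 55     # $30/hr wage plus benefits, taxes and idle time (assumed)
overhead_per_unit = 450    # rent, utilities, support staff, software, warranty
                           # reserve and financing, spread per unit (assumed)
target_margin = 0.15       # assumed net margin

cost_to_build = parts_cost + build_hours * loaded_labor_rate + overhead_per_unit
price = cost_to_build / (1 - target_margin)

print(f"Loaded cost per unit: ${cost_to_build:,.0f}")    # ~$5,390
print(f"Price at target margin: ${price:,.0f}")          # ~$6,341
```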
 


 
I have used computers with RAID boards for many years. Their hardware solution is way faster than software RAID. On the subject of RAID 5, many computer companies do not recommend it because you can lose all your data if the wrong disk goes bad. I use RAID 10 most of the time; if an HDD goes bad you just replace it with a new one, and the data is rebuilt from the other half of the mirror.
Doing RAID 0 with SSDs or M.2 drives may not bring the same improvements it showed when using only HDDs.
 


Three computers with custom, hand-bent acrylic tubing? That is what is being sold here. I could put a $400 PC with an H-series CLC on a shelf in Walmart, and some kid will bring it home beaming that he scored a "high performance water cooled PC" for only $800!


For this price you would expect to at least see a custom water cooled loop with nice rigid acrylic tubing

For this price you **are getting** "custom water cooled loop with nice rigid acrylic tubing"

The XoticPC component list appears as if it was created to "impress" rather than perform. How often do we see builds here where the poster asks for opinions and the selections look like they were made by filtering each component, sorting on "most expensive" and picking the one at the top?

The marketing thought appears to be: wow them with the custom acrylic stuff... that will impress their friends for sure... and then, for the component list, select the most techno-sounding offerings full of buzzwords (1300 watts... wow! ... RAID... wow!... X99 w/ 2x SLI... wow... etc.) that give the impression of high performance to the uninformed but which bring nothing to the table for a gaming box. No, it isn't a matter of it being a bad PSU, and it's not even an expensive PSU; many good 850s cost that much. But that build will benefit in no way at all from having 1300 watts; it's only there because the "wow versus cost" ratio is high.

You can easily spend $6,500 - $7,500 on a custom water cooled build w/ rigid tubing, brass fittings, custom cables, etc (including tooling) using a 6700k, twin 980 Tis, pair of SSDs, pair of HDs, speed controls, temp monitoring etc

Water Cooling, Cable components, WC Fittings, Tubing, Sleeving and Tools - $2,400
PC Components, KB, Mouse, Monitor, 16 fans - $4,100
Wires for custom cables - $110

They can knock the price down by eliminating the fittings ($1,000), but by doing the hand bending, labor costs go up. The thing is, you can't get anyone to build you a custom hand-bent acrylic build for the price of the alternate builds being proposed. These are "works of art". A 50-year-old executive doesn't buy a $100,000 Porsche because he needs something with more performance to commute between clients' offices; he buys it for the statement it makes when he pulls up. I could drop a 427 into a Vega and get similar straight-line acceleration, but that doesn't make them "the same".

It took me over 100 hours and I used fittings for my loop; bending takes much longer. Geek Squad charges $150 to set up a wireless router that takes what, an hour on a bad day ? Last I heard (2014) MS was $499 per incident.

So yes, you could build a better performing computer for much less by dropping the "wow" stuff and replacing it with items that are cheaper and faster, but by the time ya add in the WC components, the tooling and the value of your time / labor, $6,500 isn't bad, except that ya are overpaying for components that don't add anything. Undoubtedly they could chop $500 or more off by dropping the silly stuff, but without the wow factor they'd also have to cut their margins by about $500.

Digital Storm charges $1773 for hardline tubing option.

Digital Storm Apollo - $4,994
Intel Core i5 6600K 3.5GHz (Codename Skylake) (Unlocked CPU) (Quad Core)
GIGABYTE Z170XP-SLI (Intel Z170 Chipset) (Up to 4x PCI-E Devices)
16GB DDR4 2666MHz Digital Storm Certified Performance Series
1x SSD (250GB Samsung 850 EVO)
1x Storage (1TB Seagate / Western Digital)
2x SLI Dual (NVIDIA GeForce GTX 980 Ti 6GB (ASUS Strix Edition) (VR Ready))
H20: Hardline: Digital Storm Acrylic Tubing Custom Cooling System (Video Cards + CPU)
1000W Corsair HX1000i
DVD-R/RW/CD-R/RW (DVD Writer 24x / CD-Writer 48x)
Hardline tubing for video cards and CPU

PCpartpicker puts those at about $2533 w/o case fans (and WC) totaling $4,306 w/ the water cooling at DS's cost. That leaves $690 for labor, overhead and profit
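Spelled out, using the figures just quoted:

```python
# Digital Storm margin arithmetic: PCPartPicker parts plus the hardline option,
# versus the system price (figures as quoted above).

system_price = 4994
parts_pcpp = 2533        # components, without case fans or water cooling
hardline_option = 1773   # Digital Storm's hardline tubing charge

remainder = system_price - (parts_pcpp + hardline_option)
print(f"Left for labor, overhead, warranty and profit: ${remainder}")   # $688
```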

Now subtract for 4 year warranty
Lifetime TS
Shipping
Order Processing
Insurance
Overhead and Profit
Lotta labor assembling the system

So, yes, we could always build it cheaper ourselves, but while most of the audience here will tackle an air-cooled or even CLC build, it's not like DS is making a killing for the service provided. XoticPC, on the other hand... is adding a lot of questionable stuff to justify a much larger markup.




Undoubtedly true ... but relevant to the question at hand ?

If I drive the Porsche to work instead of the SUV, will I get to work any faster? The Porsche, like the hardware RAID, is certainly much faster off the line, at the top end and in cornering speeds... but when rush hour traffic limps along at 25 mph, how do I get to work any earlier?

If I put hardware RAID on my office manager's box, will she type more report pages ?
If I put hardware RAID on my CAD Operator's box, will he get more drawings out ?
If Xotic puts hardware RAID on their gaming box, will the user see any observable impact on his gaming experience ?

If I spend $600 for hardware RAID, for each machine in my office, how long will it take for me to realize a return on my investment in increased productivity ?

Let's say I have SSHDs in 10 boxes. Adding a 250 GB SSD would cost me $90 per box. Each day the systems will boot in 15.6 seconds instead of 16.5.

That's 0.9 seconds per day x 220 working days per year = about 0.055 hours. If I bill them out at an average of $90 an hour, that's roughly $5 a year... it will take me nearly 20 years to break even. However, the reality is that whether the system boots in 15.6 or 16.5 seconds won't change anything, as everyone is returning calls, getting coffee or talking about last night's game while their machines are booting. Will SSDs in RAID change this? Will HW RAID change this?
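The break-even arithmetic, spelled out:

```python
# Break-even on a $90-per-box SSD upgrade that saves 0.9 s per boot, once a day.

seconds_saved_per_day = 16.5 - 15.6
working_days = 220
billing_rate = 90          # $/hour
ssd_cost_per_box = 90      # $

hours_saved = seconds_saved_per_day * working_days / 3600
value_per_year = hours_saved * billing_rate

print(f"Hours saved per year: {hours_saved:.3f}")                          # ~0.055
print(f"Billable value per year: ${value_per_year:.2f}")                   # ~$4.95
print(f"Years to break even: {ssd_cost_per_box / value_per_year:.0f}")     # ~18
```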

Tires that are rated and balanced for 200 mph handle better than those with lower ratings but if ya never break 65 mph, it isn't really doing anything for you.

As above, I set up a machine to boot off a 5-year-old HD, an SSD and an SSHD... 4 outta 5 people in the office never noticed that the HD was slower... 5/5 never noticed a difference between the SSD and the SSHD.

In benchmarks, hardware RAID will show extraordinary differences, however.... what most people do on their machines is incapable of taking advantage of this speed.

Invariably, the reasons behind RAID in any discussion always seem to wind up with "what if you have to copy 500 GB of files? The HW RAID will do this many, many times faster." No doubt, but what impact does that have on me? I might do this once every 3 years.

Two guys are copying a 500 GB notebook partition to network storage after coming back from a business trip.... one has RAID 0 SSDs, one has a SSHD on their lappies..

a) I walk in, plug in my SSHD lappie, launch free backup program, press backup and go to bed... mine finishes 40 minutes later, he brags his SSD will finish in 20.... my life is not affected, we're both sleeping.

b) I walk in, plug in my SSHD lappie, launch free backup program, press backup and go to bed... mine finishes 40 minutes later, his finishes in 40.... as it happens, both storage devices are far faster than the wireless network speed.

c) We are both too tired to even open the laptop bag when we get back. First thing in the morning we both plug in our ethernet cables to the network.... same result..... but no matter how long it takes, both puters remain perfectly capable of operating and performing their intended tasks and filling all of our needs while that all happens in the background

Again, yes, there are applications that benefit from high transfer rates, but how often do we transfer these large files from our RAID array to the same array? Moving the file just changes the mapping and is instantaneous; copying will require that the data actually be read and rewritten, but for what purpose? Moving it from the RAID box to another location is limited by the rate at which the other box can receive it (HW RAID there too?) but also by the interconnection speed.

So I don't think anyone is saying that RAID isn't faster or that HW RAID isn't faster than SW RAID; the question is... does the speed advantage have any impact on the user experience? In all but a very small number of instances, the answer is no. Gaming and enthusiast sites go back and look at this issue every few years or so. The results have consistently been the same.

I had SCSI RAID systems back in the early 90s... every 5-7 years I try it again. Yes, when I run the benchmarks I am undoubtedly impressed, but I have yet to find an instance where it has improved productivity or enhanced any other usage. Now, if I were streaming music or movies, or editing / encoding movies in a production environment, it would be an obvious consideration.
 