I dunno, $2200 doesn't seem like too high a price for use in an enterprise environment, which is clearly the intended use for this product. Especially if there are no alternative vendors. You get to make a bit of extra money for being first to market, especially if the product works exactly as advertised, as would seem to be indicated by this review.
ExpressCard solutions are pretty worthless if you want to use powerful graphics, since they only offer a 1x PCI-Express link; you'd need some pretty crappy onboard graphics to bother with that option. The Thunderbolt-based solutions, with their 4x PCI-Express 2.0 link, are good for mid-range cards like a 7850, but they'd be less than useful and a waste of money when used with a 7950, let alone a Titan.
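For scale, here's a quick sketch of the approximate per-direction bandwidth of those links. The per-lane figures are standard published PCIe numbers (after encoding overhead), and the lane counts match the comment above; treat the exact values as ballpark, not vendor specs.

```python
# Approximate usable one-way bandwidth per PCIe lane, in MB/s
# (after 8b/10b or 128b/130b encoding overhead).
PCIE_LANE_MBPS = {"1.x": 250, "2.0": 500, "3.0": 985}

def link_bandwidth(gen, lanes):
    """Rough one-way bandwidth of a PCIe link in MB/s."""
    return PCIE_LANE_MBPS[gen] * lanes

expresscard = link_bandwidth("1.x", 1)   # typical ExpressCard slot
thunderbolt = link_bandwidth("2.0", 4)   # Thunderbolt-attached chassis
desktop_x16 = link_bandwidth("3.0", 16)  # full desktop x16 slot

print(expresscard, thunderbolt, desktop_x16)  # 250 2000 15760
```

That's roughly an 8x gap between ExpressCard and Thunderbolt, and another 8x between Thunderbolt and a full desktop slot, which is why the sweet spot lands on mid-range cards.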
I don't care how much you guys are griping here. If this niche market picks up and makes it easier to upgrade your gaming rig instead of having to start over with a new MoBo, PSU, and the latest RAM and CPU that work with said MoBo... if all you had to do was add this... then... hell yeah. THIS is awesome. And if it picks up in popularity, the prices will come down!
No, because the thing itself is cost prohibitive. $2200 up front is the cost of one dream machine or two balanced performance builds. That means at the very least this thing would have to last for two whole upgrade cycles before it became worth it (three, more likely). So you'd be banking on this technology and interface protocol still being relevant and efficient seven years (or more) down the road. That's not a gamble anyone wants to make. If I were to buy a whole system now, I could either buy one of these AND a card, or I could build one whole system now and another whole system two years down the road, and I still wouldn't have spent as much.
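A back-of-the-envelope version of that math. Only the $2200 enclosure price comes from the article; the per-cycle card price and full-build cost are assumptions for illustration, and the breakeven point shifts with them.

```python
ENCLOSURE = 2200   # external GPU chassis, paid once up front (from the article)
GPU = 400          # assumed price of one upgrade-cycle graphics card
FULL_BUILD = 1100  # assumed cost of one balanced full system

def enclosure_strategy(cycles):
    """Buy the chassis once, then a new card every upgrade cycle."""
    return ENCLOSURE + GPU * cycles

def rebuild_strategy(cycles):
    """Just build a whole new system every upgrade cycle."""
    return FULL_BUILD * cycles

for cycles in (1, 2, 3, 4):
    print(cycles, enclosure_strategy(cycles), rebuild_strategy(cycles))
```

Under these assumed prices the chassis doesn't pull ahead until the fourth two-year cycle, i.e. close to a decade of the interface staying relevant, which is the gamble described above.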
If the concept itself took off and the engineering costs could be spread over a much larger production volume, the cost would likely drop to less than $500: this is little more than a pair of PCIe x16 transceiver cards, a PLX or equivalent PCIe switch, a PSU and external enclosure. We are talking about ~$250 worth of parts and materials.
The bulk of the price tag is due to R&D and tooling amortized over a very small projected production run. Materially, there is no reason for it to be anywhere near that expensive.
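One way that "~$250 worth of parts" estimate could break down; every per-part price here is an assumption for illustration, not a quoted figure.

```python
# Hypothetical bill of materials for an external PCIe chassis,
# with assumed per-part prices that land in the ~$250 ballpark.
parts = {
    "PCIe x16 transceiver cards (pair)": 80,
    "PLX-class PCIe switch": 60,
    "power supply": 60,
    "enclosure and cabling": 50,
}

total = sum(parts.values())
print(total)  # 250
```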
What about 4-way SLI with Quadro K6000s? That would be an incredibly massive amount of data going through that one PCIe x16 slot... It should be enough to show whether a bottleneck exists or not. TEST IT, TOM'S HARDWARE TEAM! TEST IT!
It would be an interesting test with a PC designed for real GPU computing. I mean a PC with 2, 3, or 4 turboboxes holding 8, 12, or 16 GPUs: Titan-type cards, Teslas, or Xeon Phis.
A test for the pros: I'm a graphic designer and use iray for my renders. I have 3 GTX 580s and would buy 2 or 3 turboboxes, but without actual tests I can't invest that much money with no assurance of the result.
I don't know whether the editorial staff can perform these tests, but I would be grateful if anyone in possession of 2 or more turboboxes could post iray test results.