PCI RAID Controller Vs SATA Port Multiplier

Todd1561
Oct 30, 2014
I've got a small Mini-ITX computer I built as a home server, running Windows Server 2008 R2. Currently it has a single 1 TB SATA drive that I was planning to put into a Windows software mirror with a second drive. Long story short, that doesn't seem like it's going to work; apparently Windows software RAID doesn't fully support Advanced Format (4K-sector) drives.

So I'm looking into hardware RAID options; I have more experience (and confidence) with hardware RAID anyway from working with enterprise controllers. The problem is this mobo only has a PCI slot (not PCIe), so I'm concerned a PCI controller will be too much of a bottleneck. But at the same time I'm only using desktop-class SATA drives and really only accessing this box over a 100 Mbit network connection, so maybe the PCI bus won't be much of a bottleneck after all?

Another option is a SATA hardware port multiplier (HPM) that supports RAID. They're dirt cheap, and it seems like I'd see less of a bottleneck going through even a single SATA port. However, documentation on these is very spotty and I'm not seeing any way to actually manage or monitor the array for failures; they just use simple DIP switches for configuration. No manufacturer even covers the disk-failure replacement procedure, so presumably you just power down and swap the drive. But I've heard things can go awry quickly when a drive dies on these controllers, so I'm a little wary.

Here are a few options:
http://www.addonics.com/products/adsa2.php
http://www.newegg.com/Product/Product.aspx?Item=N82E16816124009
http://www.newegg.com/Product/Product.aspx?Item=N82E16816124049

What would you guys recommend?

Thanks,
Todd
 
I think you're misunderstanding the problem. I'm not running out of SATA ports, so getting a board with 6 or 10 isn't really going to get me anywhere. My goal is to get RAID on this setup; Windows software RAID doesn't seem to want to work with my drives, so I'm looking into hardware alternatives. The way I see it, I have two options: a PCI-based RAID controller, or an HPM that supports RAID and connects to one of the mobo's SATA ports. I'm not interested in a different board; other than this, it's perfect for my needs.
 
Well, that's part of the question: realistically, how much of a bottleneck would a PCI-based controller be? A 32-bit PCI bus running at 33 MHz can theoretically move 133 MB/s. I think my desktop-class 7200 RPM SATA drives would top out around 100 MB/s, and the network connection is even worse: at 100 Mbps you'd be hard pressed to move 12 MB/s, and even at 1 Gbps (which I'm not using) you'd likely only reach around 100 MB/s. So I'm not convinced any of this really matters; it seems like other hardware is the limiting factor. But maybe my numbers/logic are off.
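
For what it's worth, here's a rough back-of-the-envelope sketch (Python) putting those ceilings side by side. The 95% network efficiency factor is just an assumption to account for protocol overhead; the other figures are the ones quoted above.

# Rough bottleneck comparison for the setup discussed above.
# All figures except the network efficiency assumption come from the post.

PCI_BUS_MBPS = 133          # 32-bit / 33 MHz PCI, theoretical peak (MB/s)
DISK_MBPS = 100             # rough ceiling for a desktop 7200 RPM SATA drive (MB/s)
FAST_ETHERNET_MBIT = 100    # current network link (Mbit/s)
GIGABIT_MBIT = 1000         # hypothetical upgrade (Mbit/s)

def network_mbps(link_mbit, efficiency=0.95):
    # Convert a link speed in Mbit/s to usable MB/s. The efficiency
    # factor is an assumption for protocol overhead; real numbers vary.
    return link_mbit / 8 * efficiency

bottlenecks = {
    "PCI bus (theoretical)": PCI_BUS_MBPS,
    "Single 7200 RPM SATA disk": DISK_MBPS,
    "100 Mbit network": network_mbps(FAST_ETHERNET_MBIT),
    "1 Gbit network": network_mbps(GIGABIT_MBIT),
}

for name, mbps in sorted(bottlenecks.items(), key=lambda kv: kv[1]):
    print(f"{name:26s} ~{mbps:6.1f} MB/s")

Over the 100 Mbit link the network (~12 MB/s) is the narrowest pipe by roughly a factor of ten, which is the point I'm trying to make.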

Yes, I could spend a few grand and get an all-around better machine, but that's not really the point. I'm more wondering what negative impact (if any, and how much) this one particular component would have on the setup as a whole.
 


I did. :)

However, note that the card in your first link advertises "Support large hard drives of 137 GB or greater" - that should tell you something about how current the product is, and probably about the state of its drivers and support as well.

As for port multipliers, I recall (though I can't find the reference now) that not all SATA controllers support them. So you may have to make sure the SATA controller on your motherboard supports port multiplication. And then RAID on top of a port multiplier - very interesting.

 
Figured I'd post back with an update. I found an old Adaptec 2610 RAID controller in my parts bin (PCI-X, but it works in a PCI slot), so I tested with that in a RAID 1 array. As expected, performance took a hit. I benchmarked the disk subsystem before I introduced the RAID controller and topped out around 120 MB/s write and 130 MB/s read. Using the controller, I dropped to 43 MB/s write and 90 MB/s read. Of course, some of this is simply the overhead of RAID 1, whereas before it was just a single disk. Even with the performance hit I'll likely stick with this setup, since it provides the redundancy I'm looking for and there are still worse bottlenecks in the system, namely the 100 Mbit network connection.
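
Just to put those benchmark numbers next to the network ceiling, here's a quick sketch; the drop percentages are simply derived from the figures above, and the ~12.5 MB/s network cap is the same back-of-the-envelope estimate as earlier in the thread.

# Before/after throughput from the post, compared against the 100 Mbit
# network ceiling to show which bottleneck actually dominates.

before = {"write": 120, "read": 130}   # MB/s, single disk on onboard SATA
after  = {"write": 43,  "read": 90}    # MB/s, RAID 1 on the PCI Adaptec card

NETWORK_CAP = 100 / 8                  # ~12.5 MB/s usable over 100 Mbit

for op in ("write", "read"):
    drop = (1 - after[op] / before[op]) * 100
    print(f"{op:5s}: {before[op]} -> {after[op]} MB/s "
          f"({drop:.0f}% slower, network cap ~{NETWORK_CAP:.1f} MB/s)")

So even the degraded write speed is still well above what the 100 Mbit link can actually deliver to clients.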

Thanks for the input guys.