Dear all,
For clarity I will first describe what I want:
I currently have 3-4 servers at home, each with its own storage.
I would like to have a central storage point (like a SAN) from which all the servers start and use the data that is available on it (ideally, I would like the servers to be diskless and boot entirely from the SAN, so block-level access is needed).
I was looking at something like "Openfiler" as the OS for the SAN. The easiest way would be to connect the servers and the SAN over the network at 10 Gbit/s (to use the full potential of the SATA III disks). However, that option is way out of my budget, so I had to look at something else...
What I came across:
So I looked further on the Internet for other options to connect the servers and the SAN. Some forum threads suggested that a so-called HBA (Host Bus Adapter) could be used. I looked on Amazon and found some SATA HBAs, so I thought that maybe with a SATA HBA I could connect the SAN to the servers (even if that is not technically a SAN).
There were also some threads suggesting that SATA cannot connect two PCs/servers together. I understand the reason (a host-to-client protocol, compared to USB's point-to-point), but isn't it the job of the HBA to act as a disk for the server?
What I will use it for:
1.) I am running some online services I made myself.
2.) Hobby 😉
My questions:
1a.) Is it possible to connect a server and a SAN with SATA (through an HBA, so not directly!)?
1b.) If yes, what kind of card should I get (what is good or bad, which specs should I look at)?
2.) Is SATA the best option, or are there other options, maybe USB?
P.S. I am just starting out with SANs and related topics, so any additional information is welcome.