Hi,
First time posting here, but I've been an avid reader for many years. Thanks for all the solutions I've found here.
Anyway, I am building a professional system that needs the following:
■ An x16 GPU such as a Quadro M4000 or the new Nvidia GTX 1080
■ A pro SDI playback PCIe card (requires 2.0 x8, but 3.0 is good for future upgrades and all motherboards are 3.0 anyway)
■ A 5+ GB/s SSD RAID 0 using at least two Samsung 950 Pros (each requires PCIe 3.0 x4) for raw video reads
So in total I need at least four PCIe slots and 32 PCIe lanes from the CPU, or from the CPU plus chipset.
I currently have a Z170 Extreme7+ with an i7-6700K, which has 16 lanes on the CPU and 20 lanes behind the Z170 chipset.
My limitation is that to reach 5 GB/s from the SSD RAID 0, I have to put one drive in an on-board M.2 slot and the other on a PCIe carrier card in a CPU-connected slot, which takes away four CPU lanes (with both drives in the on-board M.2 slots, I hit the Z170 chipset-link limit of about 3.2 GB/s). The CPU lanes then split into, at best, an x8/x4/x4 configuration, which is now too slow for my SDI card.
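To make the bottleneck concrete, here is a back-of-the-envelope sketch of the numbers involved. It assumes roughly 985 MB/s of usable one-way throughput per PCIe 3.0 lane (8 GT/s with 128b/130b encoding) and treats the CPU-to-chipset link (DMI 3.0) as a PCIe 3.0 x4 equivalent; the exact real-world ceiling varies, but the conclusion is the same:

```python
# Back-of-the-envelope PCIe 3.0 bandwidth check.
# Assumption: ~0.985 GB/s usable per lane (8 GT/s, 128b/130b encoding).
PCIE3_PER_LANE_GBPS = 0.985

def link_bandwidth(lanes: int) -> float:
    """Approximate one-way PCIe 3.0 bandwidth of a link, in GB/s."""
    return lanes * PCIE3_PER_LANE_GBPS

# Two 950 Pros at ~2.5 GB/s each need ~5 GB/s in aggregate.
raid0_need = 2 * 2.5

# DMI 3.0 between the CPU and the Z170 chipset is a PCIe 3.0 x4
# equivalent link, so all chipset-attached devices share this ceiling.
dmi_ceiling = link_bandwidth(4)

print(f"RAID 0 needs ~{raid0_need:.1f} GB/s; DMI ceiling ~{dmi_ceiling:.2f} GB/s")
print("Bottlenecked behind the chipset:", raid0_need > dmi_ceiling)
```

So both drives behind the chipset can never reach 5 GB/s together, which is why one of them has to move onto CPU lanes.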
I have found several 40-lane CPUs and matching motherboards, for example the Intel Xeon E5-1650 v4 or Core i7-6850K on the processor side, and the ASRock X99 OC Formula/3.1 on the motherboard side.
Question 1: Any recommendations for the best motherboard for this application?
Question 2: Are there limitations on CPU PCIe bandwidth like there are on the Z170 chipset, or could I run into bandwidth limits in, say, a maximum configuration of one x16 GPU plus four Samsung 950 Pros (2.5 GB/s each), i.e. GPU bandwidth plus 10 GB/s of SSD traffic?