[SOLVED] *NEEDS ADVICE* : New 10Gbe NAS Setup for small Motion Design Studio


Oct 13, 2016
Hello Folks,

I'm from Malaysia, and just a heads up that I am a total newbie, so please be nice and patient with me :D I'm seeking advice from experts, as I recently bought and set up a 10GbE NAS for the studio. Four workstations work off it. After the setup, we noticed the transfer speed is only around 100+ Mbps; we expected a 10GbE setup to transfer files much faster. The guy who sold me these items told me it's either our PCs causing the slow speed, or that it's because we have a lot of small files in our folders. Bigger files should transfer faster, so there's nothing he can do about it and it's completely normal.

I doubt his expertise, as he wasn't responsible for the setup and was in a hurry, leaving us with a lot of unanswered questions.

I saw on the net that we need a 10G router for a 10G setup. But some say that doesn't matter, because a 10G router would only help if our ISP connection were that fast.

In the end he suggested I get two Seagate Nytro SSDs as cache drives, claiming they would make my file transfers much faster.

Pros and experts, please share your advice or suggestions before we spend on the additional drives or a new router. Below is the list of items in the NAS setup.

NAS : Synology DS1618+
RAM : Synology DDR4 16GB
PCIE : Synology E10G17-F2
Fibre Module : Trendnet TEG-10GBSR x2
SWITCH : Trendnet TEG-40128 12Ports
PCIE (PC) : Trendnet TEG-10GECTX x4
HDD : Seagate EXOS 7E8 6TB x2
CAT 7 : Ugreen Cat 7 Cables x 4
ROUTER : TP-LINK Archer C1200

Download : 100.22 Mbps
Upload : 98.58 Mbps
Ping : 4 ms

I appreciate all of you who spent your time reading this. Thanks in advance!
How are you measuring the speeds? Copy a large file and watch the network tab in Resource Monitor.

You could check the ports on the switch and PCs to make sure they are really running at 10G, but unless you set them slower they tend to always run at 10G.

You should get much more than 100 Mbps, but many factors related to the disks are going to keep you from getting anywhere near 10 Gbit. It is not just the NAS; the drives in your end workstations will also limit transfer speeds. The drive you list in the NAS, for example, can only reach a maximum of about 1.5 Gbit/s when running as a single drive. Running the drives in RAID 0 you might get close to double that, but there is no redundancy. The only way to get 10 Gbit speeds is to run RAID groups of SSDs on both the NAS and your end workstations.
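To put rough numbers on that point, here is a back-of-envelope sketch (the throughput figures and the 50 GB project size are illustrative assumptions, not measurements of this setup):

```python
# Back-of-envelope transfer-time estimates. All throughput figures are
# rough assumed values, not measurements of the OP's hardware.

def transfer_time_seconds(size_gb: float, throughput_gbit: float) -> float:
    """Time to move size_gb gigabytes at a sustained throughput in Gbit/s."""
    size_bits = size_gb * 8 * 1e9            # GB -> bits (decimal units)
    return size_bits / (throughput_gbit * 1e9)

project_gb = 50  # hypothetical project folder size

scenarios = {
    "100 Mbit/s (what the OP sees)":                 0.1,
    "1 Gbit/s (typical wired LAN)":                  1.0,
    "~1.5 Gbit/s (single 7200 rpm HDD, sequential)": 1.5,
    "10 Gbit/s (link speed; needs SSD arrays)":      10.0,
}

for label, gbit in scenarios.items():
    minutes = transfer_time_seconds(project_gb, gbit) / 60
    print(f"{label:48s} {minutes:6.1f} min")
```

The gap between the first two rows is why the OP's 100 Mbps result hurts so much: the same folder takes roughly ten times longer than it would on even an ordinary gigabit link.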

If you want to brute-force test the network, you can load an old program called iperf on 2 of your workstations and transfer data between them, but I suspect your network is not the issue. If you had copper cables it would be common for them to drop to 100 Mbps, but optical links pretty much run at the speed you set them to or fail to come up at all.
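For the curious, the core of what iperf measures can be sketched in a few lines of Python: open a TCP socket, blast bytes through it, and time the result. This is not a substitute for iperf itself, just an illustration of the idea; it runs against loopback here, and the port number and payload size are arbitrary assumptions.

```python
# Minimal iperf-style throughput sketch: send a fixed payload over TCP
# and time it. Runs over loopback; pointing the client at another
# machine's IP would test a real link instead.
import socket
import threading
import time

PORT = 50007           # arbitrary test port (assumption)
CHUNK = 1 << 20        # 1 MiB send buffer
TOTAL = 64 * CHUNK     # 64 MiB test payload

def server(ready: threading.Event) -> None:
    """Accept one connection and drain TOTAL bytes from it."""
    with socket.socket() as s:
        s.bind(("127.0.0.1", PORT))
        s.listen(1)
        ready.set()
        conn, _ = s.accept()
        with conn:
            received = 0
            while received < TOTAL:
                data = conn.recv(CHUNK)
                if not data:
                    break
                received += len(data)

ready = threading.Event()
t = threading.Thread(target=server, args=(ready,))
t.start()
ready.wait()

buf = b"\0" * CHUNK
start = time.perf_counter()
with socket.socket() as c:
    c.connect(("127.0.0.1", PORT))
    for _ in range(TOTAL // CHUNK):
        c.sendall(buf)
elapsed = time.perf_counter() - start
t.join()

gbit = TOTAL * 8 / elapsed / 1e9
print(f"{gbit:.2f} Gbit/s over loopback")
```

In practice just install iperf on two workstations and run it; the point of a memory-to-memory test like this is that it takes the disks out of the equation entirely, so it isolates whether the network itself is the bottleneck.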


Oct 2, 2018
Please post the network diagram for your setup. Also, could you explain in detail how you plan to use this setup and what your expectations for it are?


Oct 13, 2016

Here is the diagram for the setup : https://imgur.com/lNSOUJF

We do motion design work, and all 4 workstations work directly off the NAS: all of our assets, project files, and rendered image and movie files live on the NAS. Right now there is no issue on any of the PCs while working on projects; it's just that we expected transfer speeds up to 1000 Mbps instead of 100+ Mbps. I am also not sure whether these transfer speeds will actually affect our work, especially if we add a few more PCs working off the NAS in the future. Hope my explanation helps! Thanks!


Oct 13, 2016

We have 2 workstations running on Samsung 960 Pro M.2 SSDs. The other 2 workstations run on older SSDs. Also, the guy set the NAS to run in RAID 1. Could that be causing the speed limitation? Thanks!
Storage networking is a very specialized (and highly paid) field. This is not something I do much, but if you have the drives running in RAID 1, that is only to protect the data; it will not improve performance. Your end stations, since they run SSDs, will likely be much faster than the NAS, so any bottleneck in your system is going to be the hard drives in the NAS. I strongly suspect you will not see much over 1 Gbit in normal day-to-day operation with those drives. This is extremely complex to figure out, since there are so many variables like file sizes and disk layouts.
Physical HDDs currently top out around 1.5 Gbit/s. A hard drive is a mechanical device: the platter can only spin so fast, and the R/W head can only seek so fast. People have been able to improve that with HDD arrays, larger buffers, etc. There are tricks you can do, but don't expect miracles.
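A simplified model of why arrays help, and why the OP's RAID 1 doesn't help write speed (this ignores controller overhead, small-file seeks, and caching, so real numbers will be lower; the 1.5 Gbit/s per-drive figure is the rough estimate used earlier in the thread):

```python
# Simplified RAID throughput model -- illustrative only. Ignores
# controller overhead, seeks on small files, and caching.
SINGLE_HDD_GBIT = 1.5   # rough sequential rate of one 7200 rpm drive

def array_throughput(n_drives: int, level: str) -> tuple[float, float]:
    """Return (read, write) Gbit/s estimates for a two-level RAID model."""
    if level == "RAID0":   # striping: both reads and writes scale with drives
        return n_drives * SINGLE_HDD_GBIT, n_drives * SINGLE_HDD_GBIT
    if level == "RAID1":   # mirroring: reads can fan out, writes cannot
        return n_drives * SINGLE_HDD_GBIT, SINGLE_HDD_GBIT
    raise ValueError(f"unmodeled RAID level: {level}")

for level in ("RAID0", "RAID1"):
    r, w = array_throughput(2, level)
    print(f"2x HDD {level}: ~{r:.1f} Gbit/s read, ~{w:.1f} Gbit/s write")
```

Under this model, the OP's two-drive RAID 1 writes at single-drive speed, well under 2 Gbit/s either way, so even a perfectly working 10GbE network would sit mostly idle during transfers to the NAS.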

Linus Tech Tips over on YouTube has a 10G server they shove massive amounts of 8K video through; check them out for any tips.

Your current vendor isn't worth much, I'd say.


Mar 16, 2013
I see this as a major fail from both ends.
You and your "guy".

You have a supposed need for 10GBe performance between the workstations and the NAS box.
He comes in, waves his hands around, gives you 100 Mbps performance, blaming the issue on "other things".
And he walks away with some money?

There needed to be an actual contract.
"We need 10Gbe performance, with this type of data."
'OK, here's the equipment you need (and why), and here's how much that will cost, including installation.'

Then some back and forth discussion, eventually agreeing on a price and performance.
He comes, builds the network. Shows you the actual performance, and you pay him.
Everyone walks away mostly happy.


