[SOLVED] Should I partition my SSD?

Jun 2, 2020
I have built 2 PCs for my friend's office. It's an auditing firm, so a lot of data (each file very small in size) is saved every day: PDFs, scanned images, scanned documents, Excel spreadsheets, screenshots, Tally data (accounting software), etc. I built the 2 PCs (the rest of the machines are laptops).
One has a
Silicon Power P34A80 512GB NVMe PCIe 3.0 x4 SSD
and the other PC has a
Silicon Power P34A80 256GB NVMe PCIe 3.0 x4 SSD

My doubt is whether I should partition my SSD. I have read that it's better not to partition an SSD the way you would an HDD. But won't it be difficult to manage the data? If there is any disadvantage to partitioning, I would rather not do it and organize the data with folders instead. Any suggestions are welcome.
 
Solution
Recommend? No.

But if this turns out to be the actual and only hardware involved...yes.
Maybe a 150GB partition for "shared data". Or whatever size you need. Separate the user data from the shared data.

USAFRet

Titan
Moderator
OK. What about this: instead of an external HDD, can I install an internal HDD in the 2nd PC and back up to it over the network? Is there any way I could sync data periodically from the 1st PC's SSD to the 2nd PC's HDD?
Not recommended. A 2nd internal in that 2nd PC is subject to the same things as the actual PC.
The whole idea behind an external is that you can take it offline easily, or even take it out of the office.
 
Jun 2, 2020
Not recommended. A 2nd internal in that 2nd PC is subject to the same things as the actual PC.
The whole idea behind an external is that you can take it offline easily, or even take it out of the office.
Now I have my data on the SSD (which is shared over the network). I also bought an external HDD for backup. How should I back up the data? Should I bring the external HDD in daily and back up manually at the end of the day before closing the office, or is there a way to do it from outside the office, like syncing over the Internet? I am confused. If this is a subject to study, please provide links so I can go through it myself and not waste your time.
 

USAFRet

Titan
Moderator
Use this as a starting point.

Macrium Reflect is my tool of choice.
 
I had composed this reply to your previous thread, but couldn't post it since that thread was deleted. :(

First off, I have a very similar setup. File sharing takes very, very little computing power, so your server is not just overkill but overkill on the level of using a tank to kill a fly.

I have an older thin client repurposed as a file server, connected to 3x external drives. One drive is shared, and every 15 minutes the new files on that drive are copied to the other two drives. This gives us 3x copies very nearline.

Every month or so, another external drive is connected to update an off-site backup. I think we have multiples of these too, so we have multiple versions of backups.

We're using enterprise-grade hard drives; since the file sizes are small, even over USB over a network they're fast enough.

Our solution uses very little power and uses standard components so a server or drive failure doesn't slow us down.

Recently, we've added some NAS units that replicate the file server each night. Some of these NAS units are off-site as well.

Some food for thought before I answer your questions.

  • Depends on what your goals were--cost? reliability? business continuity? ease of administration?
  • The rule of thumb on backups is 3-2-1--3 copies, 2 different mediums, 1 off-site. If things are critical, do more of each--like how we have 6+ copies on at least 4 mediums and 3 different off-site locations.

    And the biggest mistake people make in their backup strategy is the restore--how quickly can you get the data back when you need it. This is where cloud solutions break down pretty quickly as they are only as fast as your Internet access which is typically a fraction of how fast you can transfer files on a local network. Don't forget to plan your restore strategy as thoroughly as your backup.
  • See the previous answer. Ideally, you could use both.
  • If you need something dirt cheap, just check out rsync.net. Their minimum is 400GB, but that's only $10/mo with unlimited support and transfers and even 7 daily snapshots. Another idea would be to simply sign up with a hosting company as they have backups and whatnot with all their plans. Each host is different on how they charge, so just read the fine print.
  • Yes, most definitely. Rsync.net just uses standard rsync and has ways to map drive letters. Other services may need to install something on the file server. Remember that the harder it is to get the data there, the harder it will be to get it back. Having to install a program to access the data is a dealbreaker for me.
  • I would do this for sure. In fact, since you have 4x other computers, I would install a spare drive in each one of those (or just use empty space in the case of the laptops), and set up a copy update from the file server every day/week/month depending on what type of backup you want. I actually forgot that I've done that with a few of my systems on my network so I've got even more backups. :)
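
That last step--a periodic copy from the file server to a spare drive in each PC--can be the same kind of xcopy one-liner, pointed at the network share instead of a local drive. A minimal sketch (the share name `\\FILESERVER\Shared` and destination `D:\ServerBackup` are made-up examples, not real paths from this setup):

```bat
REM Pull newer files from the file server's share to a local backup
REM folder. /D copies only files newer than the destination copy,
REM /E includes subfolders, /C continues on errors, /Y suppresses
REM overwrite prompts, /H includes hidden files, /K keeps attributes.
XCOPY \\FILESERVER\Shared D:\ServerBackup\ /D /E /C /Y /H /K
```

Put a line like that in a batch file and schedule it daily/weekly/monthly per machine, depending on how fresh you want each copy to be.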
 
Jun 2, 2020
I have an older thin client repurposed as a file server, connected to 3x external drives. One drive is shared, and every 15 minutes the new files on that drive are copied to the other two drives. This gives us 3x copies very nearline.

This is exactly what I was looking for. I have decided to partition the SSD in the file server PC so that all the partitions on that drive are shared over the network.
I will install a Hard Disk in the 2nd PC.
You said that new files on that drive are copied to the other two drives.
How do I do that automatically?
Will it copy only new files, or also modifications to existing files?
What if a file is open and in use when this 15-minute sync happens?

For now I have decided not to use the cloud, as it is a monthly subscription. So I will keep a backup on the 2nd PC's hard drive, and will buy an external hard drive and back up to it weekly. And by the end of this year, as @USAFRet suggested, I will install a NAS, which would be a better idea. I have decided to buy a WD Blue 7200RPM HDD and a Seagate Backup Plus external HDD. Would they do the job?

Also, to achieve this 15-minute periodic sync, do I need any special software? If there are 100 files, and within those 15 minutes I didn't add any new files but just modified a few, will the modifications be reflected?
Thank You for your help.
 

USAFRet

Titan
Moderator
Macrium Reflect will do that periodic backup/update, even if a file is "open" at the time.
Whatever schedule you set.
It can keep however many iterations or days you want, or purge when the target drive gets XX% full.

There are other tools that will also do it.
 
I use a batch file with xcopy and the switches /f/r/e/s/h/k/y/d/c/v. The batch file starts automatically when the system boots, waits for all the USB drives to come online, and then copies newer files from the source drive to the other two. This is my batch file:
Code:
@ECHO OFF
REM WAIT FOR DRIVES TO ATTACH AT FIRST STARTUP (~90 SECONDS)
PING 127.0.0.1 -n 90 > NUL
:START
IF EXIST COPYERR.LOG DEL COPYERR.LOG
REM COPY NEWER FILES FROM THE SHARED DRIVE TO THE TWO BACKUP DRIVES
XCOPY E:\ F:\ /F/R/E/S/H/K/Y/D/C/V
IF ERRORLEVEL 1 ECHO F-ERROR >> COPYERR.LOG
XCOPY E:\ G:\ /F/R/E/S/H/K/Y/D/C/V
IF ERRORLEVEL 1 ECHO G-ERROR >> COPYERR.LOG
REM WAIT ~1000 SECONDS (~16.7 MINUTES) BETWEEN PASSES
PING 127.0.0.1 -n 1000 > NUL
GOTO START
This batch file loops infinitely.

It copies new and modified files; if a file is moved from one location to another, it is considered new and is copied again in the new place (the old copy is not deleted). If you want more of a 'sync' operation, where all the drives are exactly the same, then I would use robocopy instead. I actually have a different batch file that uses robocopy to sync files to 4x NAS units, one across the country in another state. But that script runs only nightly due to how many hours it takes to execute.
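For reference, a minimal robocopy version of the same idea might look like this (the drive letters and log path are example values, not the ones from my actual script):

```bat
@ECHO OFF
REM Sketch of a robocopy mirror -- E:, F:, and the log path are assumed
REM examples. /MIR mirrors the whole tree, INCLUDING deleting files at
REM the destination that were deleted at the source, so double-check
REM the paths before running it. /R:2 /W:5 limit the retries and wait
REM time on locked files, and /LOG+: appends a run log.
ROBOCOPY E:\ F:\ /MIR /R:2 /W:5 /LOG+:C:\robocopy.log
```

Note the difference from xcopy: /MIR makes the destination an exact replica, so a file deleted (or ransomware-encrypted) at the source disappears from the mirror on the next pass--which is exactly why you still want offline copies.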

If a file is open, it is usually not copied, since the system 'locks' access to it while another user has it open. That's usually not a problem, since we don't keep files open overnight, so they get copied at some point in the day. And you can adjust the 15 minutes to whatever you want: the 1000 in the second-to-last line of the batch file is 1000 seconds (technically 16.67 minutes), and you can change it to any number of seconds you want.
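As an alternative to the infinite PING loop, Windows Task Scheduler can launch the batch file on a fixed interval. A sketch, run from an elevated prompt (the task name and script path here are made-up examples):

```bat
REM Create a scheduled task that runs the copy script every 15 minutes.
REM "SharedDriveCopy" and C:\backup\copy.bat are assumed example names,
REM not the actual task or script from this thread.
SCHTASKS /CREATE /TN "SharedDriveCopy" /TR "C:\backup\copy.bat" /SC MINUTE /MO 15
```

If you go this route, drop the :START/GOTO loop and the final PING delay from the script, since the scheduler handles the repetition.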

If you are planning to buy an external drive for backup, and since your data set is small, I would highly recommend an external SSD. Hard drives are delicate, and one drop can render them useless--a likely scenario if a drive is being shuffled around weekly. I would also buy 2x drives, keeping one off-site and just swapping them weekly. You can even integrate these into the 15-minute backups so that you literally just have to swap drives each week.

I would not invest in a new NAS. Today's NAS units are overly complex machines that are basically computers--so now there's another device to worry about getting hacked. Better to either keep a computer dedicated to simple file serving, or find an older NAS that was never designed to be connected to the outside world or to stay connected to the Internet. I have older NAS units like that, specifically the Intel SS4200-E, WD My Cloud EX2 (not Ultra), and Netgear ReadyNAS Ultra 6 (not 6 Plus). These units are limited to smaller drive sizes (not a problem for you), and also cannot saturate a gigabit network link (also probably not a problem for you). But they are perfect as reliable, no-nonsense storage appliances with some redundancy built in. They're also cheap: any of these can be had for under $100 (though it will take some searching). I also have a Synology unit, and it's a completely different animal--I feel like I'm 'logging in' to a computer, and I sometimes wonder what the hell it is transferring over the network when I'm not using it. (None of my simple NAS units have me guessing like this.) In your use case, though, a dedicated storage appliance doesn't make sense when a PC can do the same job without drive size or performance limitations.

As far as drives go, if your data matters, only get top-tier enterprise storage--WD Gold, HGST, Seagate Exos--basically any drive that is 7200rpm and comes with a 5-year warranty and a 2.5M-hour MTBF spec. These drives are usually much more expensive, but their reliability is second to none.

The batch file basically just runs the xcopy command every 15 minutes; it's no different than running the command manually. And that interval is just when the command starts--it is not how long a copy may take. It takes time to scan every single file for a change and then copy it. By the time the process is done, it has probably been at least 30 minutes--even longer if a lot of data was copied.