My backup strategy has changed over the years and I will share my horror story so you can see why.
Back in 2020 I thought I had a great backup solution. I set up a home server with two separate RAID cards and two separate JBOD enclosures.
I used JBOD 1 for my primary data and mirrored it to JBOD 2. I also had an rsync backup running over the network each night to another server's JBOD.
Of course, my rack was installed in the worst possible spot, positioned so that at the time I could not fully access the back of it.
I was using a VRTX enclosure, which, for those who are not familiar, runs (4) blades and has multiple PCIe slots for passthrough. The enclosure has a feature where individual PCIe slots can be powered off so cards can be removed or added without taking down the other running systems.
My first RAID card was throwing errors, so I slid the chassis out to replace it. Reaching behind it, I thought I was unplugging the SFF cables from that card. Suddenly my phone started chiming that RAID 2 was offline. I looked closer and saw that one of the SFF cables I had unplugged belonged to the wrong RAID card. In a panic, I shut down the second blade, re-plugged the SFF cable, and rebooted the machine.
That's when things took a horrible turn. RAID 2 was now offline and degraded. All data was unreadable and two drives were offline. Unfortunately, they were from the same mirror.
With RAID 1 already down for the card replacement, I assured myself that everything would be OK. I installed the new RAID card and booted the server. On boot, two drives would not spin up. My panic returned as RAID 1 was now offline and degraded too. I went into the controller again to see... yes, both drives from the same mirrored pair. Now realizing I had just lost both of my arrays, I logged into my third backup only to find I had not updated the keys and it had been backing up 0 KB archives...
That panic turned to horror as I realized I had just lost 4 years of my kids' photos. Multiple TB of data gone in an instant, but the only thing that could not be rebuilt or replaced was what mattered: anything not backed up to Google Photos was gone forever.
Learn from my lesson and plan appropriately.
Now, my strategy is this:
Snapshots are great, but I use them only for quick "oops, I need to restore something" moments. I do not treat them as a full backup because too many things can go wrong.
The old 3-2-1 backup strategy is great, but it is becoming outdated. I would now consider it the BARE minimum of a good backup plan: (3) copies of your data, (2) onsite and (1) offsite. I would go further and say the (2) onsite copies need to be on FULLY SEPARATE HARDWARE. My current backup plan is 3-3-2-1: (3) backups, (3) separate machines, (2) locations, (1) local copy/snapshot on the host machine for quick restores. Also, RAID is not a backup; it's just one way to bring fault tolerance to a copy of your data.
NEVER consider a backup on the same hardware to be an actual backup. When your RAID card dies, or the array holding your primary data degrades, or you accidentally unplug the wrong cable... everything in that server goes poof; it's all gone. I now consider a copy of your data on the same server a convenience, not a backup.
If you do not have the ability to do this, then I suggest:
- Create a copy of your data locally on your webhost.
- Use a cheap NUC at home to pull a daily backup; use a USB drive for extra space if needed (see the sketch below this list).
- Create a cheap VM at a low-cost host to store a minimum of 3-7 days of backups offsite.
This ensures that even if you forget about it and one of the 3 backups stops working, you still have 2 backups.
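To make the NUC piece concrete, here is a minimal sketch of what that nightly pull could look like, written in Python around rsync. The hostname, remote path, and mount point below are placeholders, and it assumes SSH key auth to the webhost is already set up; run it from cron each night.

```python
#!/usr/bin/env python3
"""Nightly pull of the webhost's local backup copy down to a home NUC.

HOST, REMOTE_DIR and DEST are placeholders -- swap in your own webhost,
its backup directory, and the NUC's USB drive mount point.
"""
import subprocess
import sys
from datetime import date
from pathlib import Path

HOST = "user@example-webhost.com"   # assumes SSH key auth is already set up
REMOTE_DIR = "/home/user/backups/"  # the copy your webhost-side job writes
DEST = Path("/mnt/usb-backup")      # USB drive attached to the NUC

def pull_backup() -> int:
    # One dated folder per night, e.g. /mnt/usb-backup/2024-05-01
    target = DEST / date.today().isoformat()
    target.mkdir(parents=True, exist_ok=True)

    # --archive keeps permissions/timestamps, --delete mirrors removals,
    # --link-dest hard-links unchanged files against the previous night's
    # folder, so each dated folder is a full restore point at low disk cost.
    previous = sorted(p for p in DEST.iterdir() if p.is_dir() and p != target)
    cmd = ["rsync", "--archive", "--delete", "--compress"]
    if previous:
        cmd.append(f"--link-dest={previous[-1]}")
    cmd += [f"{HOST}:{REMOTE_DIR}", str(target)]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    sys.exit(pull_backup())
```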
One more thing: periodically check your backups. Nothing is worse than needing your backups only to find the archives are 0 KB and have been failing to log in for the last month because you changed your password or your keys. Write a script to check the file sizes of the backups and have it email you if the sizes change by more than a set amount. I have mine set to email me if a backup is 5% smaller or larger than the previous backup, or if it is 0 KB.
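For reference, a rough version of that check in Python. The paths, the 5% threshold, and the mail addresses are placeholders; it assumes the dated backup folders from the sketch above and a reachable SMTP server. Schedule it right after the backup job.

```python
#!/usr/bin/env python3
"""Sanity-check the latest backup and email an alert if it looks wrong.

Paths, threshold and mail settings are placeholders. Assumes a local MTA
(or other SMTP server) is listening on port 25.
"""
import smtplib
from email.message import EmailMessage
from pathlib import Path

BACKUP_DIR = Path("/mnt/usb-backup")  # where the dated backup folders land
THRESHOLD = 0.05                      # alert if size swings by more than 5%
MAIL_FROM = "backups@example.com"
MAIL_TO = "you@example.com"
SMTP_HOST = "localhost"

def alert(subject: str, body: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = MAIL_FROM
    msg["To"] = MAIL_TO
    msg.set_content(body)
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

def dir_size(path: Path) -> int:
    # Total size in bytes of every file under the backup folder.
    return sum(f.stat().st_size for f in path.rglob("*") if f.is_file())

def check() -> None:
    # The two most recent dated folders (named YYYY-MM-DD, so they sort).
    runs = sorted(p for p in BACKUP_DIR.iterdir() if p.is_dir())
    if len(runs) < 2:
        return
    latest, previous = dir_size(runs[-1]), dir_size(runs[-2])
    if latest == 0:
        alert("Backup is 0 KB",
              f"{runs[-1]} is empty -- check your credentials or keys.")
    elif previous and abs(latest - previous) / previous > THRESHOLD:
        alert("Backup size changed by more than 5%",
              f"{runs[-2]}: {previous} bytes -> {runs[-1]}: {latest} bytes")

if __name__ == "__main__":
    check()
```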
In the end, your backup plan should match the value of your data.
Can you afford to live without that data? Could you rebuild that data for less than the backup system costs? Are you backing up files that can easily be found and redownloaded from the internet?
Backup, Backup, Backup.