Introduction
Imagine this: you’ve spent countless hours building and fine-tuning your homelab, creating the perfect digital sanctuary for your experiments and projects. It houses critical databases, sensitive data, and maybe even backups of services you can’t afford to lose.
Now, think about what would happen if disaster struck - a hard drive failure, a power surge, or even a simple human error. That’s why backups in a homelab are not just a good idea; they’re a necessity.
A while back I asked a question on Reddit about self-hosting password managers, and the general consensus was that if you are self-hosting one on-premises, you absolutely cannot compromise on taking backups.
That’s why I started exploring options for taking off-site backups of my homelab data. In this blog, I’m going to share with you how I ensure the safety of my homelab’s essential data using off-site cloud storage, and the tools that make it all possible.
My Backup Strategy
There are essentially two pieces to my backup strategy:
- Local backups - I take local backups of my homelab data on a regular basis. This includes my databases, important files, and even my password manager’s database.
- Off-site backups - For off-site backups I use rclone ↗️ to sync my local backups to cloud storage (Google Drive currently).
Let’s break each of these down in more detail.
Local Backups
Taking local backups is the first step in my backup strategy. For any data in my homelab that needs regular backups, I first produce a backup artifact, for example a database dump or a service’s backup file, and then move it to a local folder (`~/backups` in my case).
So, in simple terms, the first step is to move your backups into a source folder, which we will then sync to the cloud. How you push your backups to this folder is up to you. You can use a cron job, a systemd timer, or even a simple bash script.
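For example, a tiny script like the following, run from cron or a systemd timer, would drop a dated database dump into the source folder. The database name and paths here are placeholders; swap in whatever you actually need to back up.

```bash
#!/usr/bin/env bash
# Dump a (hypothetical) PostgreSQL database into the local backups folder,
# compressed and tagged with today's date.
pg_dump mydb | gzip > "$HOME/backups/mydb-$(date +%F).sql.gz"
```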
Off-site Backups
Once I have my local backups in place, I use rclone ↗️ to sync them to my cloud storage. I use Google Drive for this purpose, but you can use any cloud storage provider ↗️ that rclone supports.
Rclone is an awesome tool that lets you sync files and directories to and from cloud storage providers. It supports a wide range of cloud storage providers, including Google Drive, Dropbox, Amazon S3, and more.
I have a detailed guide on how you can set up and use rclone to sync your backups to cloud storage.
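The short version, assuming you end up naming the remote `gdrive`, looks roughly like this:

```bash
# Walk through rclone's interactive setup and create a Google Drive remote
rclone config

# Verify the remote works by listing its top-level directories
rclone lsd gdrive:
```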
Automating the Backup Process
Now that we have our backup strategy in place, let’s automate the process of taking backups and syncing them to the cloud.
This is what the bash script should do:
- On startup, sync everything from the backups source folder to the cloud storage.
- Every time something changes in the backups folder, sync that file/folder to the cloud storage.
The first part is pretty simple with rclone. We can use the `rclone copy` command to copy everything from the source folder to the cloud storage.
Note that I used the `copy` command instead of the `sync` command. This is because I want to keep a copy of all my backups in the cloud, even if I delete them from my local machine.
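As a one-off, that initial upload is a single command. The remote name and paths below are placeholders for your own setup:

```bash
# copy uploads new and changed files but never deletes anything on the remote
rclone copy ~/backups gdrive:homelab-backups

# sync, by contrast, would mirror local deletions to the remote as well,
# which is exactly what I want to avoid for backups
# rclone sync ~/backups gdrive:homelab-backups
```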
To tackle the second part, I use the `inotifywait` tool, which comes from the `inotify-tools` package. This tool lets you monitor file system events, such as when a file is created, modified, or deleted.
With this, I can monitor the backups folder for any changes, and then sync the changed file/folder to the cloud storage.
Here’s the script that I am using currently:
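Below is a minimal sketch of it; the `gdrive:homelab-backups` remote is a placeholder, so set `SOURCE_DIR` and `DESTINATION_DIR` to match your own setup.

```bash
#!/usr/bin/env bash
set -euo pipefail

# Adjust these two variables for your setup
SOURCE_DIR="$HOME/backups"
DESTINATION_DIR="gdrive:homelab-backups"   # rclone remote:path (placeholder)

# 1. On startup, copy everything already in the backups folder to the cloud.
#    copy (not sync) means local deletions never propagate to the remote.
rclone copy "$SOURCE_DIR" "$DESTINATION_DIR"

# 2. Watch the folder and re-run the copy whenever a file is created,
#    finished being written, or moved into the folder.
inotifywait -m -r -e close_write -e create -e moved_to --format '%w%f' "$SOURCE_DIR" |
while read -r changed_path; do
    echo "Change detected: $changed_path"
    rclone copy "$SOURCE_DIR" "$DESTINATION_DIR"
done
```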
Install the `inotify-tools` package using your package manager, and then save the above script as `rclone-backup.sh` in your preferred location. Make sure to change the `SOURCE_DIR` and `DESTINATION_DIR` variables to match your setup.
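On a Debian or Ubuntu based system, for example, that would be something like the following (the script path is just an example):

```bash
# Install the file-system watching tools and make the script executable
sudo apt install inotify-tools
chmod +x /home/user/scripts/rclone-backup.sh
```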
Now, we need to make sure that this script runs on startup, and also restarts automatically if it crashes. For this, we can use systemd to run the script as a service.
Create a new service file at `/etc/systemd/system/rclone-backup.service` with the following contents:
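A minimal unit along these lines should work; the script path and username below are placeholders, not the exact values from my setup:

```ini
[Unit]
Description=Sync homelab backups to cloud storage with rclone
After=network-online.target
Wants=network-online.target

[Service]
# Path to the backup script and the user it should run as (adjust both)
ExecStart=/home/user/scripts/rclone-backup.sh
User=user
Restart=on-failure
RestartSec=10

[Install]
WantedBy=multi-user.target
```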
Make sure to replace the `ExecStart` and `User` values with the correct path to the script and your username, respectively.
Now, enable the service and start it:
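```bash
# Reload systemd so it picks up the new unit, then enable it at boot and start it now
sudo systemctl daemon-reload
sudo systemctl enable --now rclone-backup.service
```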
That’s it! Now, your backups will be automatically synced to the cloud whenever you make any changes to the backups folder 🥳. You can also check the status of the service using the following command:
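```bash
systemctl status rclone-backup.service
```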
Conclusion
The importance of backups cannot be overstated. They are the only thing that stands between you and a complete loss of data. That’s why it’s important to have a solid backup strategy in place, and to make sure that you’re taking regular backups of your data.
I hope this blog post has given you some ideas on how you can safeguard your homelab’s data using off-site backups. If you have any questions or suggestions, feel free to reach out to me on Twitter ↗️.
Until next time, happy hacking! đź‘‹