Introduction
In Linux, we often need to do things that put our data at risk of being lost, so it's always a good idea to take systematic backups from time to time.
Today we will learn how to use the powerful tar tool in Linux to help with the backup process.
Tar Command
Let's take a quick look at the tar command and the flags that we will be using with it.
tar -zcvpf /[Backup_Location]/[Backup_Filename] /[User_Home_Directory_Location]
Now let's understand these flags.
- z : Compress the backup file with 'gzip' to reduce its size.
- c : Create a new backup archive.
- v : Verbosely list the files being processed.
- p : Preserve the permissions of the files put in the archive for later restoration.
- f : Use the given archive file or device ARCHIVE.
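Since the -p flag exists to preserve permissions for later restoration, here is a minimal sketch (not from the original command reference, so treat it as an assumption) of how such an archive could be extracted with the matching flags:
## Restore the archive to its original location, preserving permissions
## (careful: this overwrites existing files; the placeholders are the same ones used above)
tar -zxvpf /[Backup_Location]/[Backup_Filename] -C /
Here -x replaces -c so tar extracts instead of creating, and -C / tells it to unpack relative to the root of the filesystem.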
Usage
Let us now back up the home directory; in my case, the username is akash.
tar -zcvpf /backup/akash-backup-$(date +%d-%m-%Y).tar.gz /home/akash
This will produce the following output (shown for the /backup directory):
ls -lh /backup
total 15G
-rw-r--r--. 1 root root 15G Mar 1 12:09 akash-backup-01-03-2021.tar.gz
If you wish to exclude a directory from being archived, you can use the --exclude flag, like this:
tar --exclude='/home/akash/Documents/test-folder' -zcvpf /backup/akash-backup-$(date +%d-%m-%Y).tar.gz /home/akash
This will archive everything except the test-folder inside Documents.
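If you want a quick sanity check that the archive really contains what you expect, one simple (hedged) option is to list its contents without extracting anything, using the archive name from the listing above:
## List the contents of the backup without extracting it
tar -ztvf /backup/akash-backup-01-03-2021.tar.gz | less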
And that's it; this will create a backup and store it, named with the date, inside the /backup directory.
Bonus
Doing this once is fine, but what if there were a way to regularly back up your content and automatically wipe out the old backups? Wouldn't that be great? So let's do just that.
We will use a shell script and cron job to automate this task for us.
Backup Script
First, let's create the script in a suitable directory (create /opt/scripts first if it doesn't already exist).
nano /opt/scripts/home-dir-backup.sh
Now copy the script below; it is pretty straightforward. We use tar to compress and create a backup file under $BACKUP_DIR and use the find command to delete old backups.
#!/bin/bash

DATE=$(date +%d-%m-%Y)
BACKUP_DIR="/backup"

## To backup akash's home directory
tar -zcvpf $BACKUP_DIR/akash-$DATE.tar.gz /home/akash

## To delete files older than 15 days
find $BACKUP_DIR/* -mtime +15 -exec rm {} \;
Set executable permissions for this file.
chmod +x /opt/scripts/home-dir-backup.sh
Now your script is ready to be used.
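Before handing the script over to cron, it may be worth running it once by hand; a small sketch, assuming the /backup directory may not exist yet:
## Make sure the backup destination exists (assumption: it may not yet)
sudo mkdir -p /backup

## Run the script manually and confirm a dated archive shows up
sudo /opt/scripts/home-dir-backup.sh
ls -lh /backup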
Cron Job
The final step is to set up the cron job to run this script automatically. We will open the crontab using
crontab -e
Inside it, paste the following line at the bottom.
0 12 * * 5 /opt/scripts/home-dir-backup.sh
This will run the script every Friday at 12:00. To learn more about how to configure these values, I would highly recommend using crontab.guru.
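If you would also like a record of each run, one common variation (the log path here is just an assumption, not something from the article) is to redirect the script's output in the crontab entry:
## Same schedule, but append stdout and stderr to a log file
0 12 * * 5 /opt/scripts/home-dir-backup.sh >> /var/log/home-dir-backup.log 2>&1
You can confirm the entry was saved with crontab -l.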
Hope you found this helpful, see you in the next one.