Self-Host Vaultwarden with Scheduled Backups

πŸ“† Β· ⏳ 5 min read

Introduction

I finally self-hosted Vaultwarden in my homelab. I am a happy Bitwarden user and have never had issues with their service; however, being a curious homelabber, I wanted to self-host my own password manager as well.

This led me to research self-hosted solutions, and whether it is even a good idea to self-host a password manager in the first place. With the backup solution I now have in place, I am much more confident in the setup, and it has been working great for the past few weeks.

In this guide, I will share how I self-hosted Vaultwarden and the backup solution that I have in place.

Self-Host Vaultwarden

Vaultwarden ↗️ is an unofficial Bitwarden-compatible server written in Rust. It is not a fork of the official Bitwarden server but an alternative implementation of its API, much lighter on resources, which makes it ideal for self-hosted deployments.

This is the compose file that I am using to run Vaultwarden:

version: '3.3'

services:
  vaultwarden:
    image: vaultwarden/server:latest
    container_name: vaultwarden
    volumes:
      - ./data:/data
    env_file:
      - ./.env
    ports:
      - 4400:80
    restart: unless-stopped

This is pretty straightforward: we use the latest Vaultwarden image and mount a local data directory into the container. I am using a .env file to store the environment variables, which looks like this:

ADMIN_TOKEN=super_secret_text # this is the admin token that you will use to login to the admin panel
DOMAIN=https://vault.satoru.local # this is the domain that I am using for Vaultwarden
SIGNUPS_ALLOWED=true # turn this to false after you have created your account
WEBSOCKET_ENABLED=true
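For the ADMIN_TOKEN, any long random string works; one way to generate one (assuming openssl is available on your machine) is:

```shell
# Generate a long random value to use as ADMIN_TOKEN
openssl rand -base64 48
```

Vaultwarden's wiki also describes storing an Argon2 hash of the token instead of the plain value, which is worth a look if you want to harden the admin panel further.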

Once you have the compose file and the .env file ready, you can run the container using the following command:

docker-compose up -d

Once the container is running, you can access the vault at https://your-domain:4400. You will be asked to create an account, after which you can start using it. Once you have created your user, I suggest setting SIGNUPS_ALLOWED to false in the .env file if you don't want to allow more users to sign up.
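Flipping that flag can also be scripted. Here is a small sketch (the disable_signups helper is mine, not part of Vaultwarden) that rewrites the variable in a .env file in place:

```shell
# Hypothetical helper: set SIGNUPS_ALLOWED=false in the given .env file,
# replacing the whole line (including any trailing comment)
disable_signups() {
  sed -i 's/^SIGNUPS_ALLOWED=.*/SIGNUPS_ALLOWED=false/' "$1"
}
```

After running `disable_signups ./.env`, recreate the container (for example with docker-compose up -d --force-recreate) so the change takes effect — env_file values are only read when the container is created.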

SSL for Vaultwarden

One important thing to note here is that Vaultwarden requires HTTPS to work, so you will need to set up SSL for it. I am using self-signed certificates in my homelab, and I have a detailed blog post about setting up self-signed certificates in your homelab, so make sure to check that out.
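For reference, a minimal reverse-proxy sketch in front of the container might look like the nginx config below. This is an assumed example, not my exact setup: the certificate paths are placeholders, and the map block belongs in the http context of your nginx config.

```nginx
# Pass websocket upgrade requests through correctly (http context)
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      "";
}

server {
    listen 443 ssl;
    server_name vault.satoru.local;

    # Assumed paths to your self-signed certificate and key
    ssl_certificate     /etc/ssl/certs/vault.satoru.local.crt;
    ssl_certificate_key /etc/ssl/private/vault.satoru.local.key;

    location / {
        proxy_pass http://127.0.0.1:4400;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto https;
        # Needed for Vaultwarden's live sync notifications
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $connection_upgrade;
    }
}
```

Adapt this to whatever proxy and certificates you actually use.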

Backup Vaultwarden

Now that we have Vaultwarden running, we need to make sure that we have a backup solution in place in case of any disaster. I would personally put passwords in the Tier 1 list of data that I cannot afford to lose, so I want to make sure that I have a good backup solution in place.

The 3-2-1 backup rule is a good rule to follow, which means that you should have 3 copies of your data, 2 of which are local but on different mediums (e.g. disk and tape) and at least 1 copy offsite.

To handle the offsite backup, I am using the same strategy that I mentioned in this blog post: How I Safeguard Essential Data in My Homelab with Off-site Backup on Cloud.

So the first step is to gather the data that we want to back up, which is the data directory we mounted into the container. If you inspect the data directory, you will see the following structure:

.
β”œβ”€β”€ attachments
β”œβ”€β”€ icon_cache
β”œβ”€β”€ sends
β”œβ”€β”€ tmp
β”œβ”€β”€ db.sqlite3
β”œβ”€β”€ db.sqlite3-shm
β”œβ”€β”€ db.sqlite3-wal
β”œβ”€β”€ rsa_key.pem
└── rsa_key.pub.pem

The db.sqlite3 file is the SQLite database that contains all the data we want to back up. The rsa_key.pem and rsa_key.pub.pem files are the private and public keys used to sign the login session tokens — the vault data itself is encrypted with your master password on the client side.
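As an aside, if you ever want a consistent copy of just the database without stopping the container, SQLite's online backup command can do that. A sketch, assuming the sqlite3 CLI is installed on the host (the snapshot_db helper is mine):

```shell
# Hypothetical helper: take a consistent snapshot of a live SQLite database
# using SQLite's ".backup" command (safe even while the file is being written)
snapshot_db() {
  sqlite3 "$1" ".backup '$2'"
}

# Example: snapshot_db ./data/db.sqlite3 /tmp/db-snapshot.sqlite3
```

I still prefer stopping the container and archiving the whole directory, since that also captures the attachments and key files in one go.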

The official backup docs from Vaultwarden ↗️ describe which of these files and folders are essential to back up, and which are recommended or optional.

However, what I am doing is backing up the entire data directory: I create a tar archive of it, encrypt the archive using gpg, and then upload it to my cloud storage.

Here is what the script looks like. I am excluding the gpg part since that is specific to my setup, but you can use the same script and add that part, or use any other encryption method you prefer.

#!/bin/bash
# Exit immediately if any command fails
set -e

DATE=$(date +%Y-%m-%d)
BACKUP_DIR="$HOME/backups/vaultwarden"
BACKUP_FILE="vaultwarden-$DATE.tar.gz"
CONTAINER=vaultwarden
CONTAINER_DATA_DIR="$HOME/dev/services/container-data/vaultwarden"

# Create the backups directory if it does not exist
mkdir -p "$BACKUP_DIR"

# Stop the container so the SQLite database is not written to mid-archive
/usr/bin/docker stop "$CONTAINER"

# Archive the Vaultwarden data directory into the backup directory
tar -czf "$BACKUP_DIR/$BACKUP_FILE" -C "$CONTAINER_DATA_DIR" .

# Restart the container
/usr/bin/docker restart "$CONTAINER"

# Delete backups older than 30 days
find "$BACKUP_DIR" -type f -name 'vaultwarden-*.tar.gz' -mtime +30 -exec rm {} \;

The script is pretty self-explanatory: it stops the container, creates a tar archive of the data directory, restarts the container, and then deletes backups older than 30 days. Stopping the container first ensures the SQLite database (including its -wal and -shm files) is in a consistent state when archived.
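Restoring is just the reverse. A minimal sketch (the restore_backup helper is mine; paths follow this guide's layout — stop the container before running it and start it again after):

```shell
# Hypothetical helper: extract a backup archive back into the data directory
restore_backup() {
  mkdir -p "$2"
  tar -xzf "$1" -C "$2"
}

# Example (with the container stopped first):
#   docker stop vaultwarden
#   restore_backup ~/backups/vaultwarden/vaultwarden-2024-01-01.tar.gz ./data
#   docker start vaultwarden
```

It is worth doing a test restore into a scratch directory once, so you know your backups are actually usable before you ever need them.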

I have hooked this up to a cron job that runs every day; it backs up the data and syncs it to cloud storage.

You can setup a cron job by running the following command:

crontab -e

And then add the following line to the file:

0 0 * * * /path/to/backup/script.sh

This will run the script every day at midnight. Make sure the script is executable by running chmod +x /path/to/backup/script.sh.

Conclusion

I hope this guide was helpful to you and you were able to self-host Vaultwarden along with a backup solution that will periodically backup your data safely.

If you have any questions or suggestions, feel free to reach out to me on Twitter ↗️ / Reddit ↗️.

See you in another one! πŸ‘‹
