Safely Back Up a PostgreSQL Database in Docker / Podman Containers

Introduction

In a previous article we learned how to back up a Postgres database locally using the pg_dump tool. That works great when you are taking backups directly from a local or remote instance.

But what if you are running your database in a container? How do you back it up then? In this article, we will learn how to safely back up a PostgreSQL database running in Docker/Podman containers. We will also learn how to restore the database from the backup.

Prerequisites

A Docker or Podman container running a PostgreSQL database. If you don’t have one, you can create one using the following command:

Note: I am writing all the commands in terms of docker, but you can swap in podman if that is what you use, since the two CLIs are almost identical.

Terminal window
docker run -d --name my_db -e POSTGRES_DB=my_db -e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=postgres postgres
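
You can check that the container is up and running with docker ps, filtering by name:

Terminal window
docker ps --filter "name=my_db"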

Backup PostgreSQL Database

The first thing to do is to stop all the services that connect to this database and perform read/write operations against it. This ensures that the database is in a consistent state when we take the backup.

Terminal window
docker stop my_app

Replace my_app with the name(s) of your application(s) which are connecting to the database.
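
If several applications talk to the database, docker stop accepts multiple container names at once (my_app1 and my_app2 are placeholder names here):

Terminal window
docker stop my_app1 my_app2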

Now we will use the pg_dump tool to take a backup of the database, running it inside the container via the docker exec command.

Terminal window
docker exec my_db sh -c "pg_dump -U postgres my_db > my_db.bak"

Replace my_db with the name of your database. This will create a file named my_db.bak inside the container’s working directory.
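
Note: for larger databases, pg_dump’s custom format (-Fc) produces a compressed dump; keep in mind that a custom-format dump is restored with pg_restore rather than psql. A rough sketch:

Terminal window
docker exec my_db sh -c "pg_dump -U postgres -Fc my_db > my_db.dump"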

Now we have the backup file, but it is inside the container; we need to copy it to our local machine. We can do that using the docker cp command.

Terminal window
docker cp my_db:my_db.bak my_db.bak
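
If you plan to keep more than one backup around, you can timestamp the file name on the host side while copying, for example:

Terminal window
docker cp my_db:my_db.bak "my_db-$(date +%Y%m%d).bak"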

Optionally, we can verify our backup file by reading its first few lines:

Terminal window
head my_db.bak

It should contain something similar to this:

Terminal window
--
-- PostgreSQL database dump
--
-- Dumped from database version 12.12 (Debian 12.12-1.pgdg110+1)
-- Dumped by pg_dump version 12.12 (Debian 12.12-1.pgdg110+1)
SET statement_timeout = 0;
SET lock_timeout = 0;
SET idle_in_transaction_session_timeout = 0;

At this point we have a backup of our database on our local machine. We can now start our application(s) again.

Terminal window
docker start my_app

Restore PostgreSQL Database

A backup is only good if we can actually restore it when something goes wrong. So let’s learn how to restore our database from the backup we just created.

Once again, we will first stop all the services that connect to the database.

Terminal window
docker stop my_app

Now let’s say we have a new container running PostgreSQL, so the first thing to do is create a new database in that container.

Terminal window
docker exec my_db createdb -U postgres my_db
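
If a database with that name already exists in the new container (for example, because it was created via POSTGRES_DB at startup), you may need to drop it first. A small sketch using dropdb’s --if-exists flag:

Terminal window
docker exec my_db dropdb --if-exists -U postgres my_db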

Once we have the database created, we will copy our backup file into the container and start the restore process.

Terminal window
docker cp my_db.bak my_db:my_db.bak

This will copy the backup file into the container; now we will use the psql tool to restore the database.

Terminal window
docker exec my_db sh -c "psql -U postgres -d my_db < my_db.bak"
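
By default, psql keeps going when an individual statement fails. If you would rather have the restore abort on the first error, psql’s ON_ERROR_STOP variable can be set, for example:

Terminal window
docker exec my_db sh -c "psql -U postgres -d my_db -v ON_ERROR_STOP=1 < my_db.bak"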

This will restore the database from the backup file. We can verify that by connecting to the database and checking the tables.

Terminal window
docker exec -it my_db psql -U postgres -d my_db

This will open the psql prompt, where we can check the tables using the \dt command.

Terminal window
my_db=# \dt

This will show us all the tables in the database.

Terminal window
        List of relations
 Schema |   Name   | Type  |  Owner
--------+----------+-------+----------
 public | my_table | table | postgres

We can now start our application(s) again.

Terminal window
docker start my_app

One Script to Rule Them All

Whenever we are working with multiple commands that we are likely to use again and again, it is a good idea to create a script for them.

If you are a long-time reader here, you might have seen that in some other posts we try to condense the steps into a single script. So let’s create a script that takes care of all the steps we learned above.

backup.sh
#!/bin/bash

BACKUP_DIR="$HOME/backups/my_db"
BACKUP_FILE=my_db.bak
DB_CONTAINER=my_db
DB_NAME=my_db
APP_CONTAINERS="my_app1 my_app2"

# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"

# Stop all the application containers
for container in $APP_CONTAINERS; do
  docker stop "$container"
done

# Take a backup of the database
docker exec "$DB_CONTAINER" sh -c "pg_dump -U postgres $DB_NAME > $BACKUP_FILE"

# Copy the backup file to the local machine
docker cp "$DB_CONTAINER:$BACKUP_FILE" "$BACKUP_DIR/$BACKUP_FILE"

# Start all the application containers
for container in $APP_CONTAINERS; do
  docker start "$container"
done
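
Assuming the script is saved as backup.sh in the current directory, make it executable and run it:

Terminal window
chmod +x backup.sh
./backup.sh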

And let’s create another script for restoring the database.

restore.sh
#!/bin/bash

BACKUP_DIR="$HOME/backups/my_db"
BACKUP_FILE=my_db.bak
DB_CONTAINER=my_db
DB_NAME=my_db
APP_CONTAINERS="my_app1 my_app2"

# Stop all the application containers
for container in $APP_CONTAINERS; do
  docker stop "$container"
done

# Create the database
docker exec "$DB_CONTAINER" createdb -U postgres "$DB_NAME"

# Copy the backup file into the container
docker cp "$BACKUP_DIR/$BACKUP_FILE" "$DB_CONTAINER:$BACKUP_FILE"

# Restore the database
docker exec "$DB_CONTAINER" sh -c "psql -U postgres -d $DB_NAME < $BACKUP_FILE"

# Start all the application containers
for container in $APP_CONTAINERS; do
  docker start "$container"
done

This way we can easily back up and restore our database without having to remember the individual commands. Hook this up with a cron job or systemd timer and you have a fully automated periodic backup system.
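
For example, a crontab entry along these lines would run the backup script every day at 2 AM (the path is a placeholder for wherever you saved the script):

Terminal window
0 2 * * * /home/user/scripts/backup.sh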

Once again, this is just a very basic script; you can modify it and go as crazy with it as you want.

For example, what I personally prefer to do is encrypt the backups with my GPG key and then upload them to my cloud storage. This way I can be sure that the backups are safe and secure. But this is optional, and hence I haven’t included it in the script.
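
As a rough sketch, the encryption step could be as simple as the following, where you@example.com stands in for your own GPG key:

Terminal window
gpg --encrypt --recipient you@example.com my_db.bak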

Let me know if you are interested in learning how to do that, and I will write a separate article on it.

What about other databases?

While in this article we learned how to back up a PostgreSQL database in a containerized environment, the fundamental approach remains the same for other databases as well.

So if you are using MySQL, MongoDB, or any other database, you just need to replace the commands with the equivalents for your database and you are good to go.
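
For instance, with MySQL the dump step might look roughly like this (my_mysql is a placeholder container name; mysqldump ships with the official MySQL image):

Terminal window
docker exec my_mysql sh -c 'mysqldump -u root -p"$MYSQL_ROOT_PASSWORD" my_db > my_db.sql'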

To summarise, these are the pointers to keep in mind while taking backups of databases in containers:

  • Stop all the services that connect to the database.
  • Take the backup using the preferred tool for your database.
  • Copy the backup file to the local machine using the docker cp command.
  • Start all the services again.

Conclusion

In this article, we learned how to safely back up a PostgreSQL database in Docker/Podman containers. We also learned how to restore the database from the backup.

While learning this, we also created some handy scripts that we can use to back up and restore our database without worrying about the commands.

I hope you found this article useful. If you have any questions or feedback, please feel free to reach out to me on X / Twitter ↗️.

Until next time 👋.
