AJAY SHRESTHA

Automating PostgreSQL Backups with a Shell Script

Backups serve as a safety net for any application that stores critical data. If you’re running a PostgreSQL database on a Linux server, automating regular backups is essential for disaster recovery and peace of mind.
In this post, we'll walk through a simple yet powerful shell script that:

  • Dumps a PostgreSQL database
  • Compresses the backup
  • Stores it with a timestamp
  • Transfers it to a remote server
  • Keeps only the 10 most recent backups

Why Automate PostgreSQL Backups?

Manually backing up a database is risky. You might forget, or worse, do it incorrectly. Automating the process ensures:

  • Consistency: Backups happen the same way every time.
  • Accountability: Timestamped files provide a history of backups.
  • Security: Offsite backups reduce data loss risk.
  • Efficiency: Old backups are purged automatically.

Prerequisites

Before using this script, make sure:

  • You have a PostgreSQL database running.
  • Your user has sudo access.
  • You can scp to a remote server using SSH keys (no password prompts); a quick key setup sketch follows this list.
  • The target backup directory exists on the remote machine (/home/ubuntu/backups/).
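
If passwordless scp isn't set up yet, the usual approach is to generate an SSH key pair and copy the public key to the remote host. The commands below are a minimal sketch; ubuntu@IP is a placeholder for your own user and server.

# Generate a key pair (skip this if you already have one)
ssh-keygen -t ed25519

# Copy your public key to the remote server so scp works without a password
ssh-copy-id ubuntu@IP

# Confirm the login works and create the remote backup directory
ssh ubuntu@IP "mkdir -p /home/ubuntu/backups"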

The Script

Here’s the complete script that automates your PostgreSQL backups:

#!/bin/sh

# Set timestamp using system's local time
timestamp=$(date +%Y-%m-%d_%H-%M-%S)
backup_dir="/home/ubuntu/backups"
backup_file="${backup_dir}/${timestamp}.psql.gz"

# Dump the PostgreSQL database as the postgres user
# (the redirect runs as the invoking user, so the file in /tmp stays owned by this script's user)
sudo -u postgres pg_dump -O db_name > /tmp/back.psql

# Compress the backup
gzip -f /tmp/back.psql

# Ensure backup directory exists
mkdir -p "$backup_dir"

# Move the compressed backup to the backup directory
mv /tmp/back.psql.gz "$backup_file"

# Copy the backup file to the remote server
scp "$backup_file" ubuntu@IP:/home/ubuntu/backups/

# Retain only the 10 most recent backups
if [ -d "$backup_dir" ]; then
    echo "Backup folder exists."

    cd "$backup_dir" || { echo "Failed to cd into $backup_dir"; exit 1; }

    ls -t *.psql.gz | tail -n +11 | xargs -r rm -f
else
    echo "Backup folder does not exist."
fi


How to Use This Script

  • Replace db_name with your actual database name.
  • Replace IP in the scp line with your remote server’s IP address or hostname.
  • Make the script executable:
chmod +x backup.sh
  • Run it manually or automate it with cron:
crontab -e

# Example for daily backups at 2 AM:
0 2 * * * /path/to/backup.sh
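
After the first manual run (or the first scheduled run), it's worth confirming that a backup actually landed and that the archive isn't corrupt. A quick check, assuming the default /home/ubuntu/backups directory:

# List local backups, newest first
ls -lt /home/ubuntu/backups/*.psql.gz

# Test the integrity of the newest archive without extracting it
gzip -t "$(ls -t /home/ubuntu/backups/*.psql.gz | head -n 1)"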

Script Breakdown

1. Timestamping the Backup

This generates a clean, colon-free timestamp from the system's local time, giving each backup file a unique name.

timestamp=$(date +%Y-%m-%d_%H-%M-%S)
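
With the directory and extension used in the script, a run at 2:00 AM on 15 June 2025 would produce a file named like this (purely illustrative):

/home/ubuntu/backups/2025-06-15_02-00-00.psql.gz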

2. Database Dump and Compression

The script runs pg_dump as the postgres user to export the database, then compresses the result with gzip. The -O (--no-owner) flag omits ownership commands from the SQL dump.

sudo -u postgres pg_dump -O db_name > /tmp/back.psql
gzip -f /tmp/back.psql
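
If you'd rather skip the intermediate file in /tmp entirely, one possible variant (not part of the script above) is to pipe pg_dump straight into gzip and write directly into the backup directory:

# Ensure the target directory exists before redirecting into it
mkdir -p "$backup_dir"

# Stream the dump through gzip; no temporary file is created
sudo -u postgres pg_dump -O db_name | gzip > "$backup_file"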

3. Local and Remote Storage

Backups are first stored locally with a timestamped filename. Then, they're securely copied to a remote server using scp.

mv /tmp/back.psql.gz "$backup_file"
scp "$backup_file" ubuntu@IP:/home/ubuntu/backups/
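
The script doesn't check whether scp succeeded, so a failed transfer (network issue, changed host key) would go unnoticed, especially under cron. A minimal sketch of an exit-status check you could add:

# Abort with a message if the remote copy fails
if ! scp "$backup_file" ubuntu@IP:/home/ubuntu/backups/; then
    echo "Remote copy of $backup_file failed" >&2
    exit 1
fi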

4. Cleaning Up Old Backups

Here, ls -t lists the backups newest first, tail -n +11 selects everything after the tenth file, and xargs -r rm -f deletes those older archives. Only the 10 most recent backups are kept, preventing unnecessary disk usage over time.

ls -t *.psql.gz | tail -n +11 | xargs -r rm -f
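
Note that this retention only runs locally; the remote server keeps accumulating files. If you want the same 10-file policy there, one option (an addition, not part of the original script) is to run the cleanup over SSH:

# Apply the same retention policy on the remote server
ssh ubuntu@IP 'cd /home/ubuntu/backups && ls -t *.psql.gz | tail -n +11 | xargs -r rm -f'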

Enhancing the Script: Cloud Storage Integration (Optional)

While local and remote backups are great, integrating cloud storage can elevate your backup strategy.

# Amazon S3 using the AWS CLI
aws s3 cp "$backup_file" s3://your-s3-bucket-name/backups/

# Google Cloud Storage
gsutil cp "$backup_file" gs://your-gcs-bucket/backups/
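
Both commands assume the relevant CLI is installed and already authenticated (for example via aws configure or gcloud auth login); the bucket names are placeholders. After a run, you can confirm the upload with a quick listing:

# Confirm the backup reached S3
aws s3 ls s3://your-s3-bucket-name/backups/

# Or, for Google Cloud Storage
gsutil ls gs://your-gcs-bucket/backups/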

Backing up your data is not optional; it’s a necessity. With automation in place, you can sleep better knowing your data is safe.
