Using Bash to Automate Database Backups
Automating database backups with Bash is a reliable way to protect your data and enable recovery after a failure. Here's how to set it up for common databases such as MySQL, PostgreSQL, and SQLite.
1. General Steps for Automating Database Backups
- Determine Database Type: MySQL, PostgreSQL, SQLite, etc.
- Install Backup Tools: `mysqldump` for MySQL, `pg_dump` for PostgreSQL; SQLite needs only a file copy.
- Write a Backup Script: specify the database credentials, backup location, and naming conventions.
- Schedule the Script: use `cron` for periodic execution.
- Secure Backups: encrypt sensitive data and restrict access to backup files.
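The steps above can be sketched as a minimal generic wrapper. To keep the sketch runnable anywhere, it uses a temp directory and a plain `cp` of a dummy file as a stand-in for the real dump command; in practice you would substitute `mysqldump`, `pg_dump`, or similar, and a fixed backup path.

```bash
#!/bin/bash
# Generic backup skeleton: timestamped filename, ensured directory, exit check.
set -euo pipefail

SOURCE_FILE="$(mktemp)"                 # stand-in for the database
echo "demo data" > "$SOURCE_FILE"

BACKUP_DIR="$(mktemp -d)"               # in practice: a fixed path like /backups
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="$BACKUP_DIR/demo-$TIMESTAMP.bak"

mkdir -p "$BACKUP_DIR"

# Replace 'cp "$SOURCE_FILE"' with your real dump command.
if cp "$SOURCE_FILE" "$BACKUP_FILE"; then
    echo "Backup successful: $BACKUP_FILE"
else
    echo "Backup failed!" >&2
    exit 1
fi
```

The same skeleton (configuration block, `mkdir -p`, dump, success check) is what the database-specific scripts below fill in.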
2. MySQL/MariaDB Backup Script
Step 1: Install MySQL Utilities
Ensure `mysqldump` is installed:

```bash
sudo apt install mysql-client   # Debian/Ubuntu
sudo yum install mysql          # CentOS/RHEL
```
Step 2: Create the Backup Script
Example:
```bash
#!/bin/bash

# Configuration
DB_USER="root"
DB_PASS="password"            # better: store credentials in ~/.my.cnf
DB_NAME="my_database"
BACKUP_DIR="/backups/mysql"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="$BACKUP_DIR/$DB_NAME-$TIMESTAMP.sql"

# Ensure backup directory exists
mkdir -p "$BACKUP_DIR"

# Perform the backup and verify success
if mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$BACKUP_FILE"; then
    echo "Backup successful: $BACKUP_FILE"
else
    echo "Backup failed!" >&2
fi
```
Step 3: Make the Script Executable
```bash
chmod +x /path/to/mysql_backup.sh
```
Step 4: Schedule with Cron
Edit crontab:

```bash
crontab -e
```

Add a daily backup job (here, 2:00 AM):

```bash
0 2 * * * /path/to/mysql_backup.sh >> /var/log/mysql_backup.log 2>&1
```
3. PostgreSQL Backup Script
Step 1: Install PostgreSQL Utilities
Ensure `pg_dump` is installed:

```bash
sudo apt install postgresql-client   # Debian/Ubuntu
sudo yum install postgresql          # CentOS/RHEL
```
Step 2: Create the Backup Script
Example:
```bash
#!/bin/bash

# Configuration
DB_USER="postgres"
DB_NAME="my_database"
BACKUP_DIR="/backups/postgresql"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="$BACKUP_DIR/$DB_NAME-$TIMESTAMP.sql"

# Ensure backup directory exists
mkdir -p "$BACKUP_DIR"

# Perform the backup (better: store the password in ~/.pgpass)
if PGPASSWORD="password" pg_dump -U "$DB_USER" -d "$DB_NAME" > "$BACKUP_FILE"; then
    echo "Backup successful: $BACKUP_FILE"
else
    echo "Backup failed!" >&2
fi
```
4. SQLite Backup Script
Step 1: Identify SQLite Database File
SQLite databases are typically stored as single `.db` files.
Step 2: Create the Backup Script
Example:
```bash
#!/bin/bash

# Configuration
DB_FILE="/path/to/database.db"
BACKUP_DIR="/backups/sqlite"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="$BACKUP_DIR/database-$TIMESTAMP.db"

# Ensure backup directory exists
mkdir -p "$BACKUP_DIR"

# Perform the backup
# Note: a plain cp is only safe while nothing is writing to the database;
# for a live database, use sqlite3's .backup command for a consistent snapshot.
if cp "$DB_FILE" "$BACKUP_FILE"; then
    echo "Backup successful: $BACKUP_FILE"
else
    echo "Backup failed!" >&2
fi
```
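If the `sqlite3` command-line tool is available (an assumption; it is not always installed alongside the library), its `.backup` command copies the database through SQLite itself, giving a consistent snapshot even while other connections are reading. This self-contained sketch builds a throwaway database in a temp directory and verifies the copy:

```bash
#!/bin/bash
set -euo pipefail

WORK="$(mktemp -d)"
DB="$WORK/app.db"

# Build a tiny demo database
sqlite3 "$DB" "CREATE TABLE t(x); INSERT INTO t VALUES (1);"

# Take a consistent snapshot via SQLite's own backup mechanism
sqlite3 "$DB" ".backup '$WORK/app-backup.db'"

# Sanity-check the copy by querying it
sqlite3 "$WORK/app-backup.db" "SELECT count(*) FROM t;"
```

In a real script you would point `.backup` at the timestamped `$BACKUP_FILE` from the script above.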
5. Compressing and Encrypting Backups
Compress the Backup
To save space, compress the backup file:

```bash
gzip "$BACKUP_FILE"
```
Encrypt the Backup
For security, encrypt backups using `gpg`:

```bash
gpg --encrypt --recipient "your_email@example.com" "$BACKUP_FILE"
```
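Here is a runnable sketch of the compression step against a dummy dump in a temp directory, including an integrity check of the resulting archive. The `gpg` step is left as a comment because it requires a recipient key to be imported first:

```bash
#!/bin/bash
set -euo pipefail

WORK="$(mktemp -d)"
echo "dump contents" > "$WORK/db.sql"    # stand-in for a real dump

gzip "$WORK/db.sql"        # replaces db.sql with db.sql.gz
gzip -t "$WORK/db.sql.gz"  # verify archive integrity (non-zero exit on corruption)
echo "archive OK"

# With a recipient key imported, you would then encrypt:
#   gpg --encrypt --recipient "your_email@example.com" "$WORK/db.sql.gz"
```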
6. Automating with Cron
For all of the examples above, schedule your scripts using `cron`. Example:

```bash
0 3 * * * /path/to/postgresql_backup.sh >> /var/log/postgresql_backup.log 2>&1
```
7. Rotating Backups
To avoid filling up disk space, remove old backups:

```bash
find /backups/mysql -type f -mtime +7 -exec rm {} \;
```

`-mtime +7` matches files older than 7 days. Add this to your script after the backup step.
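The rotation rule can be exercised safely in a temp directory before wiring it into a real script. Note that `touch -d` is GNU coreutils syntax, so this sketch assumes Linux:

```bash
#!/bin/bash
set -euo pipefail

BACKUP_DIR="$(mktemp -d)"

# Simulate one stale and one fresh backup
touch -d "10 days ago" "$BACKUP_DIR/old.sql"
touch "$BACKUP_DIR/new.sql"

# Delete anything older than 7 days
find "$BACKUP_DIR" -type f -mtime +7 -exec rm {} \;

ls "$BACKUP_DIR"   # only the fresh file should remain
```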
8. Uploading Backups to Cloud Storage
For remote storage:

- AWS S3:

  ```bash
  aws s3 cp "$BACKUP_FILE" s3://your-bucket-name/
  ```

- Google Drive: use `rclone`.
- FTP server:

  ```bash
  curl -T "$BACKUP_FILE" ftp://ftp.example.com --user username:password
  ```
9. Monitoring Backup Status
Send email notifications:

```bash
echo "Backup completed successfully at $(date)" | mail -s "Backup Success" admin@example.com
```
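`mail` requires a configured MTA, so one pattern is to wrap the notification in a small function that also appends to a log file; the demo below uses only the log so it runs without mail set up, with the `mail` call left as a comment:

```bash
#!/bin/bash
set -euo pipefail

LOG="$(mktemp)"

notify() {
    # Append to the log; with an MTA configured you could additionally run:
    #   echo "$1" | mail -s "$2" admin@example.com
    echo "$1" >> "$LOG"
}

# Stand-in for the real backup command's exit status
if true; then
    notify "Backup completed successfully at $(date)" "Backup Success"
else
    notify "Backup FAILED at $(date)" "Backup Failure"
fi
```

Reporting failures, not just successes, is what makes the notifications useful for monitoring.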
10. Best Practices
- Test Regularly: verify backups by actually restoring them.
- Secure Credentials: store credentials in `.my.cnf` (MySQL) or `.pgpass` (PostgreSQL) instead of hard-coding them in scripts.
- Automate Cleanup: use rotation policies to delete old backups.
- Log Output: save script logs for troubleshooting.
- Store Offsite: use cloud or remote servers for redundancy.
This approach gives you reliable, automated database backups that are easy to adapt to your own environment.