Automatic local DB backups

It has happened to me twice already: I messed something up with Docker and lost my local development DBs.

With my old MySQL server I had backups of my local databases in Time Machine and could restore a single DB or even a single table from those backup files. With Docker, there is no way to back up the databases with Time Machine.

→ I’d love it if DevKinsta had a built-in option to schedule automatic MySQL DB dumps on an hourly/daily/weekly basis, ideally also during server start/shutdown. For example, as /public/{project}/backup.sql that is overwritten on each interval.


Here’s a quick shell script to set up such backups on the DevKinsta Docker container “devkinsta_fpm” (I’m using this container because it has access to both the DB server and the host file system). How to run it inside the container is shown right after the script:

#!/bin/bash

# 1. Install crontab
apt-get update
apt-get install -y cron
update-rc.d cron defaults # Not sure if this is needed, but it cannot hurt.

# RC does not work on devkinsta_fpm, so:
# 2. Add crontab to the container's autostart via supervisor
cat > "/etc/supervisor/conf.d/supervisord.conf" <<-EOF

[program:cron]
; run cron in the foreground so supervisor can manage the process
command = cron -f
autostart=true
autorestart=true
priority=5
stdout_logfile=/var/log/cron.log
stdout_logfile_maxbytes=0
stderr_logfile=/var/log/cron-error.log
stderr_logfile_maxbytes=0
EOF

# 3. Extract the MySQL credentials from an existing DevKinsta site.
# Luckily all sites use the same DB credentials.
for f in $(find "/www/kinsta/public" -maxdepth 2 -type f -name "wp-config.php"); do
	db_user=$(grep "DB_USER" "$f" | cut -d "," -f2 | cut -d "'" -f2)
	db_password=$(grep "DB_PASSWORD" "$f" | cut -d "," -f2 | cut -d "'" -f2)
	db_host=$(grep "DB_HOST" "$f" | cut -d "," -f2 | cut -d "'" -f2)
	if [ -n "$db_password" ] && [ -n "$db_user" ] && [ -n "$db_host" ]; then
		break
	fi
done
if [ -z "$db_password" ]; then
	echo "Please create a website in DevKinsta before running this script"
	exit 1
fi

cat > /www/kinsta/mysql.conf <<-EOF
[client]
user=$db_user
password=$db_password
host=$db_host
EOF
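
# The credentials file contains the DB password in plain text; restricting
# its permissions is optional but cannot hurt:
chmod 600 /www/kinsta/mysql.conf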

# 4. Create a custom backup script that dumps all DBs into /private/backups/
#    I'd prefer /public/{project}/backup, but could not find a simple way to
#    link a db-name to a public-folder.
backup_script=/www/kinsta/run-backup.sh
cat > $backup_script <<-EOF
#!/bin/bash

list=\$(mysql --defaults-extra-file="/www/kinsta/mysql.conf" -Bse "SHOW DATABASES;")

for db in \${list[@]}; do
	if \
		[ "\$db" == "mysql" ] \
		|| [ "\$db" == "sys" ] \
		|| [ "\$db" == "performance_schema" ] \
		|| [ "\$db" == "information_schema" ]
	then
		continue
	fi

	tstamp=\$(date "+%Y%m%d.%H%M")

	rm -f /www/kinsta/private/backups/\$db.*sql
	mysqldump --defaults-extra-file="/www/kinsta/mysql.conf" --column-statistics=0 \$db > "/www/kinsta/private/backups/\$db.\$tstamp.sql"
done
EOF

# 5. Create backup dir and schedule the custom backup script
mkdir -p "/www/kinsta/private/backups"
echo "0 1,4,7,10,13,16,19,22 * * * bash $backup_script" > tmp_cron
crontab tmp_cron
rm tmp_cron
service cron start &>/dev/null
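
To run the setup script, copy it into the container and execute it there. Assuming it is saved on the host as setup-db-backups.sh (the filename is just an example), something like this should work:

docker cp setup-db-backups.sh devkinsta_fpm:/tmp/setup-db-backups.sh
docker exec -it devkinsta_fpm bash /tmp/setup-db-backups.sh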

The above script sets up a crontab that runs the backup script every three hours:
0 1,4,7,10,13,16,19,22 * * * bash $backup_script → backs up my databases at 1:00, 4:00, 7:00, 10:00, 13:00, 16:00, 19:00 and 22:00. That’s not perfect, but it’s a good starting point for me.

Maybe this helps someone else or serves as inspiration for the new feature in DevKinsta :wink:
