Our web host went down for a couple of hours last night; you couldn't even get to their own site, so it must have been a serious issue. This got me worrying and wondering.
I have a full backup of the site files, both locally on my Mac and in a private repository on GitHub, but were anything to happen to the database, I would currently rely on their nightly backups should I need to recover anything.
But what if they went down and couldn't come back up? What if the backups were corrupted or went missing? 14 years of data would be gone, just like that. What if I could automatically back up the database to online storage such as Dropbox?
I couldn't get to sleep so, in the early hours, a quick search found this and I set about creating my own solution for automated backups.
My host has disabled the exec() function, so I couldn't just cut and paste the solution; instead I resorted to cron jobs for mysqldump and compressing the SQL file. As my database is small, I also removed the functions for uploading the compressed file in chunks as they're not needed.
I created a Dropbox app to access the API but, despite my version being considerably simpler than the original example, errors were being thrown and I had to rewrite parts of the code to get it to work.
And it does!
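For anyone building something similar, here's a minimal sketch of that kind of upload using PHP's cURL extension and the Dropbox API v2 /files/upload endpoint; the access token, paths, and error handling are placeholders rather than my actual script. Handily, the single-request endpoint accepts files up to 150 MB, which is why the chunked-upload functions weren't needed for a small database.

```php
<?php
// Sketch: push a compressed database dump to Dropbox in one request.
// The token and paths below are placeholders.
$accessToken = 'YOUR_DROPBOX_ACCESS_TOKEN';
$localFile   = '/home/user/backups/mydb.tgz';
$remotePath  = '/backups/mydb.tgz';

$args = json_encode([
    'path' => $remotePath,
    'mode' => 'overwrite',   // replace last night's backup
    'mute' => true,          // don't ping Dropbox notifications
]);

$ch = curl_init('https://content.dropboxapi.com/2/files/upload');
curl_setopt_array($ch, [
    CURLOPT_HTTPHEADER => [
        'Authorization: Bearer ' . $accessToken,
        'Dropbox-API-Arg: ' . $args,
        'Content-Type: application/octet-stream',
    ],
    CURLOPT_POST           => true,
    // Fine to read the whole file into memory for a small database
    CURLOPT_POSTFIELDS     => file_get_contents($localFile),
    CURLOPT_RETURNTRANSFER => true,
]);
$response = curl_exec($ch);
$status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($status !== 200) {
    // Log failures so a silent cron job doesn't hide broken backups
    error_log("Dropbox upload failed ($status): $response");
}
```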
I now have three cron jobs that run in sequence, sketched below:
- dump the database to a .sql file
- compress that to a .tgz file
- run the PHP script to upload the compressed file to Dropbox
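The crontab looks something along these lines; the paths, database name, and times are placeholders, with the jobs spaced a few minutes apart so each step finishes before the next begins:

```
# Nightly backup pipeline (paths, names, and times are placeholders)
# 1. Dump the database to a .sql file; credentials live in ~/.my.cnf
#    so the password stays out of the crontab
15 2 * * * mysqldump --defaults-extra-file=/home/user/.my.cnf mydb > /home/user/backups/mydb.sql
# 2. Compress the dump to a .tgz
25 2 * * * tar -czf /home/user/backups/mydb.tgz -C /home/user/backups mydb.sql
# 3. Run the PHP script that uploads the archive to Dropbox
35 2 * * * php /home/user/scripts/dropbox-upload.php
```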
Now, even if the host has a catastrophic failure or data is unrecoverable from their backups, I'll be able to restore the site to within the last couple of posts, and those I can recreate from the RSS feed if needed.
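Restoring would then just be a case of pulling the latest archive down from Dropbox, unpacking it, and replaying the dump, roughly like this (names and paths again placeholders):

```
tar -xzf mydb.tgz                   # unpack the archive to recover mydb.sql
mysql -u dbuser -p mydb < mydb.sql  # replay the dump into the database
```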