I recently joined RunCloud and didn’t like their backup function, so I decided to create my own backup script that puts the files directly on Google Drive!
Update 2020: The old method no longer works. The new backup script, which uses rclone, is below. It’s very quick!
How to backup your files to Google Drive using rclone
I followed this tutorial (https://www.howtogeek.com/451262/how-to-use-rclone-to-back-up-to-google-drive-on-linux/) to set up rclone. When you get to the question “use auto config?”, select No and rclone will provide a Google authentication link which you can copy/paste into your browser to complete the setup.
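Once rclone is configured, it’s worth confirming the remote works before wiring up any backups. A quick sanity check, assuming you named the remote “Google Drive” as the script below does:

# List configured remotes – "Google Drive:" should appear
rclone listremotes

# List the top-level folders in the remote to confirm authentication works
rclone lsd "Google Drive:"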
#!/bin/bash
DATE=$(date +%m-%d-%Y)

# Archive each web application and upload it to Google Drive
for dir in /home/*/webapps/*
do
    base=$(basename "$dir")
    tar -cf "/tmp/${base}-${DATE}.tar" -C "$dir" .
    rclone sync "/tmp/${base}-${DATE}.tar" "Google Drive:My Folder/My Sub Folder"
    rm -f "/tmp/${base}-${DATE}.tar"
done

USER="root"
PASSWORD="<MYSQL_ROOT_PASSWORD>"
databases=$(mysql -u $USER -p$PASSWORD --batch --skip-column-names -e "SHOW DATABASES;" | grep -E -v "(information|performance)_schema")

# Dump, archive, and upload each database
for db in $databases
do
    echo "Dumping database: $db"
    mysqldump -u $USER -p$PASSWORD --databases "$db" > "/tmp/${db}-${DATE}.sql"
    tar -cf "/tmp/${db}-${DATE}.sql.tar" -C /tmp/ "${db}-${DATE}.sql"
    rclone sync "/tmp/${db}-${DATE}.sql.tar" "Google Drive:My Folder/My Sub Folder"
    rm -f "/tmp/${db}-${DATE}.sql" "/tmp/${db}-${DATE}.sql.tar"
done
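A note on scheduling: I save the script as /root/GoogleBackup.sh and run it from cron. A minimal sketch, assuming that path and a daily 4am run (note the leading 0 – “* 4 * * *” would run the script every minute between 4:00 and 4:59):

chmod +x /root/GoogleBackup.sh

# Run it once by hand first to confirm it works
/root/GoogleBackup.sh

# crontab -e entry – daily at 04:00, with output logged
0 4 * * * /bin/bash /root/GoogleBackup.sh >> /var/log/googlebackup.log 2>&1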
** This is the end of the tutorial. The older content, which no longer works, is below.
Install Google Drive to your Linux Server
First, you’ll need to install Google Drive on your Linux machine. This simply installs the drive client – not the sync functions, etc.
wget -O drive https://drive.google.com/uc?id=0B3X9GlR6EmbnMHBMVWtKaEZXdDg
mv drive /usr/sbin/drive
chmod 755 /usr/sbin/drive
Next, you’ll need to start drive and then follow the instructions. Once you’ve done that and linked Google Drive to your Linux server, we’re ready!
drive
Install the Backup Script
OK, so the script below backs up each web application and database into its own .tar.gz file. I run this script once a day during the morning hours.
#!/bin/bash
DATE=$(date +%m-%d-%Y)

# Archive each web application and upload it to the Google Drive folder
for dir in /home/*/webapps/*
do
    base=$(basename "$dir")
    tar -czf "/tmp/${base}-${DATE}.tar.gz" -C "$dir" .
    drive upload -p '<GOOGLE_DRIVE_FOLDER_ID>' -f "/tmp/${base}-${DATE}.tar.gz"
    rm -f "/tmp/${base}-${DATE}.tar.gz"
done

USER="root"
PASSWORD="<MYSQL_ROOT_PASSWORD>"
databases=$(mysql -u $USER -p$PASSWORD --batch --skip-column-names -e "SHOW DATABASES;" | grep -E -v "(information|performance)_schema")

# Dump, compress, and upload each database
for db in $databases
do
    mysqldump -u $USER -p$PASSWORD --databases "$db" > "/tmp/${db}-${DATE}.sql"
    tar -czf "/tmp/${db}-${DATE}.sql.tar.gz" -C /tmp/ "${db}-${DATE}.sql"
    drive upload -p '<GOOGLE_DRIVE_FOLDER_ID>' -f "/tmp/${db}-${DATE}.sql.tar.gz"
    rm -f "/tmp/${db}-${DATE}.sql" "/tmp/${db}-${DATE}.sql.tar.gz"
done
How to make the backup script work
- Replace <GOOGLE_DRIVE_FOLDER_ID> with your Google Drive folder ID, which looks like a long string of random characters. To get it, just double-click the backup folder in Google Drive and copy the ID from the address bar (the URL ends in /folders/<GOOGLE_DRIVE_FOLDER_ID>).
- Once you have set up a server with RunCloud, it provides you with the MySQL root password. Replace <MYSQL_ROOT_PASSWORD> with that password.
That’s it! You can now run the script to make sure it sends the files directly to your selected Google Drive folder.
Delete the files after X days
If you’re looking to delete files from Google Drive after X days, then look no further, because I created that script too.
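If you’re using the rclone method above, you may not need a separate script at all – rclone’s age filters can handle retention. A minimal sketch, assuming the same backup folder as the script above and a 7-day retention window:

# Delete backups older than 7 days; folders are left intact
rclone delete --min-age 7d "Google Drive:My Folder/My Sub Folder"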
Thank you very much for the rclone tutorial!
How do you use rclone to make database backups on RunCloud? Is it possible?
I’m getting the error “Sign in with this app has been temporarily disabled”. Any help?
Hi Lance,
Unfortunately not. I’ve had to switch to rclone – which seems better in my opinion.
I followed a tutorial (link below), with the exception of the “use auto config” part – select No and it will provide you a Google authentication link which you can copy/paste.
https://www.howtogeek.com/451262/how-to-use-rclone-to-back-up-to-google-drive-on-linux/
I will be updating this post soon with more details.
Which command do you use for crontab -e? I put GoogleBackup.sh into the root directory, but when I set this in crontab -e:
* 4 * * * /bin/bash /GoogleBackup.sh
I get an error. :((
Thank you, it’s all working. Could you tell me how to have different folders for each website’s files and database? I’m currently backing up 12 websites, and having 7 days’ worth of data in one folder is overwhelming.
Many thanks
Hi Philip,
Unfortunately, it will require re-coding the script. Google Drive runs on folder IDs, not on creating folders as-and-when. Because of this, you can only upload to one folder. In order to upload to multiple folders, you’d have to rewrite the script so that it checks a username and/or name, matches that to a Google folder ID, and then uses that ID for the upload. More effort than it’s worth.
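Update: that limitation is specific to the old drive tool. rclone creates remote folders on the fly, so per-site folders only need a change to the destination path. A minimal sketch, assuming a remote named “Google Drive”, a Backups parent folder, and DATE set as in the main script:

for dir in /home/*/webapps/*
do
    base=$(basename "$dir")
    tar -cf "/tmp/${base}-${DATE}.tar" -C "$dir" .
    # Each site gets its own subfolder; rclone creates it if it doesn't exist
    rclone copy "/tmp/${base}-${DATE}.tar" "Google Drive:Backups/${base}"
    rm -f "/tmp/${base}-${DATE}.tar"
done

rclone copy is used here rather than sync so previous days’ backups in each folder are kept.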
When I try to run the script from root using “./googlebackup.sh”, it says the following:
-bash: ./googlebackup.sh: /bin/bash^M: bad interpreter: No such file or directory
I’ve tried running it using the cron in RunCloud, but no backups appear. Also, in the “command” line under “add new cron job” in RunCloud, do I just put googlebackup.sh or /googlebackup.sh? (The file is in the root directory.)
This means you’re editing the file on Windows and saving it with Windows line endings. If you’re using Notepad++, in the lower-right corner double-click “Windows (CR LF)”, click “Unix (LF)”, and then save the document again.
Screenshot: https://uploads.disquscdn.com/images/d58bb6e28017e53dd73cc4200538a0e7886f05c2d0fe99a59fd02f969c5fc2bb.jpg
If you’re not using Notepad++ then you’ll have to search for the error and use other techniques.
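The carriage returns can also be stripped from the command line on the server itself. A minimal sketch, assuming the script is at /root/GoogleBackup.sh:

# Remove Windows line endings in place (dos2unix works too, if installed)
sed -i 's/\r$//' /root/GoogleBackup.sh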
What directory do you put the backup script into, and what do you name the script?
I just named it GoogleBackup.sh and placed it in the /root directory. You can name it whatever you like and also place it wherever you like.
Do you know this: https://rclone.org/drive/ ?
🙂 WOW! I will try it. Thanks Steven.