Work is boring; if I don't find something to do I'll fall asleep, so this script was born. Its main functions (doesn't the title say it all?): it automatically backs up the website files and databases and uploads them to FTP space, keeping 3 days of backups locally and 5 days on the remote FTP space. The database backup is also sent to an email address, so the data is backed up twice for safety.

First install the email-sending components:

yum install sendmail mutt

Don't forget to create the local backup directory:

mkdir -p /home/backup

You only need to modify the variables at the top of the script: the MySQL username and password, the mailbox the database backup is sent to, the FTP username, password, and address, the directory on the FTP server where backups are stored (create it on the FTP server first), and the website directory to back up.

A note on size: if your website data is larger than 5 GB but smaller than 10 GB, compressing it will be a bit of a struggle; under 5 GB there is no problem. It all depends on the performance of the VPS, though.
The script is as follows:

#!/bin/bash
# You need to modify the part from here
MYSQL_USER=root                # MySQL username
MYSQL_PASS=123456              # MySQL password
[email protected]          # The mailbox the database backup is sent to
FTP_USER=cat                   # FTP username
FTP_PASS=123456                # FTP password
FTP_IP=imcat.in                # FTP address
FTP_backup=backup              # Directory on the FTP server where backups are stored; create it on the server first
WEB_DATA=/home/www             # Website data to back up
# The part you need to modify ends here

# Names of today's backups and of the old backups to delete
DataBakName=Data_$(date +%Y%m%d).tar.gz
WebBakName=Web_$(date +%Y%m%d).tar.gz
OldData=Data_$(date -d -5day +%Y%m%d).tar.gz
OldWeb=Web_$(date -d -5day +%Y%m%d).tar.gz

# Delete local backups from 3 days ago
rm -rf /home/backup/Data_$(date -d -3day +%Y%m%d).tar.gz /home/backup/Web_$(date -d -3day +%Y%m%d).tar.gz

cd /home/backup
# Export the databases, one compressed file per database
for db in `/usr/local/mysql/bin/mysql -u$MYSQL_USER -p$MYSQL_PASS -B -N -e 'SHOW DATABASES' | xargs`; do
    (/usr/local/mysql/bin/mysqldump -u$MYSQL_USER -p$MYSQL_PASS ${db} | gzip -9 > ${db}.sql.gz)
done

# Pack all the database dumps into a single archive
tar zcf /home/backup/$DataBakName /home/backup/*.sql.gz
rm -rf /home/backup/*.sql.gz

# Send the database archive by email; if the compressed database is too large, comment this line out
echo "Subject: Database backup" | mutt -a /home/backup/$DataBakName -s "Content: Database backup" $MAIL_TO

# Compress the website data
tar zcf /home/backup/$WebBakName $WEB_DATA

# Upload to the FTP space and delete the backups there from 5 days ago
ftp -v -n $FTP_IP << END
user $FTP_USER $FTP_PASS
type binary
cd $FTP_backup
delete $OldData
delete $OldWeb
put $DataBakName
put $WebBakName
bye
END
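The retention scheme above works purely through GNU date's relative-date arithmetic: today's archive names are built from the current date, and the names to purge are built from dates 3 days (local) and 5 days (FTP) in the past. A minimal standalone sketch of that naming logic, runnable without MySQL or FTP:

```shell
#!/bin/bash
# Derive today's archive name plus the names the backup script purges:
# 3-day-old archives locally, 5-day-old archives on the FTP space.
today=$(date +%Y%m%d)
local_purge=$(date -d "-3 day" +%Y%m%d)   # removed from /home/backup
ftp_purge=$(date -d "-5 day" +%Y%m%d)     # removed from the FTP directory
echo "today:       Data_${today}.tar.gz"
echo "purge local: Data_${local_purge}.tar.gz"
echo "purge ftp:   Data_${ftp_purge}.tar.gz"
```

Note that `-d "-3 day"` is a GNU coreutils extension; on BSD or macOS the equivalent would be `date -v-3d`.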
Download address: http://imcat.in/down/AutoBackupToFtp.sh

Download the script and make it executable: chmod +x AutoBackupToFtp.sh
To run the backup automatically, use crontab. Over SSH, run: crontab -e
No such command? See a guide on installing and using crontab on CentOS. Then enter the following line: 00 00 * * * /home/AutoBackupToFtp.sh
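For reference, the five fields of a crontab entry are minute, hour, day of month, month, and day of week, so the entry above fires at 00:00 every day. To run at, say, 03:30 instead, only the first two fields change:

```
# minute hour day-of-month month day-of-week  command
00 00 * * * /home/AutoBackupToFtp.sh
# the same job at 03:30 every day would be:
# 30 03 * * * /home/AutoBackupToFtp.sh
```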
This backs up the website files and databases and uploads them to the FTP space at 00:00 every day.

Reprinted from: http://imcat.in/auto-backup-site-files-database-upload-ftp/

Some additional notes for beginners:
1. How do you save and exit after editing the crontab? Press Esc, then type :wq (crontab -e usually opens vi; Ctrl+C would quit without saving).
2. To check which mysql binary is being run: which mysql
3. To check where MySQL is installed: whereis mysql
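Since the script hard-codes /usr/local/mysql/bin for both mysql and mysqldump, it is worth confirming where your MySQL client actually lives before scheduling the job. A small sketch using command -v, the POSIX equivalent of which:

```shell
#!/bin/bash
# Print the mysql client's full path if it is on PATH; if the path
# differs from /usr/local/mysql/bin, adjust AutoBackupToFtp.sh to match.
mysql_bin=$(command -v mysql || true)
if [ -n "$mysql_bin" ]; then
    echo "mysql client: $mysql_bin"
else
    echo "mysql not on PATH; try: whereis mysql"
fi
```

Unlike which, command -v is built into the shell and works even on minimal systems where which is not installed.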