Automatically backup website files and databases and upload them to FTP space

Before running the script, don't forget to create the local backup directory: mkdir -p /home/backup
If your website data is larger than 5 GB but smaller than 10 GB, compressing it will take a while; under 5 GB there is no problem. In the end it all depends on the performance of the VPS.
————————————————————————–
Work is boring, and if I don't find something to do I fall asleep... so this script was born.
Its main functions (doesn't the title already say it?!): it automatically backs up the website files and databases and uploads them to FTP space, keeps 3 days of backups locally, and keeps 5 days of backups on the remote FTP space.
The database backup is also sent to an Email address, so it exists both in the mailbox and on the FTP space, a double backup to keep the data safe.
First, install the components used to send Email:

yum install sendmail mutt
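Before relying on the script, you can check that mutt can actually deliver mail by sending yourself a quick test message. The address below is only a placeholder, replace it with your own:

echo "This is a test from the backup server" | mutt -s "Backup mail test" you@example.com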

The script is as follows:

#!/bin/bash
#The part you need to modify starts here
MYSQL_USER=root #MySQL username
MYSQL_PASS=123456 #MySQL password
MAIL_TO=you@example.com #The mailbox the database backup is sent to (placeholder, replace with your own address)
FTP_USER=cat #FTP username
FTP_PASS=123456 #FTP password
FTP_IP=imcat.in #FTP address
FTP_backup=backup #The directory where backup files are stored on FTP. It needs to be created on the FTP server first.
WEB_DATA=/home/www #Website data to be backed up
#The part you need to modify ends here
#Define the names of today's backup files and of the old backups to delete
DataBakName=Data_$(date +"%Y%m%d").tar.gz
WebBakName=Web_$(date +%Y%m%d).tar.gz
OldData=Data_$(date -d -5day +"%Y%m%d").tar.gz
OldWeb=Web_$(date -d -5day +"%Y%m%d").tar.gz
#Delete local data from 3 days ago
rm -rf /home/backup/Data_$(date -d -3day +"%Y%m%d").tar.gz /home/backup/Web_$(date -d -3day +"%Y%m%d").tar.gz
cd /home/backup
#Export database, one database and one compressed file
for db in `/usr/local/mysql/bin/mysql -u$MYSQL_USER -p$MYSQL_PASS -B -N -e 'SHOW DATABASES' | xargs`; do
(/usr/local/mysql/bin/mysqldump -u$MYSQL_USER -p$MYSQL_PASS ${db} | gzip -9 - > ${db}.sql.gz)
done
#Compress the database file into one file
tar zcf /home/backup/$DataBakName /home/backup/*.sql.gz
rm -rf /home/backup/*.sql.gz
#Send the database backup to Email. If the compressed database is too large, comment out this line
echo "Subject: Database backup" | mutt -a /home/backup/$DataBakName -s "Content: Database backup" $MAIL_TO
#Compress website data
tar zcf /home/backup/$WebBakName $WEB_DATA
#Upload to FTP space, delete data from FTP space 5 days ago
ftp -v -n $FTP_IP << END
user $FTP_USER $FTP_PASS
type binary
cd $FTP_backup
delete $OldData
delete $OldWeb
put $DataBakName
put $WebBakName
bye
END
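If the upload does not show up on the FTP space, it helps to test the login and the backup directory by hand with the same kind of here-document the script uses. The values below are the placeholders from the script, replace them with your own:

ftp -v -n imcat.in << END
user cat 123456
cd backup
ls
bye
END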

Download address: http://imcat.in/down/AutoBackupToFtp.sh
Download the script and make it executable:

chmod +x AutoBackupToFtp.sh
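It is also worth doing one manual run and checking that the archives actually land in /home/backup. This is just a quick sanity check, assuming the script was saved to /home as in the crontab entry below:

/home/AutoBackupToFtp.sh
ls -lh /home/backup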

Use crontab to run the backup automatically. Over SSH, run:

crontab -e

If the command does not exist, refer to a guide on installing and using crontab on CentOS. Then enter the following line:

00 00 * * * /home/AutoBackupToFtp.sh
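To confirm that the job was saved and that the cron daemon is running (on CentOS the service is called crond):

crontab -l
service crond status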

This will back up the website files and databases and upload them to the FTP space automatically at 00:00 every day.
Reprinted from: http://imcat.in/auto-backup-site-files-database-upload-ftp/
In addition, a few commands for beginners:
1. How to save and exit after editing the crontab? Press Esc, then type :wq! and press Enter (in the default vi editor; Ctrl+C does not save your changes).
2. Check the path of the mysql command:
which mysql
3. Check where MySQL is installed:
whereis mysql
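If which mysql shows the binaries under /usr/bin instead of /usr/local/mysql/bin, adjust the two paths inside the script accordingly, for example with sed (a sketch; change the target path to whatever which mysql reported):

sed -i 's#/usr/local/mysql/bin/#/usr/bin/#g' /home/AutoBackupToFtp.sh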
