Beginner's Tutorial: Automatically back up data under VPS and upload it to FTP

1. Installation of crontab

To use scheduled tasks on the VPS, you may first need to install crontab. SSH in and run the commands for your distribution:
1. Install Crontab under CentOS
yum install vixie-cron crontabs  # install crontab
chkconfig crond on               # start crond automatically at boot
service crond start              # start the crond service now
2. Install Crontab under Debian
apt-get install cron             # in most cases Debian already ships with cron
/etc/init.d/cron restart         # restart cron
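Once cron is running, the backup script from section 3 can be scheduled. A minimal crontab entry, assuming the script is saved as /home/backup/AutoBackupToFtp.sh (the path and the 00:00 run time are examples; adjust them to your setup):

```shell
# Open the current user's crontab for editing:
crontab -e

# Then add one line to run the backup every day at midnight:
# m  h  dom mon dow  command
00 00 *   *   *    /bin/bash /home/backup/AutoBackupToFtp.sh
```

The five time fields are minute, hour, day of month, month, and day of week; `*` means "every".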

2. Install the Email sending component

1. Install the Email component under CentOS
yum install sendmail mutt
2. Install the Email component under Debian
sudo apt-get install email-reminder
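Before relying on the backup script's mail step, it is worth sending one test message by hand. A quick check, where you@example.com is a placeholder for your real address:

```shell
# Hypothetical recipient; replace with your own address.
MAIL_TO="you@example.com"

# Send a short test mail if mutt is available; otherwise say so.
if command -v mutt >/dev/null 2>&1; then
  echo "Mail delivery from this VPS works." | mutt -s "VPS mail test" "$MAIL_TO"
else
  echo "mutt is not installed yet"
fi
```

If the message never arrives, check the sendmail log before blaming the backup script.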

3. Use the automatic backup script

Script content:

#!/bin/bash

# Start your edit here

MYSQL_USER=root #mysql username
MYSQL_PASS=123456 #mysql password
[email protected] #The mailbox to which the database is sent
FTP_USER=cat #FTP username
FTP_PASS=123456 #ftp password
FTP_IP=imcat.in #ftp address
FTP_backup=backup #The directory where backup files are stored on FTP. This needs to be created on FTP.
WEB_DATA=/home/www #Website data to be backed up

# Your edit ends here

# Define the names of today's archives and of the 5-day-old archives

DataBakName=Data_$(date +"%Y%m%d").tar.gz
WebBakName=Web_$(date +%Y%m%d).tar.gz
OldData=Data_$(date -d -5day +"%Y%m%d").tar.gz
OldWeb=Web_$(date -d -5day +"%Y%m%d").tar.gz

# Delete local backups older than 3 days

rm -rf /home/backup/Data_$(date -d -3day +"%Y%m%d").tar.gz /home/backup/Web_$(date -d -3day +"%Y%m%d").tar.gz
cd /home/backup

# Export the databases, one dump file per database

/usr/local/mysql/bin/mysql -u$MYSQL_USER -p$MYSQL_PASS -B -N -e 'SHOW DATABASES' | xargs > mysqldata
sed -i 's/information_schema //g' mysqldata
sed -i 's/mysql //g' mysqldata
for db in $(cat mysqldata); do
(/usr/local/mysql/bin/mysqldump -u$MYSQL_USER -p$MYSQL_PASS --databases ${db} > ${db}.sql)
done

# Pack all database dumps into one archive

tar zcf /home/backup/$DataBakName /home/backup/*.sql
rm -rf /home/backup/*.sql mysqldata

# Send the database archive by email; if it is too large after compression, comment out this line

echo "Subject: Database backup" | mutt -a /home/backup/$DataBakName -s "Content: Database backup" $MAIL_TO

# Compress the website data

tar zcf /home/backup/$WebBakName $WEB_DATA

# Upload to the FTP space and delete the archives from 5 days ago on the FTP space

ftp -v -n $FTP_IP << END
user $FTP_USER $FTP_PASS
type binary
cd $FTP_backup
delete $OldData
delete $OldWeb
put $DataBakName
put $WebBakName
bye
END
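Before trusting the backups, it is worth confirming that an archive built this way unpacks cleanly. A self-contained sketch of the round trip, using /tmp and a dummy dump file in place of the real /home/backup data (tar strips the leading "/", so the archive stores paths like home/backup/...):

```shell
# Create a dummy dump file standing in for a real mysqldump output.
mkdir -p /tmp/backup-demo/home/backup
echo "CREATE DATABASE demo;" > /tmp/backup-demo/home/backup/demo.sql

# Pack the dumps the same way the script does.
cd /tmp/backup-demo
tar zcf Data_demo.tar.gz home/backup/demo.sql

# Restore: unpack into a scratch directory...
mkdir -p /tmp/backup-demo/restore
cd /tmp/backup-demo/restore
tar zxf ../Data_demo.tar.gz

# ...then each dump could be fed back to MySQL (commented out here,
# since it needs a running server; the credentials are placeholders):
# /usr/local/mysql/bin/mysql -uroot -p123456 < home/backup/demo.sql
ls home/backup/demo.sql
```

Doing this once against a real archive is much cheaper than discovering a broken backup after a data loss.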

via: Automatic backup script