I want to pipe a compressed MySQL dump to S3.
I tried:
mysqldump -u root -ppassword -all-databases | gzip -9 | s3cmd put s3://bucket/sql/databases.sql.gz
but then I got:
ERROR: Not enough paramters for command 'put'
How can I do this (in one line)?
Answer 1
This is possible with s3cmd 1.5+ (link):
$ mysqldump ... | s3cmd put - s3://bucket/file-name.sql
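Applied to the command from the question, the streaming form would look like this (a sketch: the bucket name and credentials are the question's placeholders; the lone `-` after `put` tells s3cmd to read the object body from stdin):

```shell
# stream the compressed dump straight to S3 with no temp file
# "-" makes s3cmd read from stdin (requires s3cmd >= 1.5)
mysqldump -u root -ppassword --all-databases \
  | gzip -9 \
  | s3cmd put - s3://bucket/sql/databases.sql.gz
```

Note this uses `--all-databases` (two dashes); the single-dash `-all-databases` in the question would also be rejected by mysqldump.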
Answer 2
It seems this is possible now. With s3cmd v1.6.1:
curl -v "http://remote-server/file.mp4" |
s3cmd [-c .s3cfg-aws] put - [-P] s3://my-bucket/[folder/]filename.mp4
Answer 3
You are missing the actual file you want to back up.
s3cmd put /backup_dir/somefile.sql.gz s3://bucket/sql/
s3cmd takes two basic arguments: the file, and the bucket to back it up to.
Secondly, I can't take credit for the following, but it basically does what you want with an intermediate script. Create a bak.sh file with the contents below; that shell script can then be run via bash. (Source: http://www.wong101.com/tech-cloud/configure-s3cmd-cron-automated-mysql-backup)
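Before pointing the pipe at S3, you can sanity-check the dump-and-compress half locally (here `printf` stands in for mysqldump output):

```shell
# pipe stand-in SQL through gzip -9 and verify the stream decompresses cleanly
printf 'CREATE TABLE t (id INT);\n' | gzip -9 | gzip -t && echo "gzip stream OK"
```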
#!/bin/bash
S3BUCKET="<bucketname>"
# Array of databases to back up
DBS=("<db1>" "<db2>" "<db3>" "<db4>")
for DBNAME in "${DBS[@]}"
do
    FILE=$DBNAME-$(date "+%Y%m%d-%H%M").sql.gz
    mysqldump "$DBNAME" -u[uname] -p[password] | gzip -9 > "/home/$FILE"
    # Location of s3cmd may vary; modify if needed
    /usr/bin/s3cmd --config /root/.s3cfg put "/home/$FILE" "s3://$S3BUCKET/DBBackup-$DBNAME/" >> /var/log/mysqlback.log
    sleep 5
    rm "/home/$FILE"
done
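The `date` call inside the loop yields one timestamped object per run. For a hypothetical database named `mydb`, the generated filename looks like `mydb-20240101-0300.sql.gz`:

```shell
# build a dump filename the same way the loop above does ("mydb" is hypothetical)
DBNAME=mydb
FILE=$DBNAME-$(date "+%Y%m%d-%H%M").sql.gz
echo "$FILE"
```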
Answer 4
Create mysql.sh so that the mysqldump is piped straight to S3 without storing any local file.
#!/bin/bash
#/home/ec2-user/s3/mysql.sh
# Logging
LOG_ROOT="/var/www/html/laravel/storage/logs/laravel-$(date +%Y-%m-%d).log"
# Database credentials
USER=root
PASSWORD=password
HOST=localhost
DB_NAME=dbname
# Backup destination
S3BUCKET=s3://bucket/database/$(date +%a).sql.gz
# Timestamp (sortable AND readable)
TSTAMP=$(date +"%Y-%m-%d %H:%M:%S")
# Dump the MySQL database straight into S3
echo "[$TSTAMP] MYSQL_Dump.WARNING: MySQL Dump for $DB_NAME started from cron s3/mysql.sh" >> "$LOG_ROOT"
echo "Stack trace:" >> "$LOG_ROOT"
echo "#will be stored as $S3BUCKET" >> "$LOG_ROOT"
mysqldump --host=$HOST --user=$USER --password=$PASSWORD $DB_NAME --routines --single-transaction | gzip -9 | aws s3 cp - $S3BUCKET
TSTAMPEND=$(date +"%Y-%m-%d %H:%M:%S")
echo "[$TSTAMPEND] MYSQL_Dump.INFO: MySQL Dump finished." >> "$LOG_ROOT"
echo "Stack trace:" >> "$LOG_ROOT"
echo "#Backup stored at $S3BUCKET" >> "$LOG_ROOT"
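Note that the S3 key is built from `$(date +%a)`, so the script keeps a rolling seven-day window: one object per weekday, each overwritten the following week. In the C locale that key component is one of Mon..Sun:

```shell
# abbreviated weekday name used as the rotating key component
# (LC_ALL=C pins the English abbreviations regardless of system locale)
LC_ALL=C date +%a
```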
Create a cronjob for daily backups:
#crontab -e
0 0 * * * bash /home/ec2-user/s3/mysql.sh >/dev/null 2>&1
#runs daily at midnight