Backup script excluding large files with Duplicity and Amazon S3

I'm trying to write a backup script that excludes files over a certain size.

The script prints what looks like the correct command, but when it actually runs inside the script, duplicity errors out. If I copy the printed command and run it by hand, everything works fine.

Here is the simple script, based on one found via Google:


#!/bin/bash
# Export some ENV variables so you don't have to type anything
export AWS_ACCESS_KEY_ID="accesskey"
export AWS_SECRET_ACCESS_KEY="secretaccesskey"
export PASSPHRASE="password"

SOURCE=/home/
DEST=s3+http://s3bucket

GPG_KEY="7743E14E"

# exclude files over 100MB
exclude ()
{
 find /home/jason -size +100M \
 | while read FILE; do 
  echo -n " --exclude "
  echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g' #Replace whitespace with "\ "
 done
}

echo "Using Command"
echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

# Reset the ENV variables.
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export PASSPHRASE=

When I run it, I get the error:


Command line error: Expected 2 args, got 6
Enter 'duplicity --help' for help screen.

Any help would be much appreciated.

Answer 1

I solved it. The problem is that the shell never re-parses quotes that come out of command substitution: the literal quote characters, and any whitespace inside filenames, get word-split into extra arguments, which is why duplicity complains about getting 6 args instead of 2. Writing the oversized files to a list and passing it with --exclude-filelist sidesteps the quoting problem entirely:


#!/bin/bash
# Export some ENV variables so you don't have to type anything
export AWS_ACCESS_KEY_ID="accesskey"
export AWS_SECRET_ACCESS_KEY="secretaccesskey"
export PASSPHRASE="password"

SOURCE=/home/
DEST=s3+http://s3bucket

GPG_KEY="gpgkey"

# Generate a filelist of excluded files over 100MB
find $SOURCE -size +100M > /tmp/filelist

duplicity --exclude-filelist /tmp/filelist --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY $SOURCE $DEST

# Reset the ENV variables.
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export PASSPHRASE=
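
If you would rather avoid the temporary file, the same fix can be done by collecting the --exclude options in a bash array, so each path survives as a single argument regardless of spaces. A minimal sketch, assuming the same SOURCE, DEST and GPG_KEY variables as above (this variant is not from the original answer):


#!/bin/bash
# Collect --exclude options in an array; "${EXCLUDES[@]}" expands each
# element as its own word, so paths containing spaces stay intact.
EXCLUDES=()
while IFS= read -r -d '' FILE; do
    EXCLUDES+=( --exclude "$FILE" )
done < <(find "$SOURCE" -size +100M -print0)

duplicity "${EXCLUDES[@]}" --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "$SOURCE" "$DEST"

Note that duplicity interprets --exclude values (and --exclude-filelist lines) as glob patterns, so paths containing glob metacharacters such as * or ? may match more than intended.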
