Script to exclude large files with Duplicity + S3

I'm trying to write a backup script that excludes files over a certain size.

When I run the script, duplicity throws an error. However, if I copy and paste the very command the script prints, everything works fine...

Here is the script:


#!/bin/bash
# Export some ENV variables so you don't have to type anything
export AWS_ACCESS_KEY_ID="accesskey"
export AWS_SECRET_ACCESS_KEY="secretaccesskey"
export PASSPHRASE="password"

SOURCE=/home/
DEST=s3+http://s3bucket

GPG_KEY="gpgkey"

# exclude files over 100MB
exclude ()
{
 find /home/jason -size +100M \
 | while read FILE; do 
  echo -n " --exclude "
  echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g' #Replace whitespace with "\ "
 done
}

echo "Using Command"
echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

# Reset the ENV variables.
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export PASSPHRASE=

When the script runs I get this error:


Command line error: Expected 2 args, got 6
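For reference, running the script under bash -x (assuming it is saved as backup.sh) shows the exact argument list the shell hands to duplicity after expansion:


bash -x ./backup.sh 2>&1 | grep duplicity
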

Where am I going wrong?

Answer 1

I solved the problem.
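
The root cause: the unquoted backtick substitution of the exclude function gets word-split by the shell, and the single quotes the function prints are literal characters by that point, not shell quoting. Any matched filename containing a space therefore breaks into extra words, which duplicity counts as extra positional arguments, hence "Expected 2 args, got 6". Pasting the echoed command works because the interactive shell then re-parses those quotes. A minimal reproduction of the splitting, using a made-up helper and filename:


# stand-in for the exclude function above, emitting one pattern with a space
args() { echo -n "--exclude '**some file'"; }
printf '[%s]\n' $(args)
# three separate words come out, quotes intact:
# [--exclude]
# ['**some]
# [file']


The script below sidesteps the issue by writing the exclusions to a file and passing it with --exclude-filelist: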


#!/bin/bash
# Export some ENV variables so you don't have to type anything
export AWS_ACCESS_KEY_ID="accesskey"
export AWS_SECRET_ACCESS_KEY="secretaccesskey"
export PASSPHRASE="password"

SOURCE=/home/
DEST=s3+http://s3bucket

GPG_KEY="gpgkey"

# Generate a list of files over 100MB and exclude them via a filelist
find "$SOURCE" -size +100M > /tmp/filelist

duplicity --exclude-filelist /tmp/filelist --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "$SOURCE" "$DEST"

# Reset the ENV variables.
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export PASSPHRASE=
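
For completeness, here is an alternative sketch that keeps the on-the-fly --exclude flags: collect them in a bash array, so each flag/pattern pair stays a separate argument no matter what the filenames contain (my illustration, not tested against the original bucket):


# Build --exclude PATTERN pairs as separate array elements;
# the quoted array expansion below preserves each element as one word
EXCLUDES=()
while IFS= read -r FILE; do
    EXCLUDES+=( --exclude "$FILE" )
done < <(find "$SOURCE" -size +100M)

duplicity "${EXCLUDES[@]}" --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "$SOURCE" "$DEST"


No sed escaping is needed with this approach: the shell never re-splits the quoted array expansion.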
