Commit 64c4d190 authored by intrigeri

dup: support backups to Amazon S3 buckets

Thanks to stefan <s.freudenberg@jpberlin.de> for the patch.
This fixes Redmine bug #658.
parent 0fbd8744
@@ -31,3 +31,4 @@ dan@garthwaite.org -- reportspace bugfix
 Tuomas Jormola <tj@solitudo.net> -- "when = manual" option
 Ian Beckwith <ianb@erislabs.net> -- dup bandwidthlimit fix
 Olivier Berger <oberger@ouvaton.org> -- dup debug output bugfix, reportinfo option
+stefan <s.freudenberg@jpberlin.de> -- dup support for Amazon S3 buckets
@@ -59,6 +59,7 @@ version 0.9.7 -- UNRELEASED
 . Report duplicity output as "info" so that it can be included in
   report e-mail when reportinfo is on (Closes: #563734)
 . Fix include/exclude paths with spaces
+. Support backups to Amazon S3 buckets, thanks to stefan for the patch.
 helper changes
  dup:
   . Do not propose to exclude /home/*/.gnupg twice anymore
@@ -182,6 +182,16 @@ blank by hitting return.
 The included helper program "ninjahelper" will walk you through creating
 an rdiff-backup configuration, and will set up the ssh keys for you.
+
+Amazon Simple Storage Service (S3)
+==================================
+
+Duplicity can store backups in Amazon S3 buckets, taking care of encryption.
+Since it performs incremental backups, it minimizes the number of requests
+per operation, thereby reducing costs. The boto Python interface to Amazon
+Web Services is needed to use duplicity with S3 (Debian package: python-boto).
+
+
 INSTALLATION
 ============
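Putting the pieces of this commit together, a dup action file using the new S3 backend might look like the following sketch. The bucket name and credential values are hypothetical placeholders; the option names (`desturl`, `awsaccesskeyid`, `awssecretaccesskey`) are the ones this patch adds, and the section layout follows the stock example.dup shipped with backupninja.

```ini
## Hypothetical /etc/backup.d/90.dup action file using the S3 backend.

[gpg]
password = a_very_complicated_passphrase

[source]
include = /etc
include = /home

[dest]
incremental = yes
keep = 60
desturl = s3+http://your_bucket
awsaccesskeyid = YOUR_AWS_ACCESS_KEY_ID
awssecretaccesskey = YOUR_AWS_SECRET_KEY
```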
@@ -8,6 +8,8 @@
 ## passed directly to duplicity, e.g. to increase verbosity set this to:
 ## options = --verbosity 8
+## when using the Amazon S3 backend to create buckets in Europe:
+## options = --s3-european-buckets --s3-use-new-style
 ##
 ## Default:
 # options =
@@ -158,11 +160,21 @@ exclude = /home/*/.gnupg
 ## examples include:
 ## desturl = file:///usr/local/backup
 ## desturl = rsync://user@other.host//var/backup/bla
+## desturl = s3+http://
 ## the default value of this configuration option is not set:
 ##
 ## Default:
 # desturl =
+
+## Amazon Web Services Access Key ID and Secret Access Key, needed for backups
+## to S3 buckets.
+## awsaccesskeyid = YOUR_AWS_ACCESS_KEY_ID
+## awssecretaccesskey = YOUR_AWS_SECRET_KEY
+##
+## Default:
+# awsaccesskeyid =
+# awssecretaccesskey =
 ## bandwidth limit, in kbit/s ; default is 0, i.e. no limit; an example
 ## setting would be:
 ## bandwidthlimit = 128
@@ -403,6 +403,12 @@ keep = $dup_keep
 # bandwidthlimit. For details, see duplicity manpage, section "URL FORMAT".
 #desturl = file:///usr/local/backup
 #desturl = rsync://user@other.host//var/backup/bla
+#desturl = s3+http://your_bucket
+
+# Amazon Web Services Access Key ID and Secret Access Key, needed for backups
+# to S3 buckets.
+#awsaccesskeyid = YOUR_AWS_ACCESS_KEY_ID
+#awssecretaccesskey = YOUR_AWS_SECRET_KEY
 # bandwidth limit, in kbit/s ; default is 0, i.e. no limit
 #bandwidthlimit = 128
@@ -26,6 +26,8 @@ setsection dest
 getconf incremental yes
 getconf keep 60
 getconf desturl
+getconf awsaccesskeyid
+getconf awssecretaccesskey
 getconf sshoptions
 getconf bandwidthlimit 0
 getconf desthost
@@ -38,6 +40,9 @@ destdir=${destdir%/}
 [ -n "$desturl" -o -n "$destdir" ] || fatal "The destination directory (destdir) must be set when desturl is not used."
 [ -n "$include" -o -n "$vsinclude" ] || fatal "No source includes specified"
 [ -n "$password" ] || fatal "The password option must be set."
+if [ "`echo $desturl | @AWK@ -F ':' '{print $1}'`" == "s3+http" ]; then
+   [ -n "$awsaccesskeyid" -a -n "$awssecretaccesskey" ] || fatal "AWS access keys must be set for S3 backups."
+fi

 ### VServers
 # If vservers are configured, check that the ones listed in $vsnames do exist.
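The credential guard added above can be exercised on its own. A minimal sketch, assuming a plain `awk` stands in for the `@AWK@` autoconf placeholder, and a stub replaces backupninja's `fatal` helper (the real one logs the message and aborts the run):

```shell
#!/bin/sh
# Stub for backupninja's fatal() helper; the real one logs and exits.
fatal() { echo "Fatal: $1" >&2; }

# Hypothetical configuration values: an S3 destination with no credentials set.
desturl="s3+http://mybucket"
awsaccesskeyid=""
awssecretaccesskey=""

# Same test as the handler: the URL scheme is everything before the first ':'.
if [ "`echo $desturl | awk -F ':' '{print $1}'`" = "s3+http" ]; then
   [ -n "$awsaccesskeyid" -a -n "$awssecretaccesskey" ] || fatal "AWS access keys must be set for S3 backups."
fi
```

Run as-is, this prints the fatal message, since the destination scheme is `s3+http` and both keys are empty.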
@@ -227,6 +232,12 @@ set +o noglob
 execstr_source=${execstr_source//\\*/\\\\\\*}
+
+### If desturl is an S3 URL, export the AWS environment variables
+if [ "`echo $desturl | @AWK@ -F ':' '{print $1}'`" == "s3+http" ]; then
+   export AWS_ACCESS_KEY_ID="$awsaccesskeyid"
+   export AWS_SECRET_ACCESS_KEY="$awssecretaccesskey"
+fi

 ### Cleanup commands (duplicity >= 0.4.4)
 # cleanup
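The export step above hands the credentials to duplicity's boto backend through the standard `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY` environment variables. A minimal self-contained sketch of the same logic, again assuming plain `awk` for `@AWK@` and hypothetical credential values:

```shell
#!/bin/sh
# Hypothetical values standing in for the configured options.
desturl="s3+http://mybucket"
awsaccesskeyid="AKIAEXAMPLE"
awssecretaccesskey="example_secret"

# Export the credentials only when the destination scheme is s3+http,
# mirroring the handler's check.
if [ "`echo $desturl | awk -F ':' '{print $1}'`" = "s3+http" ]; then
   export AWS_ACCESS_KEY_ID="$awsaccesskeyid"
   export AWS_SECRET_ACCESS_KEY="$awssecretaccesskey"
fi

echo "$AWS_ACCESS_KEY_ID"
```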