Jojodae Ganesh Sivaji
March 25, 2013
I recently came across s3cmd, an interesting utility for syncing backup files to Amazon S3. In less than ten minutes with this utility, I learned how easy it is to upload a local file to a remote folder in an S3 bucket.
I already had a script in place to back up the site's files and database dump. To complete that script, I needed a utility to sync the backup files offsite, and s3cmd turned out to be a savior. In this blog post I'd like to share how easy it was to install, configure, and get started with it.
Our servers run Ubuntu 12.04.1 LTS. Luckily, the s3cmd package is available in Ubuntu's official repository, so I had it installed on our machine in no time with apt-get.
The following command does the installation:
$ sudo apt-get install s3cmd
Running s3cmd before configuring it produces an error:
$ s3cmd
ERROR: /home/user/.s3cfg: No such file or directory
ERROR: Configuration file not available.
ERROR: Consider using --configure parameter to create one.
So the next step is to create the configuration file interactively:
$ s3cmd --configure
New settings:
  Access Key: [your access key]
  Secret Key: [your secret key]
  Encryption password: somepassword
  Path to GPG program: /usr/bin/gpg
  Use HTTPS protocol: False
  HTTP Proxy server name:
  HTTP Proxy server port: 0

Test access with supplied credentials? [Y/n] Y
Please wait...
Success. Your access key and secret key worked fine :-)

Now verifying that encryption works...
Success. Encryption and decryption worked fine :-)

Save settings? [y/N] y
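The answers are saved to ~/.s3cfg, a plain INI-style file that s3cmd reads on every run. On my machine it looked roughly like the following sketch; the key names can vary slightly between s3cmd versions, and the values shown here are placeholders, not real credentials:

```ini
[default]
; AWS credentials entered during --configure (placeholders)
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
; GPG settings used to encrypt files before upload
gpg_command = /usr/bin/gpg
gpg_passphrase = somepassword
; Transport and proxy settings from the prompts above
use_https = False
proxy_host =
proxy_port = 0
```

Since this file holds your secret key in plain text, it is worth keeping its permissions restricted (s3cmd itself creates it readable only by the owner).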
With the configuration in place, syncing a backup file to the bucket is a single command:
$ s3cmd --delete-removed --reduced-redundancy sync my_backup.tar.gz s3://backup.example.com/project_backup/
my_backup.tar.gz -> s3://backup.example.com/project_backup/my_backup.tar.gz [1 of 1]
 63935864 of 63935864   100% in  186s   335.43 kB/s  done
Done. Uploaded 63935864 bytes in 186.1 seconds, 335.42 kB/s
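To run this hands-free, the same sync command can be scheduled with cron right after the backup script finishes. A hypothetical crontab entry (the schedule, local path, and script name are assumptions, not from my actual setup):

```crontab
# Every day at 02:30: run the backup script, then sync the backup
# directory to the S3 bucket; --delete-removed prunes files that no
# longer exist locally, --reduced-redundancy lowers storage cost.
30 2 * * * /home/user/bin/backup.sh && s3cmd --delete-removed --reduced-redundancy sync /home/user/backups/ s3://backup.example.com/project_backup/
```

Syncing a whole directory rather than a single archive lets s3cmd skip files that have not changed since the last run.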