There’s a great (and free) command line tool set called s3cmd that makes it simple to push and pull files to AWS S3 buckets, and is easy to script with. We use it for cheap, reliable, offsite backups of media and database files.
The tool set can be downloaded from the GitHub repo here. There’s a simple howto guide at the bottom of the repo’s README. One slight bump we did run into is that some of the older versions struggle with larger file sizes, so make sure you get version 1.5.0-alpha3 at minimum.
To install the tool, simply download the repo onto your server or laptop, cd into the directory, and run
s3cmd --configure. You’ll need to have generated IAM credentials through the AWS control panel first. Once you’ve got it configured you can push files to a bucket with the following command:
s3cmd put /local/file/path s3://bucket-name
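A few other s3cmd subcommands round out the basics — pulling a file back down and listing a bucket’s contents. This is a quick sketch using the same placeholder bucket and file path as above (a configured ~/.s3cfg is assumed, so the s3cmd lines themselves are shown commented):

```shell
# Placeholders from the example above:
BUCKET="s3://bucket-name"
FILE="/local/file/path"

# s3cmd put "$FILE" "$BUCKET"                  # upload a file
# s3cmd get "$BUCKET/$(basename "$FILE")" .    # download it back
# s3cmd ls "$BUCKET"                           # list bucket contents
echo "$BUCKET/$(basename "$FILE")"             # the object URI the put creates
```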
Below is a super simple (and fairly crude) bash script we call from cron every night that backs up all databases on a server and sends the backups to S3:
#!/bin/bash
echo "------- Starting " $(date) " -------"
rm -rf /backups/*.out
cd /backups/
mysqldump --all-databases -uroot -ppassword > $(date +%m%d%Y).out
cd /root/s3cmd-1.5.0-alpha3
./s3cmd put /backups/$(date +%m%d%Y).out s3://backups
echo "------- Finished " $(date) " -------"
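To have cron run it nightly, a crontab entry along these lines works (the 2 a.m. run time, script path, and log file are assumptions — adjust to taste):

```
0 2 * * * /root/scripts/db-backup.sh >> /var/log/db-backup.log 2>&1
```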
It’s also good for backing up crucial log files in environments where a dedicated syslog server isn’t really justifiable or is perhaps a little too pricey.
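The same pattern as the database script works for logs — tar up the directory with a dated name and push the archive. A minimal sketch, where the log directory, bucket name, and s3cmd path are all assumptions (the tar and put lines are shown commented since they depend on your environment):

```shell
#!/bin/bash
# Sketch: nightly log archive pushed to S3 (paths/bucket are placeholders)
LOG_DIR="/var/log/myapp"
ARCHIVE="logs-$(date +%m%d%Y).tar.gz"   # same MMDDYYYY naming as the db script

# tar -czf "/tmp/$ARCHIVE" "$LOG_DIR"
# /root/s3cmd-1.5.0-alpha3/s3cmd put "/tmp/$ARCHIVE" s3://log-backups
echo "$ARCHIVE"
```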
Side note – you can also use this to push data into S3 that is to be served through CloudFront, making scripting media into a CDN simple.
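For the CDN case the upload needs to be publicly readable, which s3cmd handles with its --acl-public flag on put. A sketch — the bucket and file names here are placeholders, and the s3cmd line is commented since it requires real credentials and a real bucket:

```shell
# Push a media file so a CloudFront distribution backed by this
# bucket can serve it (bucket/file names are hypothetical):
FILE="banner.jpg"
BUCKET="media-bucket"

# s3cmd put --acl-public "$FILE" "s3://$BUCKET/"
echo "http://$BUCKET.s3.amazonaws.com/$FILE"   # resulting public object URL
```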