Copy your server logs to Amazon S3 using Logrotate and s3cmd
Hold on Cowboy
This blog post is pretty old. Be careful with the information you find in here. The Times They Are A-Changin'
You want to keep those server logs, right? I’ve had customers ask for analytics data from last year, and by George, Google Analytics doesn’t cover everything that happens on the server.
What you’ll need
- logrotate (installed on most systems; installing it is beyond the scope of this article)
- s3cmd (on a RedHat-based server it’s easy to install via their yum.repos.d file)
- An Amazon S3 account (I hope this goes without saying)
- Logs you want to rotate (in this case Nginx’s)
Setting up s3cmd
- After you get it installed, you’ll want to run the configuration step (probably as root):
s3cmd --configure
- This will ask you for your API key and API secret
- This will also ask whether you want to encrypt your files on disk and whether to use HTTPS during transfer
- After you get it configured, try running:
s3cmd ls
That should list your buckets.
Getting logrotate set up
- Go to the logrotate directory
cd /etc/logrotate.d/
- Edit the nginx file
vim nginx
so that it looks like this:
/var/log/nginx/*log {
    daily
    rotate 10
    missingok
    notifempty
    compress
    sharedscripts
    postrotate
        /etc/init.d/nginx reopen_logs
        nice /usr/bin/s3cmd sync /var/log/nginx/*.gz s3://<YOUR-S3-BUCKET-NAME>/nginx/
    endscript
}
- This will sync all .gz files to a directory called nginx in your S3 bucket
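If you want to sanity-check the postrotate flow before pointing it at S3, you can simulate one cycle in a scratch directory, with plain cp standing in for s3cmd sync (every path and filename below is made up for illustration):

```shell
# Simulate one rotate -> compress -> sync cycle locally.
# cp stands in for `s3cmd sync`; all paths are illustrative.
workdir=$(mktemp -d)
mkdir -p "$workdir/logs" "$workdir/bucket"

# logrotate renames the live log with a date suffix...
echo 'GET / 200' > "$workdir/logs/access.log-20130326"
# ...then the compress directive gzips it.
gzip "$workdir/logs/access.log-20130326"

# postrotate: push every .gz to the "bucket".
cp "$workdir/logs/"*.gz "$workdir/bucket/"

ls "$workdir/bucket"
```

Only the compressed files get copied, which mirrors why the real command syncs /var/log/nginx/*.gz rather than the whole log directory.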
Now wasn’t that simple?
Important notes
- I’m using dates on my access files, e.g. access.log-20130326.gz. If you use numbers instead, a sync could really mess up your backups: numeric rotation renames access.log.1 to access.log.2 on each cycle, so the same content keeps reappearing under new names and sync re-uploads it. To change this, edit your logrotate.conf file and add dateext, which makes the date the suffix.
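To see the kind of filename dateext produces, you can build the suffix by hand; the log name here is just an example (by default logrotate uses a YYYYMMDD date):

```shell
# dateext appends the rotation date to the rotated filename
# instead of a rotation number, so names never shift between runs.
suffix=$(date +%Y%m%d)
rotated="access.log-${suffix}.gz"
echo "$rotated"
```

Because each rotation produces a brand-new, stable name, s3cmd sync only ever has to upload the newest file.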
Reference
http://www.lustforge.com/2012/07/15/logrotate-apache-logs-to-amazon-s3/