logrotate: move only the new compressed files to S3

I have an application generating output and error logs. I need to compress them and keep 5 logs on the server. The logs should also be copied to an S3 bucket right after they are compressed.

lastaction seems like the right place for my script, since it runs after the files have been compressed.

My configuration file looks like this:

/var/log/nodejs/*out.log /var/log/nodejs/*err.log {
  size 10M
  missingok
  notifempty
  rotate 5 
  sharedscripts
  compress
  copytruncate
  dateext
  dateformat -%Y%m%d-%s
  olddir /var/log/nodejs/rotated
  lastaction
    echo $@
    INSTANCE_ID=$(wget -q -O - http://instance-data/latest/meta-data/instance-id)
    HOSTNAME=$(hostname)
    BUCKET="my-logs"
    REGION="us-west-2"
    # logrotate runs scripts with /bin/sh, so avoid the bash-only <<< herestring
    DAY=$(date +%d); MONTH=$(date +%m); YEAR=$(date +%Y)
    aws s3 sync /var/log/nodejs/rotated/ "s3://$BUCKET/${INSTANCE_ID}_${HOSTNAME}/$YEAR/$MONTH/$DAY/" --region "$REGION"
  endscript
}

The problem with aws s3 sync is that if the rotated folder still contains older logs (among the 5 that are kept), they get uploaded again into today's folder.

Is there a way to get, in lastaction (or another script hook provided by logrotate), ONLY the files that were rotated just now, but at their new location, so I could use aws s3 cp instead of sync?

For example, I print the args in lastaction and get:

/var/log/nodejs/app-out.log /var/log/nodejs/app-err.log

While I would like to get the new locations:

/var/log/nodejs/rotated/app-out.log-20190131-1548925261.gz /var/log/nodejs/rotated/app-err.log-20190131-1548925261.gz
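For reference, logrotate hands lastaction the original log paths, not the rotated ones, so the rotated names have to be reconstructed. A minimal sketch, assuming the olddir and dateformat from the config above; `rotated_files` is a hypothetical helper name:

```shell
#!/bin/sh
# Sketch: reconstruct the compressed file names in olddir from the
# original log paths that logrotate passes to lastaction ($@).
# OLDDIR and the date suffix mirror the config above; the %s part of
# dateformat -%Y%m%d-%s is matched with a glob.
OLDDIR=${OLDDIR:-/var/log/nodejs/rotated}

rotated_files() {
  today="-$(date +%Y%m%d)"
  for log in "$@"; do
    base=$(basename "$log")
    # e.g. /var/log/nodejs/rotated/app-out.log-20190131-*.gz
    for f in "$OLDDIR/$base$today"-*.gz; do
      [ -e "$f" ] && echo "$f"
    done
  done
}
```

Inside lastaction one could then feed each printed path to `aws s3 cp` instead of syncing the whole directory.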


Solution 1:[1]

I found a solution...

I added --exclude and --include filters to the aws s3 sync command. Now my script looks like this:

/var/log/nodejs/*out.log /var/log/nodejs/*err.log {
  size 10M
  missingok
  notifempty
  rotate 5 
  sharedscripts
  compress
  copytruncate
  dateext
  dateformat -%Y%m%d-%s
  olddir /var/log/nodejs/rotated
  lastaction
    INSTANCE_ID=$(wget -q -O - http://instance-data/latest/meta-data/instance-id)
    HOSTNAME=$(hostname)
    BUCKET="my-logs"
    REGION="us-west-2"
    # logrotate runs scripts with /bin/sh, so avoid the bash-only <<< herestring
    DAY=$(date +%d); MONTH=$(date +%m); YEAR=$(date +%Y)
    FORMAT=$(date +%Y%m%d)
    aws s3 sync /var/log/nodejs/rotated/ "s3://$BUCKET/${INSTANCE_ID}_${HOSTNAME}/$YEAR/$MONTH/$DAY/" --region "$REGION" --exclude "*" --include "*.log-$FORMAT*"
  endscript
}
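aws s3 sync evaluates filters in order, so --exclude "*" followed by --include "*.log-$FORMAT*" restricts the sync to files rotated today. A quick local sanity check of that pattern (no AWS involved; `matches_today` is a made-up helper that mimics the filter with a shell case glob):

```shell
#!/bin/sh
# Local sanity check of the include pattern "*.log-$FORMAT*" used above.
# A shell case glob behaves like the CLI filter for simple * patterns.
FORMAT=$(date +%Y%m%d)

matches_today() {
  case "$1" in
    *.log-"$FORMAT"*) return 0 ;;
    *)                return 1 ;;
  esac
}
```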

Solution 2:[2]

logrotate does not run its scripts with bash (it uses /bin/sh), so this is my script:

/home/ubuntu/logs/*log {
    copytruncate
    daily
    dateext
    rotate 30
    compress
    missingok
    su root root
    lastaction
            HOSTNAME=$(hostname -f)
            DD=$(date +%d)
            MM=$(date +%m)
            YYYY=$(date +%Y)
            BUCKET="s3://s3-logs-archive/$YYYY/$MM/$DD/$HOSTNAME/"
            # compress runs before lastaction, so the rotated file already has a .gz suffix
            FILE="/home/ubuntu/logs/batch.log-$YYYY$MM$DD.gz"
            /usr/local/bin/aws s3 cp "$FILE" "$BUCKET"
    endscript
}

Important

In order to run aws s3 cp you need an Instance Profile attached to the running EC2 instance; otherwise the job will not be able to obtain credentials (even if you assigned an access key). If you are not running on EC2 you can separate the above script into its own file and configure credentials there.
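A cheap way to verify that credentials are actually available before the upload runs (this assumes the AWS CLI is installed; the helper name is mine):

```shell
#!/bin/sh
# Sketch: `aws sts get-caller-identity` exits non-zero when the CLI
# cannot obtain credentials (e.g. no instance profile attached), so it
# serves as a preflight check before uploading.
have_aws_credentials() {
  aws sts get-caller-identity >/dev/null 2>&1
}
```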

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
