I have a number of log files in the form:
log.2014-02-19-10_24_22
I.e. log.YYYY-MM-DD-H24_MI_SS
The date that's part of the log file's name is when the log file was first created. So at any given moment in time I can have the following log files in my directory:
log.2014-02-19-10_18_54
log.2014-02-19-10_21_20
log.2014-02-19-10_23_11
etc.
Now I have a script, invoked by a cron job, that deletes "old" log files:
$ cat delete-old-rotated-logs
#!/usr/bin/env bash
find /home/foo -maxdepth 1 -iname log\* -type f -mmin +1800 -exec rm {} \;
The problem I am facing is that sometimes the logging process has crashed, so after a while the "latest" log file also becomes "old" (since no process is writing to it) and gets deleted, and I lose the trace. How can I rewrite the delete-old-rotated-logs script so that it deletes old files except the last one (or the last N)? For the ordering, one can use either the filename itself or the modification timestamp (the latter being more robust).
Best Answer
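Since the timestamp in the name sorts lexicographically in creation order, the simplest route is to order by filename. A minimal sketch of that idea (GNU head's negative `-n` count is assumed; N=5 is an arbitrary choice, and a scratch directory stands in for /home/foo for the demo):

```shell
# Keep the 5 newest logs by filename order (lexicographic order matches
# chronological order for this naming scheme); delete the rest.
dir=$(mktemp -d)   # demo stand-in for /home/foo
for i in 1 2 3 4 5 6 7 8; do
  touch "$dir/log.2014-02-19-10_0${i}_00"
done

# head -n -5 (GNU) prints all but the last 5 lines, i.e. all but the
# 5 lexicographically greatest (newest) names.
ls "$dir"/log.* | head -n -5 | xargs -r rm --
```

This is safe here only because the log names contain no whitespace or newlines; for arbitrary filenames a NUL-delimited pipeline (shown below in the answer) is preferable.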
Or you can order by mtime instead of the filename; from @Stephane's comments, a more robust approach would be to do:
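A sketch of such an mtime-based, NUL-delimited pipeline (assumptions: GNU find, sort, tail, cut, and xargs; N=5 kept; a scratch directory stands in for /home/foo):

```shell
# Demo setup: 8 files with distinct mtimes, log.file1 being the newest.
dir=$(mktemp -d)   # stand-in for /home/foo
for i in 1 2 3 4 5 6 7 8; do
  touch -d "$i minutes ago" "$dir/log.file$i"
done

# %T@ prints the mtime in seconds; records are NUL-terminated so any
# filename is handled safely. Sort newest first, skip the 5 newest,
# strip the timestamp field, and delete what remains.
find "$dir" -maxdepth 1 -iname 'log*' -type f -printf '%T@ %p\0' |
  sort -rzn |
  tail -zn +6 |
  cut -zd' ' -f2- |
  xargs -r0 rm --
```

The `-z`/`-0` flags keep the whole pipeline NUL-delimited end to end, which is what makes this version robust against unusual filenames.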
Or for POSIX shell (still requires GNU tools):
A single (robust) pipeline can be used with a recent version of GNU sed/sort (and GNU find, as with all of the above):
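A sketch of that single pipeline, with GNU sed's `-z` doing both the timestamp stripping and the record selection (N=5 kept; scratch directory stands in for /home/foo):

```shell
# Demo setup: 8 files with distinct mtimes, log.file1 being the newest.
dir=$(mktemp -d)   # stand-in for /home/foo
for i in 1 2 3 4 5 6 7 8; do
  touch -d "$i minutes ago" "$dir/log.file$i"
done

# sort -rzn orders the NUL-terminated "mtime path" records newest
# first; sed -z strips the leading timestamp from every record and,
# with -n, prints only records 6 onward (the ones to delete).
find "$dir" -maxdepth 1 -iname 'log*' -type f -printf '%T@ %p\0' |
  sort -rzn |
  sed -zn 's/^[^ ]* //; 6,$p' |
  xargs -r0 rm --
```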