I'm trying to create a script and run it from crontab every 5 minutes, so that the number of files in a folder always stays at 50,000. If there are more, I want the script to delete the oldest files.
```shell
#!/bin/bash
LIMIT=500000
NO=0

# Get the number of files that have *.pcap in their name,
# with last-modified time 5 days ago
NUMBER=$(find /mnt/md0/capture/DCN/ -maxdepth 1 -name "*.pcap" | wc -l)

if [[ $NUMBER -gt $LIMIT ]]   # if number greater than limit
then
    del=$(($NUMBER-$LIMIT))
    if [ "$del" -lt "$NO" ]
    then
        del=$(($del*-1))
    fi
    echo $del
    FILES=$(
        find /mnt/md0/capture/DCN/ -maxdepth 1 -type f -name "*.pcap" -print0 |
            xargs -0 ls -lt |
            tail -$del |
            awk '{print $8}'
    )
    # delete the originals
    rm -f ${FILES[@]}
fi
```
It doesn't really work; it fails to run because the number of files is too large. Is there any other method to get this done?
Best Answer
I ran the command. The problem that I observed was that `awk '{print $8}'` prints the time, not the file name; `awk '{print $9}'` would solve that.

Another problem is that `xargs` may run `ls -lt` several times, which would give you a number of sorted lists of files one after the other, but the whole list would not be sorted.

But there appear to be other simplifications one could make. You can get the oldest files with:
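The command itself is elided in this extract; a minimal sketch of the idea (my reconstruction, not the answer's exact text) that also avoids passing half a million arguments on one command line:

```shell
# Sketch: list the $del oldest *.pcap files, names only.
# The directory and $del come from the question's script; 5 is a stand-in.
del=5
cd /mnt/md0/capture/DCN/ &&
ls -t |                  # newest first, one name per line when piped
    grep '\.pcap$' |     # keep only the capture files
    tail -n "$del"       # the last $del entries are the oldest
```

Because `ls` sorts the whole directory in one invocation, the "several sorted lists" problem with `xargs ... ls -lt` does not arise.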
This assumes, as your post seemed to, that the file names have no spaces, tabs, or newline characters in them.
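The full command referred to below is elided in this extract; it was presumably of this shape (a sketch under the same no-special-characters assumption, with path and variable names taken from the question):

```shell
# Sketch: delete the $del oldest *.pcap files, assuming the names
# contain no spaces, tabs, or newlines.
del=5   # stand-in; the script computes this as NUMBER - LIMIT
cd /mnt/md0/capture/DCN/ &&
ls -t | grep '\.pcap$' | tail -n "$del" | xargs rm -f
```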
So, the full command for deleting the oldest `$del` files could be:

MORE: If your file names may contain spaces, tabs, backslashes, or quotes in them (but not newlines), use (assuming GNU `ls` 4.0 (1998) or newer):