I have a folder with a lot of files. They are generated every second and must be kept for 90 days; only after that may I delete them. So, as you may guess, I accumulate tons of files, and after 90 days I want to delete the ones that are older than 90 days.
But I'm stuck on the part where I search for those files: since there are so many, the system complains that the argument list is too long, and so I can't remove them.
What is the best way around this? The file names are timestamps, so I could start from that, but I want to be sure that all files are eventually deleted.
I have tried these methods:
rm -rf *
find /path/to/files/ -type f -name '*.ts' -mtime +90 -exec rm {} \;
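The "list is too large" failure with rm -rf * comes from the shell expanding the glob into more arguments than the kernel's ARG_MAX limit allows. A sketch of the mechanism and a streaming workaround (the temporary directory here stands in for your real path, which is an assumption):

```shell
# The kernel limits the total size of arguments passed to exec();
# `getconf` prints that limit in bytes.
getconf ARG_MAX

# Demo directory standing in for the real one (an assumption).
dir=$(mktemp -d)
touch "$dir/a.ts" "$dir/b.ts"

# find streams file names instead of building one giant argument list,
# and xargs batches them into safely sized rm invocations.
find "$dir" -type f -name '*.ts' -print0 | xargs -0 rm -f

ls -A "$dir"   # directory is now empty
```

Note that the find -exec rm {} \; form you tried avoids the limit too; it just runs one rm per file, which is slow for millions of files.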
I have also managed to write a script that deletes by filename, but with that method I have no guarantee that all the files get deleted.
Best Answer
If the files are not modified after initial creation, you could delete the ones that have not been modified in over 90 days, using find's -delete action, or an -exec rm form for versions of find which do not support the -delete action. As a matter of safety, you should run a non-destructive version of the command first (for example with -print in place of -delete) and ensure it will delete exactly what you want deleted, especially if you intend to automate this action via cron or similar.
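Concretely, the commands could look like the sketch below. The directory is a stand-in for your real path, and the simulated file ages exist only to make the example self-contained; GNU find/touch are assumed for -delete and touch -d:

```shell
# Stand-in directory with one old file (mtime ~100 days ago) and one recent file.
dir=$(mktemp -d)
touch -d "100 days ago" "$dir/old.ts"
touch "$dir/new.ts"

# 1) Non-destructive dry run: list files not modified in over 90 days.
find "$dir" -type f -mtime +90 -print

# 2) Delete them with the -delete action (GNU find and most modern finds).
find "$dir" -type f -mtime +90 -delete

# 3) Fallback for finds without -delete; the `+` terminator batches many
#    files into each rm invocation, avoiding the argument-list limit.
# find "$dir" -type f -mtime +90 -exec rm {} +
```

Because -mtime keys on modification time rather than the file name, every file is guaranteed to be removed once it ages past 90 days, regardless of naming.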