I have a folder with a lot of files; they are generated every second and have to be kept for 90 days, after which I am allowed to delete them. So, as you can guess, the folder accumulates a huge number of files, and once the 90 days are up I want to remove everything older than that. The problem is the deletion step: because there are so many files, the shell complains that the argument list is too long and the removal fails.
What is the best way to get around this? The file names are timestamps, so I could work from those, but I want to be sure that every file is eventually deleted once it is old enough.
I have tried these methods:

```
rm -rf *
find /path/to/files/ -type f -name '*.ts' -mtime +90 -exec rm {} \;
```
I have also managed to write a script that deletes files based on their (timestamp) names, but with that approach I have no guarantee that every old file actually gets deleted.
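Roughly, that filename-based script looks like this (a sketch only: the epoch-seconds naming is a simplification, while the path and `.ts` extension are the same as in the `find` command above):

```bash
#!/bin/bash
# Sketch of the filename-based cleanup; assumes names are epoch seconds, e.g. 1700000000.ts
cutoff=$(date -d '90 days ago' +%s)   # GNU date

for f in /path/to/files/*.ts; do
    name=${f##*/}                     # strip the directory part
    ts=${name%.ts}                    # strip the extension, leaving the timestamp
    if [ "$ts" -lt "$cutoff" ]; then
        rm -- "$f"
    fi
done
```

The glob is expanded inside the shell, so it does not hit the argument-list limit, but the loop is slow and only works if every file really follows the naming scheme, which is why I don't trust it to catch everything.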
> `rm -rf *`

You should always be as specific as possible when removing files with globbing. `rm -- ./*` would be better for you: there is no need for `-r`, as you want to remove files, not directories, and you should only use `-f` if you have to.
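For the actual cleanup, the usual way around the "argument list too long" error is to let `find` match the files instead of the shell, so the expansion never has to fit into a single command line. A minimal sketch, reusing the path and pattern from your own command (`-exec ... {} +` is POSIX; the `-delete` primary is a GNU/BSD extension):

```bash
# Batch the removals: find passes as many names to each rm invocation as will fit.
find /path/to/files/ -type f -name '*.ts' -mtime +90 -exec rm -f -- {} +

# Where find supports it, -delete avoids spawning rm entirely.
find /path/to/files/ -type f -name '*.ts' -mtime +90 -delete
```

`-mtime +90` matches files whose modification time is more than 90 full 24-hour days in the past, so newer files are untouched; run it regularly (e.g. from a daily cron job) and old files will keep being purged no matter how many accumulate.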