Timeline for Efficiently delete large directory containing thousands of files
Current License: CC BY-SA 3.0
        7 events
    
| when | what | by | license | comment |
|---|---|---|---|---|
| Jun 29, 2015 at 12:51 | comment added | Marki555 | | @maxschlepzig for such files you can use `find . -print0 \| xargs -0 rm`, which uses the NUL character as the filename separator (see the sketch below the table). |
| Jun 29, 2015 at 12:50 | comment added | Marki555 | | @camh that's true. But removing files in sorted order is faster than in unsorted order, because the directory's btree has to be rebalanced after each deletion. See serverfault.com/a/328305/105902 for an example (sketch below the table). |
| Jan 5, 2014 at 7:53 | comment added | maxschlepzig | | Does not work on filenames that contain newlines. |
| Aug 14, 2012 at 0:23 | review | | | Low quality posts (review completed Aug 14, 2012 at 0:41) |
| Apr 26, 2012 at 10:59 | comment added | camh | | @Toby: Try `ls -f`, which disables sorting. Sorting requires that the entire directory be loaded into memory to be sorted. An unsorted `ls` should be able to stream its output (sketch below the table). |
| Apr 26, 2012 at 8:19 | comment added | Toby | | `ls` won't work because of the number of files in the folder. This is why I had to use `find`, thanks though. |
| Apr 26, 2012 at 8:17 | answered | PsyStyle | CC BY-SA 3.0 | |
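
Marki555's NUL-delimited pipeline from the first comment, expanded into a runnable form. A minimal sketch, assuming GNU findutils; `/path/to/huge-dir` is a hypothetical placeholder:

```sh
cd /path/to/huge-dir   # placeholder; substitute the real directory
# -print0 terminates each filename with a NUL byte, and xargs -0 splits
# on NUL, so names containing newlines or spaces survive the pipe intact.
find . -type f -print0 | xargs -0 rm
```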
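For the sorted-order point in the second comment, one possible sketch of deleting in sorted name order. `sort -z` (NUL-delimited sorting) is a GNU coreutils extension, so this assumes a GNU userland:

```sh
# List files NUL-terminated, sort them without breaking the NUL
# delimiters, then delete in that sorted order.
find . -maxdepth 1 -type f -print0 | sort -z | xargs -0 rm
```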
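And for camh's `ls -f` suggestion, a small usage sketch: `-f` disables sorting (and implies `-a`), which is what lets `ls` stream a huge directory instead of buffering the whole listing to sort it. `/path/to/huge-dir` is again a placeholder:

```sh
# Stream the first 20 entries of an enormous directory without sorting.
ls -f /path/to/huge-dir | head -n 20
```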