
This is the only solution that worked for me: run `rm -Rf bigdirectory` several times. I had a directory with billions of subdirectories and files; I couldn't even run `ls`, `find`, or `rsync` in it, because they ran out of memory. `rm -Rf` itself quit many times (out of memory), each time deleting only part of the files, but after many retries it finally finished the job. This seems to be the only option when running out of memory is the problem. Commented Apr 9, 2014 at 13:01
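The retry-until-done approach described above can be automated with a small shell loop instead of rerunning the command by hand. This is a sketch, not part of the original comment; `bigdirectory` is a placeholder path, and the loop simply repeats `rm -Rf` until it exits successfully:

```shell
#!/bin/sh
# Keep running rm -Rf until it succeeds. Each failed pass still deletes
# a batch of entries before dying, so repeated runs eventually finish.
# Note: rm -Rf exits 0 once the directory no longer exists.
until rm -Rf bigdirectory 2>/dev/null; do
    echo "rm exited early (possibly out of memory); retrying..."
    sleep 1
done
echo "bigdirectory removed"
```

Because `rm -Rf` returns success when its target is already gone, the loop terminates cleanly once the last retry clears the directory.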