Linked Questions

13 votes
8 answers
73k views

I have a directory where lots of cached files are getting generated very quickly. Since these are very small files, it is consuming all my inodes very quickly. Currently I am running the following ...
asked by pradeepchhetri
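A minimal sketch of the usual approach to an inode-exhausted cache directory (not taken from the linked answers; the cache path and age cutoff are assumptions): check inode usage with df -i, then let find unlink the files so no huge shell glob is built.

    # How many inodes are in use on this filesystem?
    df -i /var/cache/myapp
    # Delete cached files older than 60 minutes, without expanding a glob
    find /var/cache/myapp -xdev -type f -mmin +60 -delete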
17 votes
5 answers
18k views

I have a folder with 266778 subfolders. How can I delete it? I have tried cd ~/.local/share/Trash/; sudo rm -rf *, but it takes a long time. After 1 minute 25 seconds real time and 0.072 seconds user ...
asked by Martin Thoma
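One commonly suggested alternative to globbing plus rm -rf (a sketch, not quoted from the linked answers) is to have find delete the entries itself, depth-first:

    # Remove everything inside the Trash directory, leaving the directory itself
    find ~/.local/share/Trash -mindepth 1 -delete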
2 votes
1 answer
2k views

I've been trying to figure out how to delete millions of files from a mounted NAS drive. The OS I'm accessing it from is RHEL 7.6. The directory is actively being written to, with tens or hundreds ...
asked by Logic Crypto
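Since the directory is still being written to, a cautious sketch (the NAS path and the 30-minute cutoff are assumptions, not from the linked answers) is to delete only files older than some threshold, so in-flight files are left alone:

    # Delete only files last modified more than 30 minutes ago
    find /mnt/nas/dir -type f -mmin +30 -delete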
0 votes
2 answers
312 views

In the folder /var/log/hive I guess we have a huge number of log files. I say that because if I run ls -l in this folder it gets stuck, and only Ctrl-C will exit, so I can't view all the files ...
asked by yael
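ls -l stalls here because it stats and sorts every entry before printing anything. A quicker peek (a sketch, not necessarily any of the linked answers) is to list unsorted and stop after a screenful:

    # -f disables sorting, so entries stream out immediately
    ls -f /var/log/hive | head -n 50
    # Or just count the entries
    ls -f /var/log/hive | wc -l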
1 vote
2 answers
415 views

I have a very large number of files to delete, which go by the following format: esymac_logEvents.log.5_2017-Feb-06_02-39-17.Z_2017-Feb-08_02-39-14.Z_2017-Feb-09_02-39-14.Z_2017-Feb-11_02-39-11....
asked by EXHALEXHALE
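A sketch of the standard pattern-based deletion; the glob below is an assumption based on the filename prefix shown in the excerpt:

    # Preview what would match first
    find . -maxdepth 1 -type f -name 'esymac_logEvents.log.*' -print | head
    # Then delete the matches
    find . -maxdepth 1 -type f -name 'esymac_logEvents.log.*' -delete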
1 vote
0 answers
161 views

I want to purge storage data from a SAN, which amounts to 10 TB. The OS is Linux. The storage is mounted as mount points on a system from which it is visible. I was recommended by a friend to use ...
asked by user3066819
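Since the data sits on several mount points, one hedged sketch (mount-point names are hypothetical, and whether parallel jobs help depends on the storage) is to run one idle-priority deletion per mount point:

    # One deletion job per mount point, at idle I/O priority
    for m in /mnt/san/vol1 /mnt/san/vol2; do
        ionice -c 3 find "$m" -xdev -mindepth 1 -delete &
    done
    wait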
74 votes
8 answers
160k views

I have a directory tree that I would like to shred with the Linux 'shred' utility. Unfortunately, shred has no -R option for recursive shredding. How can I shred an entire directory tree ...
asked by Steve V.
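A common workaround (a sketch, not necessarily any of the linked answers) is to let find drive shred over every regular file, then remove the emptied tree:

    # Overwrite and unlink (-u) every regular file in the tree
    find /path/to/tree -type f -exec shred -u {} +
    # Then remove the remaining directory structure
    rm -rf /path/to/tree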
44 votes
6 answers
30k views

I have a directory of 30 TB containing billions of files in it, which are formally all JPEG files. I am deleting each folder of files like this: sudo rm -rf bolands-mills-mhcptz. This command just runs and ...
asked by Junaid Farooq
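For a deletion that will run for days, a sketch of keeping the job alive and gentle on other I/O (the full path is an assumption; the flags are standard ionice/nohup/find):

    # Idle I/O priority, immune to hangups; only errors end up in the log
    ionice -c 3 nohup find /data/bolands-mills-mhcptz -delete >/var/tmp/delete.log 2>&1 &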
23 votes
4 answers
80k views

One program created lots of nested sub-folders. I tried to use the command rm -fr * to remove them all, but it's very slow. I'm wondering whether there is any faster way to delete them all.
asked by Lei Hao
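One frequently cited trick (a sketch; whether it is actually faster depends on the filesystem, and the target path is hypothetical) is to rsync an empty directory over the target with --delete:

    mkdir /tmp/empty
    # rsync removes everything under target/ to make it match the empty source
    rsync -a --delete /tmp/empty/ /path/to/target/
    rmdir /path/to/target /tmp/empty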
15 votes
6 answers
4k views

I have a filesystem with many small files that I erase regularly (the files are a cache that can easily be regenerated). It's much faster to simply create a new filesystem rather than run rm -rf or ...
asked by davidvandebunte
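A sketch of the "recreate the filesystem" route for a disposable cache (the device and mount point are hypothetical, and this destroys everything on the volume):

    umount /var/cache/scratch
    # DANGER: wipes the whole volume
    mkfs.ext4 -F -q /dev/vg0/scratch
    mount /dev/vg0/scratch /var/cache/scratch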
31 votes
2 answers
22k views

I don't understand iotop output: it shows ~1.5 MB/s of disk write (top right), but all programs have 0.00 B/s. Why? The video was taken as I was deleting the content of a folder with a few millions of ...
asked by Franck Dernoncourt
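One common explanation (hedged, not quoted from the linked answers) is that the metadata writeback caused by a mass delete is performed by kernel threads and shows up in the totals rather than under any user process. Two standard iotop flags make this easier to inspect:

    # -o: only show threads actually doing I/O; -a: accumulate totals since start
    sudo iotop -o -a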
12 votes
3 answers
33k views

I have a host that I can only access with sftp, scp, and rsync, but no ssh. I have a large tree of directories that I want to delete, but my sftp client apparently does not support recursive rms or ...
asked by user394
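If installing lftp locally is an option (an assumption, as are the host and path below), it can speak the sftp protocol and delete recursively without a remote shell:

    # Connect over sftp and remove the tree recursively
    lftp -e 'rm -r /path/to/tree; bye' sftp://user@host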
4 votes
3 answers
4k views

I have a dir with a gigantic number of very small files that I want to remove, and simply removing the dir with rm -rf /path/to/the/dir is already taking multiple days. It might sound strange that ...
asked by Eduardo J. Culpepper
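A sketch of one variation people try here (whether parallelism helps at all depends on the underlying storage; the path is the one from the excerpt):

    # Delete files in batches, several rm processes at a time
    find /path/to/the/dir -type f -print0 | xargs -0 -P 4 -n 1000 rm -f
    # The remaining empty directory tree is quick to remove
    rm -rf /path/to/the/dir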
2 votes
1 answer
14k views

I am trying to remove a large amount of mail (mostly "mail delivery failed" messages) from my server using rm -rf /home/*/mail/new/*, and I am getting -bash: /usr/bin/rm: Argument list too long. I tried using ...
asked by Luka
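The error comes from the shell expanding the second glob into more arguments than the kernel allows. A sketch that avoids the expansion (path taken from the excerpt):

    # /home/* expands only to the (few) home directories;
    # find enumerates the mail files itself, so no giant argument list is built
    find /home/*/mail/new -type f -delete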
5 votes
4 answers
869 views

I have a backup disk which contains hundreds of backups of the same machine from different dates. The backup was made with rsync and hardlinks, i.e. if a file doesn't change the backup script just ...
asked by student
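Not from the linked answers, but relevant context for hardlinked snapshots: removing one backup directory only frees the blocks of files whose last link lives there. A sketch (the snapshot path is hypothetical) of checking which files those are before deleting:

    # Files in this snapshot whose data is not shared with any other snapshot
    find /backups/2017-02-06 -type f -links 1 -print
    # Deleting the snapshot itself is an ordinary recursive remove
    rm -rf /backups/2017-02-06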
