9

I've got a command to find the biggest files in a particular folder, but in some directories it fails with an "Argument list too long" error. How do I fix this command so it works every time?

jbsmith:/tmp$ sudo du -hsx * | sort -rh | head -10
-bash: /usr/bin/sudo: Argument list too long
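
For context, the shell expands * to every name in the directory before sudo even runs, and if the combined length of those arguments exceeds the kernel's limit, the exec fails with exactly this error. A rough way to check (a sketch, assuming a Linux system with getconf, run from the directory in question):

getconf ARG_MAX            # kernel limit on the combined size of arguments and environment, in bytes
printf '%s\0' * | wc -c    # approximate size the glob would expand to (printf is a shell builtin, so it won't hit the limit)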

6 Answers

9

You could use find instead of globbing and do it like this:

sudo find . -maxdepth 1 ! -name "." -exec du -hsx {} + | sort -rh | head -10

assuming your find supports the + notation.

This will find everything under the current directory without descending deeper, and ignore the "." (thanks for that reminder @rudimeier!)

This will include all of the files in the current directory, like the glob you had originally. Unlike that glob, this will also find files that start with . (unless you were playing with your shell options to enable dotglob already).
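
As an aside, if your find also supports -mindepth (GNU find and the BSD finds do), a sketch of an equivalent form that drops the ! -name "." test:

sudo find . -mindepth 1 -maxdepth 1 -exec du -hsx {} + | sort -rh | head -10

The behaviour is the same: -exec ... + batches the arguments for du, so the "Argument list too long" limit is never hit.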

4
  • This doesn't work Commented Oct 25, 2016 at 18:28
  • @JoelSmith what doesn't work about it? Commented Oct 25, 2016 at 18:29
  • it returns nothing Commented Oct 25, 2016 at 18:30
  • @JoelSmith no errors or anything? oh, I did get an order backwards, see my update Commented Oct 25, 2016 at 18:32
8

I landed on this question while looking for a way to handle "argument list too long" with du. In my case I didn't want to filter the output but instead to get a total for all files that match a pattern. With the approaches in the other answers I could not get a grand total, as they end up calling du several times, each time with a subset of the arguments.

The solution was to use --files0-from= instead of passing filenames as arguments.

In the end this worked for me:

du -Lhsc --files0-from=<(find -L -maxdepth 2 \( -name "*.gz" -o -name "*.xz" \) -print0)
  • -L follow symlinks both in finding and size calculation.
  • -c get the cumulative total
  • <() process substitution to create a file on the fly
  • -print0 to match du's expectation
  • \( \) to be able to use -o with two -name args

In the same way the answer to the above question could be:

du -hsx --files0-from=<(find -maxdepth 1 ! -name "." -print0) | sort -rh | head -10
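
If you'd rather avoid the process substitution, GNU du also accepts - for --files0-from to read the NUL-separated list from stdin, so a sketch of the same thing as a plain pipe:

find . -maxdepth 1 ! -name "." -print0 | du -hsx --files0-from=- | sort -rh | head -10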
3
  • I wish BSD du had --files0-from=, but unfortunately this only works with gnu du Commented Jun 21, 2022 at 23:55
  • Or in two lines, without brackets or backslashes: find ROOT_FOLDER -name "PATTERN*" -print0 > files followed by du -hsc --files0-from=files | tail -1 Commented May 30, 2024 at 10:09
  • You may prefer find /path -print0 | du --files0-from=- to read directly from stdin. Commented Oct 3 at 3:36
2

With recent versions of GNU coreutils, you can use the --max-depth option (-d) instead of enumerating the files with *. This way you don't risk running into the command line length limit if there are too many files. There is no --min-depth, so the top-level directory itself will be listed at the end; the head -n -1 below strips that line.

du -x -d 1 | head -n -1 | sort -rn | head -n 10
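
If you want human-readable sizes like in the original command, a variant (still a sketch, assuming GNU coreutils for du -h, head -n -1 and sort -h):

du -xh -d 1 | head -n -1 | sort -rh | head -n 10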
1

It is a common problem that command lines are limited in both size and number of arguments. A common way to get around these limits is to use find / xargs pipes.

Your case should work like this:

sudo sh -c "find . -mindepth 1 -maxdepth 1 -print0 | xargs -0 du -hsx --" | sort -rh | head

The sudo requirement makes it look a bit trickier than usual: the find | xargs part has to run inside a single sudo sh -c so that both find and du run as root.
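
If you ever need to cap how many paths each du invocation receives, xargs can also batch explicitly with -n (a sketch; 500 is an arbitrary batch size, not a value from this thread):

sudo sh -c "find . -mindepth 1 -maxdepth 1 -print0 | xargs -0 -n 500 du -hsx --" | sort -rh | head

The per-entry summaries stay correct across batches because du -s reports each argument separately; only a -c grand total would be split.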

4
  • This doesn't work Commented Oct 25, 2016 at 18:29
  • Updated , regarding sudo. What else does not work? Commented Oct 25, 2016 at 18:32
  • @rudimeier It doesn't work because du has an argument list limit of 2539. It will wrap around after xargs builds arguments greater than du's limit. For example, try creating 2540 files in a directory and see if your command accurately totals the size. Commented Oct 3 at 3:41
  • @mvanle Seems that either your xargs or du is not the best implementation. What system are you working on, or what is the version of your du? However, you could pass option -n to xargs to set the limit explicitly: ... |xargs -0 -n 2539 du ... Commented Oct 15 at 14:55
1
sudo ls | sudo parallel -j1 du -hsx | sort -rh | head -10
-1

You can work around this with find, if you know the floor for the file size you want to see:

find /tmp -size +1G -type f
1
  • That's not what I want Commented Oct 25, 2016 at 18:28
