14

If parsing the output of ls is dangerous because it can break on some funky characters (spaces, \n, ...), what's the best way to know the number of files in a directory?

I usually rely on find to avoid this parsing, but similarly, find mydir | wc -l will break for the same reasons.

I'm working on Solaris right now, but I'm looking for an answer that is as portable across different unices and different shells as possible.

  • I'm not sure it's a duplicate, am I missing something? Commented Oct 17, 2014 at 11:32
  • This might be a duplicate, but not of the question indicated. find will get you the number of files recursively (use -maxdepth 1 if you don't want that). find mydir -maxdepth 1 -type f -printf \\n | wc -l should handle the special characters in the filename, as they are never printed in the first place. Commented May 6, 2015 at 6:20
  • Looks like a duplicate of How can I get a count of files in a directory using the command line? Commented Dec 29, 2022 at 16:03

10 Answers

22

How about this trick?

find . -maxdepth 1 -exec echo \; | wc -l

As portable as find and wc.
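For illustration, here is the trick on a hypothetical directory containing two awkwardly named files; the extra line in the count comes from the . entry itself, as discussed in the comments below:

$ mkdir demo && touch 'demo/a b' 'demo/c
d'
$ find demo -maxdepth 1 -exec echo \; | wc -l
3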

  • This doesn't work (it displays n+1 files on my Debian system). It also doesn't filter for regular files. Commented Dec 23, 2011 at 13:35
  • I just gave a generic example. It does work, but how it works depends on how you adapt the find command to your specific needs. Yes, this one includes all the directories, including . (which might be why you see the result as n+1). Commented Dec 23, 2011 at 15:35
  • I like this trick, very clever; but I'm surprised there's no simple straightforward way to do that! Commented Dec 23, 2011 at 16:11
  • @ChrisDown the OP doesn't specify filtering for regular files, he asks for the number of files in a directory. To get rid of the n+1 issue, use find . -maxdepth 1 ! -name . -exec echo \; | wc -l; some older versions of find do not have -not. Commented Dec 23, 2011 at 16:50
  • Note that -maxdepth is not standard (a GNU extension now also supported by a few other implementations). Commented May 9, 2015 at 15:10
14

In bash, without external utilities or loops:

shopt -s dotglob
files=(*)
echo "${#files[@]}"

In ksh, replace shopt -s dotglob with FIGNORE=.?(.).
In zsh, replace it with setopt glob_dots, or remove the shopt call and use files=(*(D)). (Or just drop the line if you don't want to include dot files.)
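For reference, here is the zsh array spelling with the N (nullglob) qualifier mentioned in a comment below, which also behaves correctly for an empty directory:

files=(*(DN))
echo "${#files[@]}"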

Portably, if you don't care about dot files:

set -- *
echo "$#"

If you do want to include dot files:

set -- *
if [ -e "$1" ]; then c=$#; else c=0; fi
set -- .[!.]*
if [ -e "$1" ]; then c=$((c+$#)); fi
set -- ..?*
if [ -e "$1" ]; then c=$((c+$#)); fi
echo "$c"
  • The first example prints 1 for an empty directory when nullglob is not enabled. In zsh, a=(*(DN)); echo ${#a} with the N (nullglob) qualifier does not result in an error for an empty directory. Commented May 11, 2016 at 23:17
10
find . ! -name . -prune -print | grep -c /

Should be fairly portable to post-80s systems.

That counts all the directory entries except . and .. in the current directory.

To count files in subdirectories as well:

find .//. ! -name . | grep -c //

(that one should be portable even to Unix V6 (1975), since it doesn't need -prune)
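The same idea applied to an arbitrary directory, using the dir/. form suggested in the comments below so that a subdirectory sharing the directory's name cannot interfere (/some/dir is a placeholder):

find /some/dir/. ! -name . -prune -print | grep -c /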

  • One of the rare portable answers on this page, if not the only one. Commented Aug 15, 2017 at 18:11
  • I upvoted this answer yesterday as I found it also works well for directories other than the current directory (find dirname ! -name dirname -prune -print). I have since been wondering if there's any particular reason to use grep -c / instead of wc -l (which is probably more commonly used for counting). Commented Nov 1, 2018 at 15:11
  • find dirname ! -name dirname doesn't work if there are other directories within that are named dirname. It's better to use find dirname/. ! -name .. As for wc -l: it counts the number of lines, but file names can be made of several lines, as the newline character is as valid as any other in a file name. Commented Nov 1, 2018 at 15:14
7

ls

Try:

ls -b1A | wc -l

The -b escapes non-printable characters, -A shows all files except . and .., and -1 prints one entry per line (the default when output goes to a pipe, but good to be explicit).
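A quick demonstration with GNU ls (hypothetical file names, one containing a space and one containing a newline):

$ touch 'a b' 'c
d'
$ ls -b1A
a\ b
c\nd
$ ls -b1A | wc -l
2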

Python

As long as we're including higher-level scripting languages, here's a one-liner in Python:

python -c 'import os; print(len(os.listdir(os.sep)))'

Or, emulating a full recursive find:

python -c 'import os; print(len([j for i in os.walk(os.sep) for j in i[1]+i[2]]))'
  • os.sep??? That's /, i.e. the root dir. Did you mean ".", for the current directory? You can actually remove the argument entirely (os.listdir()) and it'll automatically use the current dir. Commented Sep 3 at 12:56
  • Don't build a list just to get its len; use sum instead. Also, you can use unpacking instead of indexing into i. So: python -c 'import os; print(sum(len(d+f) for (_p, d, f) in os.walk(".")))' Commented Sep 3 at 13:02
  • Oops, os.walk() requires an argument. Still, os.listdir() doesn't. Commented Sep 3 at 13:04
  • I submitted a suggestion to have os.walk() default to the current dir :) Commented Sep 3 at 13:28
  • Oh, even better is to avoid creating a new list unnecessarily: sum(len(d)+len(f) ... Commented Sep 6 at 0:53
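Folding in the fixes from the comments above, hedged sketches that count in the current directory rather than at os.sep (the root):

python -c 'import os; print(len(os.listdir(".")))'
python -c 'import os; print(sum(len(d) + len(f) for _, d, f in os.walk(".")))'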
2

The simplest version, which I use all the time and have never had problems with, is: ls -b1 | wc -l

  • You might run into problems if the file name contains a \n or other funky chars (yeah, certain unices allow this). Commented Aug 9, 2017 at 14:43
  • I tried this explicitly before posting my answer and had no problems with it. I used the Nautilus file manager to rename a file to contain \n to try this. Commented Aug 15, 2017 at 14:32
  • You're right, it doesn't work like that. I don't know what I did when I tested this first. I tried again and updated my answer. Commented Aug 15, 2017 at 14:39
  • No, the command is OK, but there is already a similar solution, and hidden files are not counted. Commented Aug 15, 2017 at 17:54
  • ls -1 | wc -l works fine on OpenBSD ksh. Commented May 20, 2023 at 20:03
1

You can use a construction like this:

I=0; for i in * ; do ((I++)); done ; echo $I

But I'm afraid you may hit an Argument list too long error if you have too many files in the directory. However, I tested it on a directory with 10 billion files, and it worked well.
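A more portable spelling of the same loop (a sketch: the (( )) arithmetic syntax is a bash/ksh extension, and the [ -e ] guard keeps an empty directory, where the unexpanded * is left as a literal word, from counting as one file):

c=0
for f in *; do
  [ -e "$f" ] && c=$((c+1))
done
echo "$c"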

  • This won't work for hidden files either, unless the shell is configured to expand those with *. Commented Dec 23, 2011 at 11:53
  • GNU: find . -maxdepth 1 -type f | wc -l Commented Dec 23, 2011 at 14:41
  • @Rush: this command should never raise "arg list too long". That only happens with external commands (so never with for). Commented Dec 23, 2011 at 16:16
1

Have you considered perl, which should be relatively portable?

Something like:

use strict;
use warnings;
use File::Find;

# Directories to count files in; "." here is just an illustrative default.
my @directories_to_search = ('.');

my $counter = 0;

sub wanted {
    -f && ++$counter;    # count regular files only
}

find( \&wanted, @directories_to_search );
print "$counter\n";
0

Try this, using ls with the -i (inode number) and -F (append / to directory names) options:

ls -ilF | egrep -v '/' | wc -l
0

With a perl one-liner (reformatted for readability):

perl -e 'opendir($dh, ".");
         while ( readdir($dh) ) {$count++};
         closedir $dh;
         print "$count\n";'

or

perl -e 'opendir($dh, ".");
         @files = readdir($dh);
         closedir $dh;
         print $#files+1,"\n";'

You can use perl functions that operate on lists, such as grep or map, with the second version. See perldoc -f readdir for an example using grep.
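For instance, a sketch using grep in scalar context to exclude the . and .. entries from the count (reformatted for readability, like the one-liners above):

perl -e 'opendir($dh, ".");
         print scalar(grep { !/^\.\.?$/ } readdir($dh)), "\n";
         closedir $dh;'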

0

In addition to the find-based answer proposed by Stéphane, here is a POSIX-compliant answer based on ls:

ls -qf | tail -n +3 | wc -l

The -q option keeps each entry on exactly one line (non-printable characters, newlines included, are printed as ?), and -f lists all entries, including . and .., without sorting; the tail -n +3 then skips those two.
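Note that because -f does not sort, strictly speaking nothing guarantees that . and .. are the first two lines that tail skips (in practice they almost always are). Since -A is also POSIX and excludes . and .. by itself, a variant of the same idea (a sketch) avoids that assumption:

ls -Aq | wc -l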
