
I'm trying to check if a file exists, but with a wildcard. Here is my example:

if [ -f "xorg-x11-fonts*" ]; then
    printf "BLAH"
fi

I have also tried it without the double quotes.

Two bugs with your code: (1) The asterisk has to be outside the double quotes (a quoted asterisk loses its special wildcard meaning), and (2) if multiple files match the pattern, multiple arguments will be passed to the [ command, most likely causing [ to exit with an error and therefore be interpreted as no files matching. Commented Jun 17, 2011 at 6:31

21 Answers


For Bash scripts, the most direct and performant approach is:

if compgen -G "${PROJECT_DIR}/*.png" > /dev/null; then
    echo "pattern exists!"
fi

This will work very speedily even in directories with millions of files and does not involve a new subshell.
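Applied to the question's pattern, it might look like this (a sketch; the file names touched here are made up for the demo):

```shell
# Work in a throwaway directory so nothing real is disturbed.
tmpdir=$(mktemp -d)
cd "$tmpdir" || exit 1

# Hypothetical file names matching the question's pattern.
touch xorg-x11-fonts-core xorg-x11-fonts-misc

# compgen -G succeeds if the glob matches at least one name.
if compgen -G "xorg-x11-fonts*" > /dev/null; then
    printf "BLAH"
fi
```

Note that compgen is a bash builtin, so this will not work in plain POSIX sh.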



The simplest should be to rely on ls return value (it returns non-zero when the files do not exist):

if ls /path/to/your/files* 1> /dev/null 2>&1; then
    echo "files do exist"
else
    echo "files do not exist"
fi

I redirected the ls output to make it completely silent.


Here is an optimization that also relies on glob expansion, but avoids the use of ls:

for f in /path/to/your/files*; do

    ## Check if the glob gets expanded to existing files.
    ## If not, f here will be exactly the pattern above
    ## and the exists test will evaluate to false.
    [ -e "$f" ] && echo "files do exist" || echo "files do not exist"

    ## This is all we needed to know, so we can break after the first iteration
    break
done

This is very similar to grok12's answer, but it avoids the unnecessary iteration through the whole list.
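If the check is needed in several places, the same idea can be wrapped in a small helper (a sketch; the function name is made up). The glob is expanded by the calling shell; when nothing matches, the literal pattern is passed through, and the -e test rejects it:

```shell
# Succeed (return 0) if at least one argument names an existing file.
glob_exists() {
    for f in "$@"; do
        [ -e "$f" ] && return 0
    done
    return 1
}

# The caller's shell expands (or fails to expand) the glob:
if glob_exists /path/to/your/files*; then
    echo "files do exist"
else
    echo "files do not exist"
fi
```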


18 Comments

A word of warning: In the Debian Almquist Shell (dash) — installed at /bin/sh in Debian and Ubuntu — &> seems to discard the exit code and that breaks this solution. A workaround is to redirect with > /dev/null 2>&1 instead.
ls can be quite slow on a directory with many files (probably due to sorting). You may want to turn off sorting with -U, at least.
@CostiCiudatu have you checked how that alternative works when there are spaces in the directory name? Wouldn't e.g. for f in /path/to/your files* be interpreted as two arguments, /path/to/your and files*? I've tried putting double-quotes around, but that didn't work out (never finds a file, even if there's one).
@Izzy, you are supposed to put that in double quotes, but leave the * outside: for f in "/path/to/your files"* should work.
Warning. This will fail if the directory contains a lot of files. The expansion of "*" will exceed the command line length limit.

If your shell has a nullglob option and it's turned on, a wildcard pattern that matches no files will be removed from the command line altogether. This will make ls see no pathname arguments, list the contents of the current directory and succeed, which is wrong. GNU stat, which always fails if given no arguments or an argument naming a nonexistent file, would be more robust. Also, the &> redirection operator is a bashism.
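The nullglob hazard described above is easy to reproduce (a sketch; assumes bash):

```shell
cd "$(mktemp -d)"        # an empty scratch directory
touch real-file.txt

shopt -s nullglob
# The unmatched pattern is removed entirely, so ls runs with no
# arguments, lists the current directory, and succeeds anyway.
ls no-such-prefix-* && echo "ls (wrongly) reported success"
shopt -u nullglob        # restore the default
```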

if stat --printf='' /path/to/your/files* 2>/dev/null
then
    echo found
else
    echo not found
fi

Better still is GNU find, which can handle a wildcard search internally and exit as soon as it finds one matching file, rather than waste time processing a potentially huge list of them expanded by the shell; this also avoids the risk that the shell might overflow its command line buffer.

if test -n "$(find /dir/to/search -maxdepth 1 -name 'files*' -print -quit)"
then
    echo found
else
    echo not found
fi

Non-GNU versions of find might not have the -maxdepth option used here to make find search only the /dir/to/search instead of the entire directory tree rooted there.

7 Comments

Letting find handle the wildcard is best because bash, as it expands the pattern, tries to sort the list of the matching file names, which is wasteful and can be expensive.
@musiphil Launching an external process such as find is even more wasteful if there are only a few files (or none) in the directory.
@dolmen: You are right. I guess it all depends on the situation; on the other hand, if there are a huge number of files, the wildcard expansion of bash can take more time than launching find.
The find command creates an ugly error message if no files are found: find: '/dir/to/search': No such file or directory ; You can suppress this with -quit 1> /dev/null 2>&1
@dolmen: Running find with -quit as described in @flabdablet's post will not suffer from a huge number of files, because it quits as soon as it finds the first match and thus will not list all files. So it is not as big a waste of resources as you suggest. Moreover, find doesn't simply "expand" the wildcard as the shell does, but checks each file it finds against the pattern to see if it is a match, so it doesn't fail for a huge number of files.

Use:

files=(xorg-x11-fonts*)

if [ -e "${files[0]}" ];
then
    printf "BLAH"
fi

5 Comments

You should add unsetopt nomatch if zsh reports errors.
and shopt -s nullglob for bash
It should perhaps be pointed out more clearly that using an array makes this decidedly non-POSIX sh.
@nhed: yes, if only to detect a pathological case of a file named xorg-x11-fonts*. Otherwise, -e takes care of the check (the unexpanded xorg-x11-fonts* will fail the test).
The test should be ${files[0]-}, which is exactly same as ${files-}. Otherwise, if stars align in an unlucky way such that both set -u and shopt -s nullglob are in effect, the expansion would fail.

You can do the following:

set -- xorg-x11-fonts*
if [ -f "$1" ]; then
    printf "BLAH"
fi

This works with sh and derivatives: KornShell and Bash. It doesn't create any subshell. The $(..) and `...` constructs used in other solutions create a subshell: they fork a process, and they are inefficient. Of course it works with several files, and this solution can be the fastest, or second fastest.

It also works when there are no matches, so there is no need for nullglob, as one of the commenters suggests: in that case $1 contains the original, unexpanded pattern, and the test -f "$1" fails because no file of that name exists.
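The no-match behavior is easy to verify (a sketch; run in an empty scratch directory so the pattern cannot match):

```shell
cd "$(mktemp -d)"   # an empty directory: the glob cannot match anything

set -- no-such-prefix-*
echo "$1"           # the literal, unexpanded pattern: no-such-prefix-*

if [ -f "$1" ]; then printf "BLAH"; else echo "no match"; fi
```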

7 Comments

The most portable solution!
Alas, it doesn't work when there are no matches. $1 will contain the original test name, including the *. You could set "nullglob" in bash so it WILL blank out. That's not portable, though :)
Chris, when there isn't a match, $1 will contain the original test name, including the * as you say. Then the test [ -f "$1" ] won't be successful because the file "*" doesn't exist. Therefore you don't need nullglob, or other tricks. It is 100% portable.
Tried with zsh as shell. It works if typed as a command, but fails with 'no matches found' when called from a zsh script. (Strangely)
jmary: Which set command did you use? Some special character?
for i in xorg-x11-fonts*; do
  if [ -f "$i" ]; then printf "BLAH"; fi
done

This will work with multiple files and with white space in file names.

4 Comments

It will print multiple "BLAH" if there are multiple matches. Maybe add a break to exit the loop after the first match.
This (with @tripleee's break) gets my vote. By using only native globbing and the file test operator, it avoids even raising the question of corner cases that comes with using commands like ls or find or from forwarding globs. I think it is free of all the issues, like names with blanks, nullglob setting and bashisms, that were raised for some other answers. I made a function of it: existsAnyFile () { for file; do [ -f "$file" ] && return 0; done; false; }
Note this gets a stat failure if xorg-x11-fonts* does not exist, which is probably not what you want.
As a function: exists() { for f in $@; do [ -f "$f" ]; return $?; done; } Usage: exists .changeset/*.md && echo Yes || echo No

The solution:

files=$(ls xorg-x11-fonts* 2> /dev/null | wc -l)
if [ "$files" != "0" ]
then
   echo "Exists"
else
    echo "None found."
fi

> Exists

5 Comments

In my shell (zsh) it works if there is only one match to the glob, otherwise it expands all the files and the test fails (too many arguments.)
Update my code. I'm sure this works, I just installed zsh and tested.
ls can be quite slow on a directory with many files (probably due to sorting). You may want to turn off sorting with -U, at least.
If the globbing matches a directory name, ls will spit out the contents of that directory, which may cause false positives.
Running ls and wc require to launch two external programs. Not efficient at all.

Use:

if [ "`echo xorg-x11-fonts*`" != "xorg-x11-fonts*" ]; then
    printf "BLAH"
fi

6 Comments

This is the simplest, easiest and most elegant answer that actually works!
@SergeStroobandt Not sure I agree. The command substitution may be necessary here, but it tickles my cringe reflex.
yea... like what if the file with the literal name xorg-x11-fonts\* exists?
Not elegant at all because it forks a sub shell to run the echo command.
I think this solution is good when looking for files with extensions *.MIF

The Bash code I use:

if ls /syslog/*.log > /dev/null 2>&1; then
   echo "Log files are present in /syslog/"
fi

2 Comments

An explanation would be in order. E.g., what is the principle of operation? What are the performance characteristics (process start overhead, memory, scalability, etc.)? Please respond by editing (changing) your answer, not here in comments (without "Edit:", "Update:", or similar - the answer should appear as if it was written today).
Yeah, whatever, picky, picky, need performance, principles. This worked on positive and negative cases; others, not so much. Caveat: zsh on macOS.

The PowerShell way, which treats wildcards differently: you put them inside the quotes, like so:

If (Test-Path "./output/test-pdf-docx/Text-Book-Part-I*"){
  Remove-Item -force -v -path ./output/test-pdf-docx/*.pdf
  Remove-Item -force -v -path ./output/test-pdf-docx/*.docx
}

I think this is helpful because the concept of the original question covers "shells" in general not just Bash or Linux, and would apply to PowerShell users with the same question too.

2 Comments

even more relevant now that pwsh is a thing.
The following year it was tagged with "sh" - "sh is the standard Unix shell since Version 7 Unix.". Perhaps protest and/or provide arguments in comments to the question?

Strictly speaking, if you only want to print "Blah", here is the solution:

find . -maxdepth 1 -name 'xorg-x11-fonts*' -printf 'BLAH' -quit

Here is another way:

doesFirstFileExist(){
    test -e "$1"
}

if doesFirstFileExist xorg-x11-fonts*
then printf "BLAH"
fi

But I think the most efficient is as follows, because it won't try to sort file names. Note the test must be -n (not -z), since we want "BLAH" when a match is found, and the command substitution should be quoted:

if [ -n "$(find . -maxdepth 1 -name 'xorg-x11-fonts*' -printf 1 -quit)" ]
then
     printf "BLAH"
fi

1 Comment

You can also use -exec option of find like this: find . -maxdepth 1 -name '*.o' -exec rm {} \;

Here's a solution for your specific problem that doesn't require for loops or external commands like ls, find and the like.

if [ "$(echo xorg-x11-fonts*)" != "xorg-x11-fonts*" ]; then
    printf "BLAH"
fi

As you can see, it's just a tad more complicated than what you were hoping for, and relies on the fact that if the shell is not able to expand the glob, it means no files with that glob exist and echo will output the glob as is, which allows us to do a mere string comparison to check whether any of those files exist at all.

If we were to generalize the procedure, though, we should take into account the fact that files might contain spaces within their names and/or paths and that the glob char could rightfully expand to nothing (in your example, that would be the case of a file whose name is exactly xorg-x11-fonts).

This could be achieved by the following function, in bash.

function doesAnyFileExist {
   local arg="$*"
   local files=($arg)
   [ ${#files[@]} -gt 1 ] || { [ ${#files[@]} -eq 1 ] && [ -e "${files[0]}" ]; }
}

Going back to your example, it could be invoked like this.

if doesAnyFileExist "xorg-x11-fonts*"; then
    printf "BLAH"
fi

Glob expansion should happen within the function itself for it to work properly. That's why I put the argument in quotes, and that's what the first line in the function body is there for: any multiple arguments (which could be the result of a glob expansion outside the function, as well as a spurious parameter) are coalesced into one. Another approach could be to raise an error if there's more than one argument; yet another could be to ignore all but the 1st argument.

The second line in the function body sets the files var to an array constituted by all the file names that the glob expanded to, one for each array element. It's fine if the file names contain spaces, each array element will contain the names as is, including the spaces.

The third line in the function body does two things:

  1. It first checks whether there's more than one element in the array. If so, it means the glob surely got expanded to something (due to what we did on the 1st line), which in turn implies that at least one file matching the glob exists, which is all we wanted to know.

  2. If at step 1 we discovered that we got fewer than 2 elements in the array, then we check whether we got one, and if so we check whether that one exists, the usual way. We need this extra check to account for function arguments without glob chars, in which case the array contains only one, unexpanded, element.

4 Comments

This is inefficient because $(..) launches a sub-shell.
@dolmen a sub-shell is just a process like any other. The accepted answer launches the ls command, which for all intents and purposes is as efficient (or inefficient) as a sub-shell is.
I've never written that the accepted answer is better and that I would have accepted if I had been the submitter.
Notice that the generalized method I explain in this very same answer doesn't use any subshell at all.

I found a couple of neat solutions worth sharing. The first still suffers from the "this will break if there are too many matches" problem:

pat="yourpattern*" matches=($pat) ; [[ "$matches" != "$pat" ]] && echo "found"

(Recall that if you use an array without the [ ] syntax, you get the first element of the array.)
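That is, $matches is shorthand for ${matches[0]}; a minimal bash illustration:

```shell
arr=(first second third)

echo "$arr"        # same as ${arr[0]}: prints "first"
echo "${arr[0]}"   # prints "first"
```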

If you have "shopt -s nullglob" in your script, you could simply do:

matches=(yourpattern*) ; [[ "$matches" ]] && echo "found"

Now, if it's possible to have a ton of files in a directory, you're pretty much stuck with using find:

find /path/to/dir -maxdepth 1 -type f -name 'yourpattern*' | grep -q '.' && echo 'found'

Comments


I use this:

filescount=`ls xorg-x11-fonts* 2> /dev/null | awk 'END { print NR }'`
if [ "$filescount" -gt 0 ]; then
    blah
fi

2 Comments

wc -l is more efficient than awk for this task.
Counting the number of results is an antipattern anyway. Usually you simply want to see whether ls returned success or not (or better yet avoid ls too and use the shell's built-in functionality).

Using new fancy shmancy features in KornShell, Bash, and Z shell (this example doesn't handle spaces in filenames):

# Declare a regular array (-A will declare an associative array. Kewl!)
declare -a myarray=( /mydir/tmp*.txt )
array_length=${#myarray[@]}

# Not found if the first element of the array is the unexpanded string
# (ie, if it contains a "*")
if [[ ${myarray[0]} =~ [*] ]] ; then
   echo "No files found"
elif [ $array_length -eq 1 ] ; then
   echo "File was found"
else
   echo "Files were found"
fi

for myfile in ${myarray[@]}
do
  echo "$myfile"
done

Yes, this does smell like Perl. I am glad I didn't step in it ;)

Comments


IMHO it's better to always use find when testing for files, globs or directories. The stumbling block in doing so is find's exit status: 0 if all paths were traversed successfully, >0 otherwise. Whether the expression you passed to find matched anything is not reflected in its exit code.

The following example tests if a directory has entries:

$ mkdir A
$ touch A/b
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . && echo 'not empty'
not empty

When A has no files grep fails:

$ rm A/b
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . || echo 'empty'
empty

When A does not exist grep fails again because find only prints to stderr:

$ rmdir A
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . && echo 'not empty' || echo 'empty'
find: 'A': No such file or directory
empty

Replace -not -empty by any other find expression, but be careful if you -exec a command that prints to stdout. You may want to grep for a more specific expression in such cases.

This approach works nicely in shell scripts. The original question was to look for the glob xorg-x11-fonts*:

if find -maxdepth 0 -name 'xorg-x11-fonts*' -print | head -n1 | grep -q .
then
    : the glob matched
else
    : ...not
fi

Note that the else branch is reached if xorg-x11-fonts* did not match, or if find encountered an error. To distinguish the cases, use $?.

1 Comment

You probably meant -maxdepth 1 when using -name, since -maxdepth 0 will look at the current directory and not its contents.

If there is a huge number of files on a network folder, using the wildcard is questionable (speed, or overflowing the command-line argument limit).

I ended up with:

if [ -n "$(find somedir/that_may_not_exist_yet -maxdepth 1 -name \*.ext -print -quit)" ] ; then
  echo Such file exists
fi

Comments

if [ `ls path1/* path2/* 2> /dev/null | wc -l` -ne 0 ]; then echo ok; else echo no; fi

2 Comments

This looks like a grab bag of "don't do that" shell antipatterns. Don't use ls in scripts, don't check if the word count is zero, watch out for pretzel logic with backticks.
An explanation would be in order. E.g., what is the idea/gist? Please respond by editing (changing) your answer, not here in comments (without "Edit:", "Update:", or similar - the answer should appear as if it was written today).

Try this

fileTarget="xorg-x11-fonts*"

filesFound=$(ls $fileTarget)

case ${filesFound} in
  "" ) printf "NO files found for target=${fileTarget}\n" ;;
   * ) printf "FileTarget Files found=${filesFound}\n" ;;
esac

Test

fileTarget="*.html"  # Where I have some HTML documents in the current directory

FileTarget Files found=Baby21.html
baby22.html
charlie  22.html
charlie21.html
charlie22.html
charlie23.html

fileTarget="xorg-x11-fonts*"

NO files found for target=xorg-x11-fonts*

Note that this only works in the current directory, or where the variable fileTarget includes the path you want to inspect.

7 Comments

Your code will fail if fileTarget contains whitespace (e.g., fileTarget="my file*").
@RichardHansen what the solution when there is whitespace?
@Ross: Use the accepted answer: if ls "my file"* >/dev/null 2>&1; then ...
@RichardHansen thanks, sorry – not working for me. Have it fixed now.
@Ross, I've added an edit to mine that should work with files with spaces. Basically case "${filesFound}" in .... . Good luck to all.

You can also cut other files out

if [ -e $( echo $1 | cut -d" " -f1 ) ] ; then
   ...
fi

1 Comment

This would be slow because of the subshell. And what if the file name contains a space?

Use:

if ls -l  | grep -q 'xorg-x11-fonts.*' # grep needs a regex, not a shell glob
then
     # do something
else
     # do something else
fi

3 Comments

No, don't use ls in scripts, and the .* wildcard is redundant (you probably meant grep -q '^xorg-x11-fonts').
While this code may answer the question, providing additional context regarding why and/or how this code answers the question improves its long-term value.

man test.

if [ -e file ]; then
...
fi

will work for directory and file.

3 Comments

This will not work with wildcards (which is what is asked in this question). If it matches more than one file you will get bash: [: too many arguments
A little unfair, as this works very well on Solaris...
heh, old post, thanks for the support Chris - i was indeed working with Solaris back then as well.
