
Check if a file exists with a wildcard in a shell script [duplicate]

This question already has answers here: Test whether a glob has any matches in Bash (22 answers) Closed 4 years ago.

I'm trying to check if a file exists, but with a wildcard. Here is my example:

if [ -f "xorg-x11-fonts*" ]; then
    printf "BLAH"
fi

I have also tried it without the double quotes.


For Bash scripts, the most direct and performant approach is:

if compgen -G "${PROJECT_DIR}/*.png" > /dev/null; then
    echo "pattern exists!"
fi

This will work very speedily even in directories with millions of files and does not involve a new subshell.

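Applied to the pattern from the question, this can be wrapped in a small helper (the function name glob_exists is my own, purely for illustration):

glob_exists() {
    # compgen -G prints the names matching the glob and returns
    # non-zero when there is no match; only the status matters here.
    compgen -G "$1" > /dev/null
}

if glob_exists "xorg-x11-fonts*"; then
    printf "BLAH"
fi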


The simplest approach is to rely on the ls return value (it returns non-zero when the files do not exist):

if ls /path/to/your/files* 1> /dev/null 2>&1; then
    echo "files do exist"
else
    echo "files do not exist"
fi

I redirected the ls output to make it completely silent.


Here is an optimization that also relies on glob expansion, but avoids the use of ls:

for f in /path/to/your/files*; do

    ## Check if the glob gets expanded to existing files.
    ## If not, f here will be exactly the pattern above
    ## and the exists test will evaluate to false.
    [ -e "$f" ] && echo "files do exist" || echo "files do not exist"

    ## This is all we needed to know, so we can break after the first iteration
    break
done

This is very similar to grok12's answer, but it avoids the unnecessary iteration through the whole list.


If your shell has a nullglob option and it's turned on, a wildcard pattern that matches no files will be removed from the command line altogether. This will make ls see no pathname arguments, list the contents of the current directory and succeed, which is wrong. GNU stat, which always fails if given no arguments or an argument naming a nonexistent file, would be more robust. Also, the &> redirection operator is a bashism.

if stat --printf='' /path/to/your/files* 2>/dev/null
then
    echo found
else
    echo not found
fi
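To see the nullglob pitfall described above in isolation, here is a minimal sketch (assuming the pattern matches nothing):

shopt -s nullglob
ls /path/to/your/files*    # the unmatched pattern is removed, so ls
                           # lists the current directory and exits 0
shopt -u nullglob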

Better still is GNU find, which can handle a wildcard search internally and exit as soon as it finds one matching file, rather than waste time processing a potentially huge list of them expanded by the shell; this also avoids the risk that the shell might overflow its command-line buffer.

if test -n "$(find /dir/to/search -maxdepth 1 -name 'files*' -print -quit)"
then
    echo found
else
    echo not found
fi

Non-GNU versions of find might not have the -maxdepth option used here to make find search only the /dir/to/search instead of the entire directory tree rooted there.
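On such systems a commonly used workaround is the -prune idiom; here is a sketch under that assumption (note that -quit is also a GNU extension, so head is used instead to stop after the first match):

# Start at /dir/to/search/. and prune everything below it, so only the
# directory's immediate children are tested against the name pattern.
if test -n "$(find /dir/to/search/. ! -name . -prune -name 'files*' -print | head -n 1)"
then
    echo found
else
    echo not found
fi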


Use:

files=(xorg-x11-fonts*)

if [ -e "${files[0]}" ];
then
    printf "BLAH"
fi


You can do the following:

set -- xorg-x11-fonts*
if [ -f "$1" ]; then
    printf "BLAH"
fi

This works with sh and its derivatives, KornShell and Bash, and it doesn't create any subshell. The $(..) and `...` constructs used in other solutions do create a subshell: they fork a process, which is inefficient. Of course it also works with several files, and this solution can be the fastest, or second to the fastest one.

It also works when there aren't any matches. Contrary to what one commenter suggested, there is no need to use nullglob: in that case $1 will contain the original pattern, so the test -f "$1" will not succeed, because no file with that name exists.
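A quick sketch of the behaviour described above:

set -- xorg-x11-fonts*
echo "$#"    # 1 even when nothing matches: the literal pattern remains
echo "$1"    # either the first matching file or the pattern itself
if [ -f "$1" ]; then
    echo "at least one match"
else
    echo "no match"
fi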


for i in xorg-x11-fonts*; do
  if [ -f "$i" ]; then printf "BLAH"; fi
done

This will work with multiple files and with white space in file names.


The solution:

files=$(ls xorg-x11-fonts* 2> /dev/null | wc -l)
if [ "$files" != "0" ]
then
   echo "Exists"
else
    echo "None found."
fi

> Exists


Use:

if [ "`echo xorg-x11-fonts*`" != "xorg-x11-fonts*" ]; then
    printf "BLAH"
fi


The PowerShell way, which treats wildcards differently: you put the pattern in quotes, like so:

If (Test-Path "./output/test-pdf-docx/Text-Book-Part-I*"){
  Remove-Item -force -v -path ./output/test-pdf-docx/*.pdf
  Remove-Item -force -v -path ./output/test-pdf-docx/*.docx
}

I think this is helpful because the original question covers "shells" in general, not just Bash or Linux, and it applies to PowerShell users with the same question too.


The Bash code I use:

if ls /syslog/*.log > /dev/null 2>&1; then
   echo "Log files are present in /syslog/;
fi


Strictly speaking, if you only want to print "Blah", here is the solution:

find . -maxdepth 1 -name 'xorg-x11-fonts*' -printf 'BLAH' -quit

Here is another way:

doesFirstFileExist(){
    test -e "$1"
}

if doesFirstFileExist xorg-x11-fonts*
then printf "BLAH"
fi

But I think the optimal solution is the following, because it won't try to sort file names:

if [ -n "$(find . -maxdepth 1 -name 'xorg-x11-fonts*' -printf 1 -quit)" ]
then 
     printf "BLAH"
fi


Here's a solution for your specific problem that doesn't require for loops or external commands like ls, find and the like.

if [ "$(echo xorg-x11-fonts*)" != "xorg-x11-fonts*" ]; then
    printf "BLAH"
fi

As you can see, it's just a tad more complicated than what you were hoping for, and relies on the fact that if the shell is not able to expand the glob, it means no files with that glob exist and echo will output the glob as is, which allows us to do a mere string comparison to check whether any of those files exist at all.

If we were to generalize the procedure, though, we should take into account the fact that files might contain spaces within their names and/or paths and that the glob char could rightfully expand to nothing (in your example, that would be the case of a file whose name is exactly xorg-x11-fonts).

This could be achieved by the following function, in bash.

function doesAnyFileExist {
   local arg="$*"
   local files=($arg)
   [ ${#files[@]} -gt 1 ] || [ ${#files[@]} -eq 1 ] && [ -e "${files[0]}" ]
}

Going back to your example, it could be invoked like this.

if doesAnyFileExist "xorg-x11-fonts*"; then
    printf "BLAH"
fi

Glob expansion should happen within the function itself for it to work properly, that's why I put the argument in quotes and that's what the first line in the function body is there for: so that any multiple arguments (which could be the result of a glob expansion outside the function, as well as a spurious parameter) would be coalesced into one. Another approach could be to raise an error if there's more than one argument, yet another could be to ignore all but the 1st argument.

The second line in the function body sets the files var to an array constituted by all the file names that the glob expanded to, one for each array element. It's fine if the file names contain spaces, each array element will contain the names as is, including the spaces.

The third line in the function body does two things:

  1. It first checks whether there's more than one element in the array. If so, it means the glob surely got expanded to something (due to what we did on the 1st line), which in turn implies that at least one file matching the glob exists, which is all we wanted to know.

  2. If at step 1 we discovered that we got fewer than 2 elements in the array, then we check whether we got exactly one, and if so we check whether that one exists, the usual way. We need this extra check to account for function arguments without glob chars, in which case the array contains only one, unexpanded, element.


I found a couple of neat solutions worth sharing. The first still suffers from the "this will break if there are too many matches" problem:

pat="yourpattern*" matches=($pat) ; [[ "$matches" != "$pat" ]] && echo "found"

(Recall that if you use an array without the [ ] syntax, you get the first element of the array.)
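A two-line illustration of that note:

arr=(first second third)
echo "$arr"    # prints "first", the same as ${arr[0]}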

If you have "shopt -s nullglob" in your script, you could simply do:

matches=(yourpattern*) ; [[ "$matches" ]] && echo "found"

Now, if it's possible to have a ton of files in a directory, you're pretty much stuck with using find:

find /path/to/dir -maxdepth 1 -type f -name 'yourpattern*' | grep -q '.' && echo 'found'


I use this:

filescount=`ls xorg-x11-fonts* | awk 'END { print NR }'`  
if [ $filescount -gt 0 ]; then  
    blah  
fi


Using newer fancy-schmancy features in KornShell, Bash, and Z shell (this example doesn't handle spaces in filenames):

# Declare a regular array (-A will declare an associative array. Kewl!)
declare -a myarray=( /mydir/tmp*.txt )
array_length=${#myarray[@]}

# Not found if the first element of the array is the unexpanded string
# (ie, if it contains a "*")
if [[ ${myarray[0]} =~ [*] ]] ; then
   echo "No files not found"
elif [ $array_length -eq 1 ] ; then
   echo "File was found"
else
   echo "Files were found"
fi

for myfile in ${myarray[@]}
do
  echo "$myfile"
done

Yes, this does smell like Perl. I am glad I didn't step in it ;)


IMHO it's better to always use find when testing for files, globs or directories. The stumbling block in doing so is find's exit status: it is 0 if all paths were traversed successfully, and >0 otherwise. The expression you pass to find is not reflected in its exit code.
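A minimal sketch of that point (the directory name is made up for illustration):

mkdir -p empty_dir
find empty_dir -name 'no-such-file*' -print
echo $?    # 0: the traversal succeeded even though nothing matched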

The following example tests if a directory has entries:

$ mkdir A
$ touch A/b
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . && echo 'not empty'
not empty

When A has no files grep fails:

$ rm A/b
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . || echo 'empty'
empty

When A does not exist grep fails again because find only prints to stderr:

$ rmdir A
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . && echo 'not empty' || echo 'empty'
find: 'A': No such file or directory
empty

Replace -not -empty by any other find expression, but be careful if you -exec a command that prints to stdout. You may want to grep for a more specific expression in such cases.
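Hypothetical example: the -exec command prints arbitrary output, so grep for a distinctive marker rather than for any character at all (GNU find substitutes {} even inside a longer argument):

find A -maxdepth 1 -name '*.log' -exec echo "MATCH: {}" \; \
    | grep -q '^MATCH:' && echo 'found'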

This approach works nicely in shell scripts. The original question was to look for the glob xorg-x11-fonts*:

if find . -maxdepth 1 -name 'xorg-x11-fonts*' -print | head -n1 | grep -q .
then
    : the glob matched
else
    : ...not
fi

Note that the else branch is reached if xorg-x11-fonts* did not match, or if find encountered an error. To distinguish the two cases, inspect the individual exit statuses, for example as sketched below.
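A Bash-specific sketch of how the two cases could be told apart (rough diagnostic only: find may also exit non-zero with SIGPIPE, status 141, if head closes the pipe early):

find . -maxdepth 1 -name 'xorg-x11-fonts*' -print | head -n1 | grep -q .
ps=("${PIPESTATUS[@]}")    # capture all three exit codes at once
if [ "${ps[0]}" -ne 0 ]; then
    echo "find itself failed"
elif [ "${ps[2]}" -ne 0 ]; then
    echo "the glob did not match"
else
    echo "the glob matched"
fi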


If there is a huge number of files in a network folder, using the wildcard is questionable (because of speed, or overflowing the command-line argument limit).

I ended up with:

if [ -n "$(find somedir/that_may_not_exist_yet -maxdepth 1 -name \*.ext -print -quit)" ] ; then
  echo Such file exists
fi


if [ `ls path1/* path2/* 2> /dev/null | wc -l` -ne 0 ]; then echo ok; else echo no; fi


Try this

fileTarget="xorg-x11-fonts*"

filesFound=$(ls $fileTarget)

case ${filesFound} in
  "" ) printf "NO files found for target=${fileTarget}\n" ;;
   * ) printf "FileTarget Files found=${filesFound}\n" ;;
esac

Test

fileTarget="*.html"  # Where I have some HTML documents in the current directory

FileTarget Files found=Baby21.html
baby22.html
charlie  22.html
charlie21.html
charlie22.html
charlie23.html

fileTarget="xorg-x11-fonts*"

NO files found for target=xorg-x11-fonts*

Note that this only works in the current directory, or where the variable fileTarget includes the path you want to inspect.
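For example (the path here is made up for illustration), the pattern itself can carry the directory and the same case statement applies:

fileTarget="/usr/share/fonts/xorg-x11-fonts*"   # hypothetical location
filesFound=$(ls $fileTarget 2> /dev/null)        # errors silenced for the no-match case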


You can also cut the other files out, keeping only the first match:

if [ -e $( echo $1 | cut -d" " -f1 ) ] ; then
   ...
fi


Use:

if ls -l  | grep -q 'xorg-x11-fonts.*' # grep needs a regex, not a shell glob
then
     # do something
else
     # do something else
fi


man test.

if [ -e file ]; then
...
fi

will work for both directories and files.
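A quick reference to the test operators involved (see man test for the full list):

[ -e path ]   # true if path exists, whatever its type
[ -f path ]   # true if path exists and is a regular file
[ -d path ]   # true if path exists and is a directory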
