
How do I run a Perl script on multiple input files with the same extension?


 perl scriptname.pl file.aspx

I'm looking to have it run for all .aspx files in the current directory.

Thanks!


In your Perl file,

 my @files = <*.aspx>;
 for my $file (@files) {
     # do something with $file
 }

The <*.aspx> is called a glob.
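Putting that together, a minimal sketch of a complete script (the line-counting body is just an illustration; replace it with whatever per-file work you actually need):

#!/usr/bin/perl
use strict;
use warnings;

# Glob every .aspx file in the current directory.
my @files = glob '*.aspx';

for my $file (@files) {
    open my $fh, '<', $file or die "Cannot open $file: $!";
    my $lines = 0;
    $lines++ while <$fh>;          # illustrative work: count lines
    close $fh;
    print "$file: $lines lines\n";
}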


You can pass those files to Perl with a shell wildcard.

In your script:

foreach my $file (@ARGV) {
    print "file: $file\n";
    open my $fh, '<', $file or die "Cannot open $file: $!";
    # ... do something with the lines in $fh
    close $fh;
}

On the command line:

$ perl myscript.pl *.aspx


You can call glob explicitly, so that wildcards in the arguments are expanded without depending too much on the shell's behaviour.

for my $file ( map { glob($_) } @ARGV ) {
    print $file, "\n";
}

You may need to guard against duplicate file names when more than one argument expands to the same file.
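One way to do that (a sketch, assuming you want to keep the first occurrence and preserve order) is to filter the expanded list through a %seen hash:

my %seen;
my @files = grep { !$seen{$_}++ } map { glob($_) } @ARGV;

for my $file (@files) {
    print "$file\n";
}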


For a simple one-liner with -n or -p, you want

perl -i~ -pe 's/foo/bar/' *.aspx

The -i~ says to modify each target file in place, and leave the original as a backup with an ~ suffix added to the file name. (Omit the suffix to not leave a backup. But if you are still learning or experimenting, that's a bad idea; removing the backups when you're done is a much smaller hassle than restoring the originals from a backup if you mess something up.)

If your Perl code is too complex for a one-liner (or just useful enough to be reusable), replace -e '# your code here' with scriptname.pl; better yet, refactor scriptname.pl so that it accepts a list of file name arguments, and simply run scriptname.pl *.aspx to process all *.aspx files in the current directory.
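If you do move the one-liner into a script, the in-place editing that -i~ provides can be reproduced with the $^I special variable; a minimal sketch, with s/foo/bar/ standing in for your real transformation:

#!/usr/bin/perl
use strict;
use warnings;

$^I = '~';          # same effect as -i~: edit in place, keep a ~ backup
while (<>) {        # reads the files named in @ARGV one line at a time
    s/foo/bar/;     # stand-in for the real edit
    print;          # goes back into the file being rewritten
}

Run it as perl scriptname.pl *.aspx and each file is rewritten in place, with the originals kept as *.aspx~.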

If you need to recurse a directory structure and find all files with a particular naming pattern, the find utility is useful.

find . -name '*.aspx' -exec perl -pi~ -e 's/foo/bar/' {} +

If your find does not support -exec ... +, try -exec ... \; instead, though it will be slower and launch more processes (one per file found, instead of as few as possible to process all the files).

To only scan some directories, replace . (which names the current directory) with a space-separated list of the directories to examine, or even use find to find the directories themselves (and then perhaps explore -execdir for doing something in each directory that find selects with your complex, intricate, business-critical, maybe secret list of find option predicates).

Maybe also explore find2perl to do this directory recursion natively in Perl.
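For reference, the core File::Find module can do the recursion directly in Perl; a rough sketch that only prints the matching names (substitute your own per-file work):

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Walk the current directory tree and report every .aspx file found.
find(sub {
    return unless -f && /\.aspx\z/;
    print "$File::Find::name\n";   # replace with real per-file work
}, '.');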


If you are on a Linux machine, you could try something like this.

for i in /tmp/*.aspx; do perl scriptname.pl "$i"; done


For example, to handle perl scriptname.pl *.aspx *.asp:

On Linux the shell expands the wildcards, so the Perl can simply be:

for (@ARGV) {
  operation($_); # do something with each file
}

Windows doesn't expand wildcards, so expand the wildcards in each argument in Perl with glob, as follows. The for loop then processes each file in the same way as above:

for (map {glob} @ARGV) {
  operation($_); # do something with each file
}

For example, this will print the expanded list under Windows

print "$_\n" for(map {glob} @ARGV);


You can also pass the path where your .aspx files live and read them one by one.

#!/usr/bin/perl -w

use strict;

my $path = shift or die "Usage: $0 path\n";
my @files = split /\n/, `ls $path/*.aspx`;

foreach my $file (@files) {
    print "$file\n";    # do something with $file here
}
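Shelling out to ls breaks on file names containing spaces; a sketch of the same idea using opendir/readdir instead (again, printing the name stands in for the real work):

#!/usr/bin/perl
use strict;
use warnings;

my $path = shift // '.';   # default to the current directory

opendir my $dh, $path or die "Cannot open $path: $!";
my @files = grep { /\.aspx\z/ && -f "$path/$_" } readdir $dh;
closedir $dh;

foreach my $file (@files) {
    print "$path/$file\n";   # do something with each file here
}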
