uncompressing a large number of files on the fly

I have a script that I need to run on a large number of files with the extension .tar.gz.

Instead of uncompressing them first and then running the script, I want to uncompress each one and work on the uncompressed folder, all with a single command.

I think a pipe is a good solution for this, but I haven't used one before. How would I do this?


The -v flag makes tar print each filename as it is extracted:

tar -xzvf file.tar.gz | xargs -I {} -d '\n' myscript "{}"

This way your script only has to deal with a single file at a time: each extracted filename is passed to it as a parameter (thanks to xargs), available as $1 inside the script.

Edit: the -I {} -d '\n' part makes this work with spaces in filenames.
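For illustration, myscript above is just a placeholder; a minimal sketch of what such a script could look like (hypothetical, receiving one extracted path per invocation as $1):

#!/bin/bash
# Hypothetical per-file script: xargs invokes it once per extracted path,
# which arrives as the first positional parameter.
file="$1"
printf 'processing %s\n' "$file"
# ...real work on "$file" goes here...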


The following three lines of bash...

for archive in *.tar.gz; do
    # -v lists each entry as it is extracted; sed keeps only the top-level
    # directory name, sort -u removes duplicates, xargs passes them on.
    tar zxvf "${archive}" 2>&1 | sed -e 's!x \([^/]*\)/.*!\1!' | sort -u | xargs some_script.sh
done

...will iterate over each gzipped tarball in the current directory, decompress it, grab the top-level directories of the decompressed contents, and pass those as arguments to some_script.sh. This probably uses more pipes than you were expecting, but it seems to do what you are asking for.

N.B.: tar -xf can only take one archive per invocation, hence the loop.
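Note that the "x " prefix in the sed pattern above matches bsdtar-style verbose output; GNU tar prints bare paths. A variant sketch that sidesteps the verbose-output format entirely by listing the archive contents with -t instead (the some_script.sh name is carried over from the answer above; -d '\n' assumes GNU xargs):

for archive in *.tar.gz; do
    tar -xzf "$archive"
    # List the archive again with -t, keep only the top-level names,
    # deduplicate, and hand them to the script one per line.
    tar -tzf "$archive" | sed -e 's!/.*!!' | sort -u | xargs -d '\n' some_script.sh
done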


You can use a for loop:

for file in *.tar.gz; do tar -xf "$file"; your commands here; done

Or expanded:

for file in *.tar.gz; do
    tar -xf "$file"
    # your commands here
done
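If the archives might overwrite each other's contents, one possible refinement (a sketch, not part of the answer above; the directory naming is just one convention) is to extract each archive into its own directory:

for file in *.tar.gz; do
    dir="${file%.tar.gz}"        # name the target directory after the archive
    mkdir -p "$dir"
    tar -xzf "$file" -C "$dir"   # -C extracts into that directory
    # your commands here, e.g. some_script.sh "$dir"
done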