Perl, Unix: fastest way to merge thousands of small files into one file
What is the fastest way to merge thousands of small files into one file?
Thanks.
The cat command works nicely:
cat *someglob* > output.txt
Its name (short for concatenate) even gives away its purpose.
If your argument list is too long (i.e. too many files are matched by the glob), you can always use the find command and pipe the arguments to xargs.
find . -name \*someglob\* -print0 | xargs -0 cat > output.txt
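An alternative that skips the xargs step is find's own -exec ... {} + form, which also batches file names and avoids the argument-length limit. This variation isn't in the original answer, but it is standard find behavior; the glob is just an example pattern:
find . -name '*someglob*' -exec cat {} + > output.txt
The redirection applies to find's standard output, which cat inherits, so everything still lands in output.txt.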
In case it's more helpful, here's an example of how you might do that on the command line:
cd dir_with_thousand_files
cat *.txt > onebigfile.txt
I'm sure cat is faster and simpler, but here's a Perl version, just because you asked about it.
perl -pe1 *.txt > all.txt
Courtesy of ikegami
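The one-liner works because -p wraps the program in an implicit read-and-print loop over every file named on the command line, and -e1 supplies a do-nothing program body. For anyone who wants the same thing as an explicit script, here is a minimal sketch; the script name merge.pl and the output file name onebigfile.txt are illustrative, not from the original answer:
#!/usr/bin/perl
use strict;
use warnings;
# Open the combined output file once.
open my $out, '>', 'onebigfile.txt' or die "Can't open onebigfile.txt: $!";
# @ARGV holds the input file names passed on the command line.
for my $file (@ARGV) {
    open my $in, '<', $file or die "Can't open $file: $!";
    # Copy the current file line by line into the output.
    print {$out} $_ while <$in>;
    close $in;
}
close $out;
Invoked as perl merge.pl *.txt, this is still subject to the shell's argument-length limit, so for truly huge directories the find/xargs approach above remains the safer choice.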