Add up thousands of numbers stored in the second 'column' of a text file?
I've got a file with 400k+ lines, each containing a filename and its size, and I need to add the sizes up to get a total.
See: https://superuser.com/questions/195493/unix-recursive-directory-listing-with-full-pathname-of-file-and-filesize
filename1 size1
filename2 size2
It's not going to be a very large number ... < ~50,000,000
They're all integers, no decimal points, none of them > 120
I need to do this on a standard Linux command line. I can modify the script used to generate this output, which is:
find full_path_to_your_directory -type f -printf '%p %s\n'
find . -type f -printf '%p %s\n' | awk '{sum+=$NF}END{print sum}'
If you want to use Perl,
find . -type f -printf '%p %s\n' | perl -ane '$sum+=$F[1];END{print "$sum\n"}'
I got this (note that the autosplit array is `@F`, so the field is `$F[1]`, not `$S[1]`):
find . -type f -printf '%p %s\n' | perl -n -a -e '$sum+=$F[1]; print "$sum\n"'
which displays the running total.
find . -type f -printf '%p %s\n' | perl -n -a -e '$sum+=$F[1]; print "$sum\n"' | tail -n 1
will just show the total.
With awk it is slightly more compact:
find . -type f -printf '%p %s\n' | awk '{ sum+=$2}; END { print sum}'
Since you don't need the filename to sum the sizes:
find path -type f -printf '%s\n' | awk '{sum += $1} END {print sum}'