Fastest ways to get a directory Size and Size on disk
I recently answered a similar question, but what I'd like to do now is to emulate, using bash, the Windows "Directory --> right click --> Properties" dialog (see fig.).
I was able to reproduce something like the Size: value, in bytes, using this command:
echo $(find . -type f -printf "%s+") | sed 's/+$//g' | bc
which is quite fast, but is it possible to gather the information faster (than find) or do the math faster (than bc)?
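One way to avoid the extra bc process is to let awk do the summing; a minimal sketch (the printf "%.0f" just keeps large totals out of scientific notation, and I haven't timed it against the bc pipeline):
find . -type f -printf "%s\n" | awk '{ sum += $1 } END { printf "%.0f\n", sum }'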
In addition, I would use the du -sb
command to emulate the Size on disk: line, and probably another couple of find
calls to count files and directories and emulate the Contains: line.
Are there better ways to emulate such results?
Assuming cygwin:
cd /cygdrive/c
printf "Size: %s", $( du --apparent-size -sh )
printf "Size on disk: %s", $( du -sh )
find . -printf "%y\n" | awk '
$1 == "d" {dirs++}
END {printf("Contains: %d files, %d folders\n", NR-dirs, dirs)}
'
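Outside of cygwin the same commands work as well; a small sketch that wraps them into a function taking the directory as an argument (dirprops is just an illustrative name, and the ( ) body runs in a subshell so the cd does not leak into the calling shell):
dirprops() (
    cd "$1" || exit 1
    printf "Size: %s\n" "$(du --apparent-size -sh)"
    printf "Size on disk: %s\n" "$(du -sh)"
    find . -printf "%y\n" | awk '
        $1 == "d" {dirs++}
        END {printf("Contains: %d files, %d folders\n", NR - dirs, dirs)}
    '
)
dirprops /cygdrive/c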
I just wrote a quick and dirty utility based on nftw(3).
The utility is basically just the manpage sample, with some stats added.
Functional
I opted to
- stay within a single mount point
- not follow symlinks (for simplicity, and because it is usually what you want)
  - note that it will still count the size of the symlinks themselves :)
- show the apparent size (the length of each file) as well as the size on disk (allocated blocks)
Tests, speed
I tested the binary (/tmp/test) on my box:
# clear page, dentry and attribute caches
echo 3 > /proc/sys/vm/drop_caches
time /tmp/test /
Output:
Total size: 28433001733
In 878794 files and 87047 directories (73318 symlinks and 0 inaccessible directories)
Size on disk 59942192 * 512b = 30690402304
real 0m2.066s
user 0m0.140s
sys 0m1.910s
I haven't compared it to your tooling, but it does seem rather quick. Perhaps you can take the source and build your own version for maximum speed?
To test whether sparse files were in fact correctly reported:
mkdir sparse
dd bs=1M seek=1024 count=0 of=sparse/file.raw
ls -l sparse/
./test sparse/
Output:
total 0
-rw-r--r-- 1 sehe sehe 1073741824 2011-09-23 22:59 file.raw
Total size: 1073741884
In 1 files and 1 directories (0 symlinks and 0 inaccessible directories)
Size on disk 0 * 512b = 0
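As a cross-check (assuming GNU coreutils du), the same distinction is visible for the sparse file itself:
du -B512 --apparent-size sparse/file.raw    # apparent size, in 512-byte units
du -B512 sparse/file.raw                    # blocks actually allocated: ~0 for a sparse file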
Code
#define _XOPEN_SOURCE 500
#include <ftw.h>
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

static uintmax_t total        = 0;  /* apparent size: sum of st_size        */
static uintmax_t files        = 0;
static uintmax_t directories  = 0;
static uintmax_t symlinks     = 0;
static uintmax_t inaccessible = 0;
static uintmax_t blocks512    = 0;  /* allocated 512-byte blocks            */

static int
display_info(const char *fpath, const struct stat *sb,
             int tflag, struct FTW *ftwbuf)
{
    switch (tflag)
    {
        case FTW_D:
        case FTW_DP:  directories++;  break; /* directory (pre/post order)   */
        case FTW_NS:                         /* stat() failed                */
        case FTW_SL:
        case FTW_SLN: symlinks++;     break; /* symbolic link                */
        case FTW_DNR: inaccessible++; break; /* unreadable directory         */
        case FTW_F:   files++;        break; /* regular file                 */
    }
    total     += sb->st_size;   /* apparent size: the length of the file     */
    blocks512 += sb->st_blocks; /* 512-byte blocks actually allocated        */
    return 0;                   /* tell nftw() to continue                   */
}

int
main(int argc, char *argv[])
{
    int flags = FTW_DEPTH | FTW_MOUNT | FTW_PHYS;

    if (nftw((argc < 2) ? "." : argv[1], display_info, 20, flags) == -1)
    {
        perror("nftw");
        exit(EXIT_FAILURE);
    }

    printf("Total size: %7ju\n", total);
    printf("In %ju files and %ju directories (%ju symlinks and %ju inaccessible directories)\n",
           files, directories, symlinks, inaccessible);
    printf("Size on disk %ju * 512b = %ju\n", blocks512, blocks512 << 9);
    exit(EXIT_SUCCESS);
}
Compile with...
gcc test.c -o test
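Once built, the size-on-disk figure can be sanity-checked against du, which also counts 512-byte blocks with -B512 (-x keeps du on one filesystem, matching FTW_MOUNT; the totals will differ somewhat where hard links are involved, since du counts each inode only once):
./test /
du -sx -B512 /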