
Directory size in linux. Performance question

I have a script that scans a set of folders to get their sizes and displays this info in a browser. The script calls `du` and parses its output.
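For illustration, a minimal sketch of what such a script might look like in Python (the question doesn't specify the language; `du -sb` and the function name are assumptions, and `-b` is GNU-specific):

```python
import subprocess

def dir_size_bytes(path):
    """Return the total size of `path` in bytes by parsing `du` output."""
    # -s: summarize (one line per argument), -b: report bytes (GNU du).
    # Output format is "SIZE<TAB>PATH".
    out = subprocess.run(
        ["du", "-sb", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.split("\t", 1)[0])
```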

The question is about performance: how fast is it? For example, for a directory of 4 GB containing 100,000 files.

P.S. I understand that these metrics depend on hardware, but if you have similar experience with scanning large directories for sizes, could you share it?

Thank you.


It depends heavily on the file system. It's usually pretty slow on ext3, and on most other file systems as well if there are lots of subdirectories.

I don't think there's any other way to do it in real time, however. You could pre-scan the directories and cache the results in a file or a database, but that adds considerable complexity.
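The caching approach mentioned above could be sketched roughly like this, assuming Python, a JSON file as the cache store, and a made-up TTL; the cache path and refresh interval are placeholders, not anything from the original post:

```python
import json
import os
import subprocess
import time

CACHE_FILE = "/tmp/dir_sizes.json"  # hypothetical cache location
TTL = 300                           # refresh entries older than 5 minutes (assumption)

def cached_dir_size(path):
    """Return the size of `path` in bytes, re-running `du` only when stale."""
    cache = {}
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as fh:
            cache = json.load(fh)

    entry = cache.get(path)
    if entry and time.time() - entry["ts"] < TTL:
        return entry["size"]  # fresh enough, skip the expensive scan

    # Cache miss or stale entry: scan with du (GNU du, byte counts).
    out = subprocess.run(
        ["du", "-sb", path],
        capture_output=True, text=True, check=True,
    ).stdout
    size = int(out.split("\t", 1)[0])

    cache[path] = {"size": size, "ts": time.time()}
    with open(CACHE_FILE, "w") as fh:
        json.dump(cache, fh)
    return size
```

The browser-facing script then reads from the cache and stays fast regardless of directory size; the cost is that sizes can be up to TTL seconds out of date, and a cron job or background worker is needed to keep the cache warm.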

