
Bash script to find directories on a system that contain > 1000 files?

I need a way of finding all the directories on a server that contain more than a set number of files (e.g. 1000+). I work on many big servers and sometimes find directories containing millions of small log files, so I need a way to track these dirs down.

I was thinking something along the lines of scripting a full dir listing and then doing... ll | wc -l ...but thought there might be a better/quicker way to do it?

I am ideally going to set this script up in the crontab to run once a week.
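For reference, the weekly crontab entry would look something like this (the script path and log file below are just placeholders):

# run every Sunday at 03:00 and append the results to a log
0 3 * * 0 /usr/local/bin/find_big_dirs.sh >> /var/log/big_dirs.log 2>&1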

Cheers, Stu


I would prefer:

[[ `ls -1 | wc -l` -gt 1000 ]] && echo `pwd` 

And a loop around it... But without a loop:

ls -1R | awk -F "\n" 'BEGIN {RS="\\n\\n"} NF>1000 {print $1, NF-1}'

EDIT:

as awk can't handle records that large (over 3000 bytes), I'd go with:

find . -type d -print | while read -r dirz ; do
    [[ `ls -1 "$dirz" | wc -l` -gt 1000 ]] && echo "$dirz"
done
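One caveat: ls sorts its output, which gets slow in exactly the million-file directories we're hunting for, and it counts subdirectories as well as files. A rough sketch that counts only regular files and skips the sort, assuming GNU find (for -mindepth/-maxdepth):

find . -type d -print | while read -r dirz ; do
    # count only regular files directly inside $dirz; find does no sorting
    count=`find "$dirz" -mindepth 1 -maxdepth 1 -type f | wc -l`
    [[ $count -gt 1000 ]] && echo "$dirz" "$count"
done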

HTH


I'd do a two-stage approach like so (untested):

# print each directory followed by its immediate file count, then keep the big ones
find / -type d -exec sh -c 'echo "$1" $(find "$1" -mindepth 1 -maxdepth 1 -type f | wc -l)' _ {} \; | while read F N
  do
     [ "$N" -gt 1000 ] && echo "$F" "$N"
  done
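If GNU find is available, the counting can also be done in a single pass over the filesystem instead of re-running find per directory; a sketch using -printf (each output line is a count followed by the directory):

# %h prints each file's parent directory; sort | uniq -c then counts files per directory
find / -type f -printf '%h\n' 2>/dev/null | sort | uniq -c | awk '$1 > 1000'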