Shell script to list the largest files in a directory recursively
I need a shell script to list the largest files in a huge directory tree, recursively. I am using:
find <path> -mtime +20 -exec ls -ls {} \; | sort -n -r | head -100 | awk '{print $10}'
Issues:
- Slow execution (one ls process is spawned per file; see the batched variant below)
- I don't have read permissions inside a few sub-directories
Is there any better way to achieve this? I have tried:
du <path> | sort -n -r | head -n 100
Much faster, but not that effective: without -a, du reports directory totals rather than individual files.
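One likely speed-up for the original pipeline (a sketch, assuming GNU find and filenames without embedded spaces): terminating -exec with + instead of \; passes many files to each ls invocation instead of forking ls once per file:

find <path> -mtime +20 -exec ls -ls {} + | sort -n -r | head -100 | awk '{print $10}'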
Depending on the size distribution of the files your find is turning up, you might consider using the -size predicate to weed out a lot of the smaller fish before the list gets dumped onto sort. If this is something you run regularly, make a note when you start getting fewer than 100 lines out of head, and use that as an indication that it's time to lower the size limit you're giving find.
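A minimal sketch of that filter (the +10M threshold is an illustrative assumption; the M suffix needs GNU find):

find <path> -mtime +20 -size +10M -exec ls -ls {} + | sort -n -r | head -100 | awk '{print $10}'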
Lack of permissions is not a problem you're going to be able to overcome without getting the permissions on the directories in question changed, or escalating your privileges so you can read them.
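You can at least keep the "Permission denied" noise out of the output by discarding stderr (a sketch; the unreadable directories are still skipped, not sized):

find <path> -mtime +20 -exec ls -ls {} \; 2>/dev/null | sort -n -r | head -100 | awk '{print $10}'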
du is almost there; try

du -aS | sort -n -r | head -n 100

Here -a lists individual files as well as directories, and -S keeps each directory's reported size from including its subdirectories, so the large files rise to the top instead of the parent directories that contain them.
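du reports sizes in 1 KiB blocks by default; if you want bytes instead (easier to compare with the %s output below), GNU du accepts a block size. A sketch:

du -aS -B1 | sort -n -r | head -n 100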
find has a handy -printf directive:
find . -type f -printf "%s\t%p\n" | sort -nr | head -n 100
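The raw byte counts from %s can be made human-readable after sorting (a sketch, assuming GNU coreutils for numfmt):

find . -type f -printf "%s\t%p\n" | sort -nr | head -n 100 | numfmt --field=1 --to=iec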
find -size +Nk -atime +N -printf "%s,\t%a,\t%p\n" | sort -nr

will give you the desired output. Here the size threshold (+Nk) is in KiB, and the age (+N, in days) is checked against the last access time.
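For instance, with the placeholders filled in (illustrative values: files over 10 MiB not accessed in the last 20 days):

find . -size +10240k -atime +20 -printf "%s,\t%a,\t%p\n" | sort -nr | head -n 100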