Deleting all files of a certain size
I have a bunch of log files and I need to delete the erroneous files of a certain small size (63 bytes) that got created. I have to keep only those files which have data in them.
Shell (Linux):
find . -type f -size 63c -delete
This will traverse subdirectories (unless you tell it otherwise, e.g. with GNU find's -maxdepth 1).
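A cautious pattern with find is to preview the matches with -print before switching to -delete. A minimal sketch (the directory and file names here are made up for illustration):

```shell
# Set up a throwaway directory with one 63-byte "erroneous" file
# and one larger file with real data (names are hypothetical).
tmp=$(mktemp -d)
head -c 63 /dev/zero > "$tmp/bad.log"
head -c 200 /dev/zero > "$tmp/good.log"

# Dry run first: print what would be deleted
find "$tmp" -type f -size 63c -print

# Then delete for real; only the 63-byte file goes
find "$tmp" -type f -size 63c -delete
```

Note that -size 63c means exactly 63 bytes; without the c suffix, find counts in 512-byte blocks.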
Since you tagged your question with "python", here is how you could do this in that language:
import os

target_size = 63  # size in bytes of the erroneous files

for dirpath, dirs, files in os.walk('.'):
    for name in files:
        path = os.path.join(dirpath, name)
        if os.stat(path).st_size == target_size:
            os.remove(path)
The Perl one-liner is:
perl -e 'unlink grep {-s == 63} glob "*"'
It is always a good idea, though, to test what it would do before running it:
perl -le 'print for grep {-s == 63} glob "*"'
If you want to walk an entire directory tree, you will need a different version:
#find all files in the current hierarchy that are 63 bytes long.
perl -MFile::Find=find -le 'find sub {print $File::Find::name if -s == 63}, "."'
#delete all files in the current hierarchy that are 63 bytes long
perl -MFile::Find=find -e 'find sub {unlink if -s == 63}, "."'
I am using $File::Find::name in the finding version so you get the whole path; the unlinking version doesn't need it, because File::Find changes directory into each target directory and sets $_ to the file name (which is how -s and unlink get the file name). You may also want to look up grep and glob.