
Deleting large numbers of files in Python

I'm trying to remove all files found in a directory. The accepted answer to "Delete Folder Contents in Python" suggests getting a list of all files and calling os.unlink on them in a loop.
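
For reference, a minimal sketch of that list-and-unlink approach (the directory path here is a hypothetical example):

import os

target_dir = "/mnt/share/some_dir"  # hypothetical path on the network share
with os.scandir(target_dir) as entries:
    for entry in entries:
        if entry.is_file():
            os.unlink(entry.path)  # same as os.remove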

Suppose I have thousands of files on a network share, and I want to tie up the directory for as short a time as possible.

Is it more efficient to delete them all using a shell command like rm -f /path/* or by using shutil.rmtree or some such?


If you actually want to delete the whole directory tree, shutil.rmtree should be faster than calling os.remove (which is the same as os.unlink) on each file. It also allows you to specify a callback function to handle errors.
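
A sketch of the error callback, which shutil.rmtree invokes on each failure (the path is hypothetical):

import shutil

def log_and_continue(func, path, exc_info):
    # func is the failing call (e.g. os.unlink), path is the file it failed
    # on, and exc_info is the (type, value, traceback) tuple for the error.
    print("could not remove %s: %s" % (path, exc_info[1]))

# ignore_errors stays False so every failure reaches the callback
shutil.rmtree("/mnt/share/some_dir", onerror=log_and_continue)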

The suggestion in the comment by @nmichaels is also good: you can os.rename the directory, make a new one in its place, and then use shutil.rmtree on the original, renamed directory.
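
A sketch of that rename-then-delete pattern (paths are hypothetical; the rename must stay on the same filesystem to be a cheap metadata operation):

import os
import shutil

target = "/mnt/share/some_dir"   # hypothetical directory on the share
doomed = target + ".deleting"    # temporary name on the same filesystem

os.rename(target, doomed)   # near-instant: frees the original name
os.mkdir(target)            # a fresh, empty directory is usable immediately
shutil.rmtree(doomed)       # the slow deletion happens out of the way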


I tried this solution and it seems to work well:

import os

# Retry until the file is actually gone; file_to_delete is defined elsewhere.
while os.path.exists(file_to_delete):
    os.remove(file_to_delete)
