
Python script times out when I copy too many files

I have a Python script which copies a bunch of files from an anonymous FTP site. When I try to copy several thousand, the script times out before all the files can be copied. However, if I run the script several times and copy only a few hundred files each time, it has no problem. The files are text files, around 10 KB each. Here is my code once I have logged in to the FTP site:

for row in rows:
    # Build the station filename from the first column of the row
    stationFile = "%s.dly" % row[0]
    # Download the file over the existing FTP connection
    f = open(stationFile, "wb")
    ftp.retrbinary("RETR " + stationFile, f.write)
    f.close()

Does anyone have any suggestions on how to grab all the files at once without the script timing out? Thanks!


It's technically not grabbing all the files at once, but if calling the script multiple times on smaller sets of files works, then you might as well have the script itself copy only a few hundred files at a time, grabbing a new set each iteration.
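
A minimal sketch of that approach: open a fresh connection per batch so no single session runs long. Here `host` and `rows` stand in for whatever your script already uses, and the batch size of 200 and 60-second timeout are arbitrary illustrative choices:

    from ftplib import FTP

    BATCH_SIZE = 200  # roughly the per-run count that worked when run manually

    def fetch_batch(host, batch):
        # A fresh connection per batch, dropped as soon as the batch is done
        ftp = FTP(host, timeout=60)
        ftp.login()  # anonymous login
        for row in batch:
            stationFile = "%s.dly" % row[0]
            with open(stationFile, "wb") as f:
                ftp.retrbinary("RETR " + stationFile, f.write)
        ftp.quit()

    for i in range(0, len(rows), BATCH_SIZE):
        fetch_batch(host, rows[i:i + BATCH_SIZE])

This mirrors what running the script several times by hand already does, just automated in one process.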


Does ftp.retrbinary() block when you call it? If not, it means that too many connections are opened at once, and the network can't handle them all.
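
For what it's worth, ftplib's retrbinary does block until each transfer finishes, so only one data connection is open at a time; the server dropping a long-lived control connection is another possibility. A minimal defensive sketch, assuming the same host and rows as the question (the 60-second timeout and the single retry are illustrative, not a known fix):

    import socket
    from ftplib import FTP, error_temp

    def connect(host):
        ftp = FTP(host, timeout=60)
        ftp.login()  # anonymous login
        return ftp

    ftp = connect(host)
    for row in rows:
        stationFile = "%s.dly" % row[0]
        try:
            with open(stationFile, "wb") as f:
                ftp.retrbinary("RETR " + stationFile, f.write)
        except (socket.timeout, error_temp):
            # Connection likely dropped mid-run; reconnect and retry this file once
            ftp = connect(host)
            with open(stationFile, "wb") as f:
                ftp.retrbinary("RETR " + stationFile, f.write)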

