
urllib freezes if the file at the URL is too big!

OK, I'm trying to open a URL using urllib, but the problem is that the file is too big, so when I open the URL Python freezes. I'm also using wxPython, which freezes as well when I open the URL. My CPU goes to almost 100% when the URL is opened.

Any solutions? Is there a way I can open the URL in chunks, maybe with a time.sleep(0.5) in there so it does not freeze? This is my code:

import urllib

f = open("hello.txt", 'wb')
datatowrite = urllib.urlopen(link).read()
f.write(datatowrite)
f.close()
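For reference, the "read in chunks" idea asked about above keeps memory usage flat, though on its own it still blocks the calling thread. A minimal sketch (using Python 3's `urllib.request`; the snippet above uses Python 2's `urllib.urlopen`, and the function name here is just for illustration):

```python
import urllib.request

def download_in_chunks(url, path, chunk_size=64 * 1024):
    """Copy the response body to `path` in fixed-size chunks.

    Each read() pulls at most chunk_size bytes, so the whole file is
    never held in memory at once. Note this loop still blocks the
    thread it runs in, so by itself it won't keep a GUI responsive.
    """
    with urllib.request.urlopen(url) as response, open(path, "wb") as f:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            f.write(chunk)
```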

Thanks


You want to move the download into a separate thread, so your UI thread keeps responding while the download thread does the work. That way you don't get the "freeze" while the download happens.

Read more about threading here:

http://docs.python.org/library/threading.html
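A minimal sketch of that idea (Python 3 `urllib.request`; the `download_async` name and `on_done` callback are hypothetical, and in wxPython you would marshal the callback back to the GUI thread with something like `wx.CallAfter`):

```python
import threading
import urllib.request

def download_async(url, path, on_done=None):
    """Run the blocking download in a background thread and return
    the Thread object. The caller's thread (e.g. the wx event loop)
    is free to keep processing events while the worker runs."""
    def worker():
        with urllib.request.urlopen(url) as response, open(path, "wb") as f:
            while True:
                chunk = response.read(64 * 1024)
                if not chunk:
                    break
                f.write(chunk)
        if on_done is not None:
            on_done(path)  # in wxPython: wx.CallAfter(on_done, path)

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t
```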

Alternatively, you could have the system download the file outside of Python, using curl or wget.
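A sketch of that approach with the subprocess module, assuming curl is on the PATH (`-L` follows redirects, `-s` silences the progress meter, `-o` names the output file; the function names are just for illustration):

```python
import subprocess

def curl_command(url, path):
    """Build the argument list for downloading `url` to `path` with curl."""
    return ["curl", "-L", "-s", "-o", path, url]

def download_with_curl(url, path):
    # The transfer happens in a separate OS process, so Python does no
    # I/O itself; subprocess.run still waits for curl to finish, so use
    # subprocess.Popen instead if you need it fully non-blocking.
    subprocess.run(curl_command(url, path), check=True)
```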
