How to use urlopen and multiprocessing to fetch different URL data at the same time?
This loop fetches some text data from a web service:
while True:
    u = urllib2.urlopen('http://url/%s.csv' % inputd)
    f = open('/folder/%s.csv' % inputd, 'a')
    csvread = csv.reader(u)
    csvwrite = csv.writer(f)
    csvwrite.writerows(csvread)
    f.close()
    print 'OK', '%s' % inputd
    time.sleep(30)
Now I am trying to fetch the different data sets at the same time using multiprocessing:
import multiprocessing as mp
import urllib2, csv, random, logging, time

inputd = ('abc', 'def', 'ghi')

def dataget():
    u = urllib2.urlopen('http://url/%s.csv' % inputd)
    f = open('/folder/%s.csv' % inputd, 'a')
    csvread = csv.reader(u)
    csvwrite = csv.writer(f)
    csvwrite.writerows(csvread)
    print 'OK', '%s' % inputd
    time.sleep(30)

process = []
for s in inputd:
    p = mp.Process(target=dataget)
    process.append(p)
    p.start()
I hoped this would fetch and save "http://url/abc.csv", "http://url/def.csv" and "http://url/ghi.csv" at the same time, then repeat after the time.sleep. But it doesn't work, or it runs the same process several times. What logic error did I make?
Your multiprocessing implementation will grab each URL only once. If you want dataget to do the work again and again, you are missing the while True loop, and you also need to pass each item of inputd to the worker via args, otherwise every process sees the whole tuple.
Something like this should work:
import multiprocessing as mp
import urllib2, csv, time

inputd = ('abc', 'def', 'ghi')

def dataget(param):
    while True:
        u = urllib2.urlopen('http://url/%s.csv' % param)
        f = open('/folder/%s.csv' % param, 'a')
        csvread = csv.reader(u)
        csvwrite = csv.writer(f)
        csvwrite.writerows(csvread)
        f.close()
        print 'OK', '%s' % param
        time.sleep(30)

process = []
for s in inputd:
    p = mp.Process(target=dataget, args=(s,))
    process.append(p)
    p.start()
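To sanity-check the launch-and-repeat pattern without hitting a real web service, here is a minimal Python 3 sketch (urllib2 became urllib.request in Python 3). It is an assumption-laden stand-in: the network fetch is replaced by appending a dummy row, and the loop is bounded instead of while True so the processes actually terminate; only the Process/args/start/join machinery matches the answer above.

```python
import multiprocessing as mp
import os
import tempfile
import time

def dataget(name, outdir, iterations=3, delay=0.01):
    # Stand-in worker: in the real script each pass would call
    # urllib.request.urlopen('http://url/%s.csv' % name) and copy the
    # rows into the output file. Here we append a dummy row instead,
    # and loop a fixed number of times rather than `while True`.
    path = os.path.join(outdir, '%s.csv' % name)
    for _ in range(iterations):
        with open(path, 'a') as f:
            f.write('row,%s\n' % name)
        time.sleep(delay)

if __name__ == '__main__':
    outdir = tempfile.mkdtemp()
    processes = []
    for name in ('abc', 'def', 'ghi'):
        # args=(name, outdir) is the key point: each process gets its own
        # item, instead of every process reading the shared tuple.
        p = mp.Process(target=dataget, args=(name, outdir))
        processes.append(p)
        p.start()
    for p in processes:
        p.join()  # wait, so the parent does not exit before the workers
```

After the joins, outdir contains abc.csv, def.csv and ghi.csv, each written by its own process.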