Exceeded soft process size limit error. How to fix?

I am getting the following error with my GAE application:

2011-06-25 00:15:59.023 /publish 500 30878ms 1796cpu_ms 0kb Picasa/117.430000 (gzip),gzip(gfe)

2011-06-25 00:15:59.010 Exceeded soft process size limit with 197.977 MB after servicing 16 requests total

Here is the code:

# module-level imports assumed: logging; urlfetch and memcache from
# google.appengine.api; DownloadError from google.appengine.api.urlfetch;
# MultipartParam and multipart_encode, apparently from the poster library
def post(self):
    '''Here we receive photos from the GetPostDetailsHandler (/).
    We upload them to vkontakte server one-by-one (vkontakte accepts
    5 files at once, but urlfetch allows only 1 Mb per request), 
    and then save details of photos uploaded in memcache.
    Cookies are also saved there.
    '''
    # list of files for upload (max 5 files can be in the list)
    files = []
    # counts number of uploads to vkontakte server
    posts_sent = 0
    # calculate total number of files received
    # and store field names with files in files_arguments list
    arguments = self.request.arguments()
    files_arguments = []
    for argument in arguments:
        # file fields are detected by 'localhost' in the field name; an alternative check:
        # if argument not in ['album', 'albums_list', 'album_custom', 'upload_url', 'album_id', 'user_id', 'body', 'title']:
        if 'localhost' in argument:
            files_arguments.append(argument)
    logging.info('(POST) ... number of photos received: '+str(len(files_arguments)))

    logging.info('(POST) ... upload process started')
    files_counter = 0 # counts total number of files sent
    for argument in files_arguments:
        files_counter += 1
        file_size = len(self.request.get(argument))/(1024.0*1024.0)
        logging.info('(POST) ... size of file '+str(files_counter)+' is '+str(file_size)+' Mb')
        if file_size <= 1:
            files.append(MultipartParam('file1', self.request.get(argument), 'file'+str(files_counter)+'.jpg', 'application/x-www-form-urlencoded'))
            # sending file
            data, headers = multipart_encode(files)
            # try 3 times to send the file
            result = None
            for i in range(3):
                try:
                    result = urlfetch.fetch(url=self.request.get('upload_url'),
                                            payload=''.join(data),
                                            method=urlfetch.POST,
                                            headers=headers,
                                            deadline=10)
                    break
                except DownloadError:
                    logging.error('(POST) ... error during file upload, attempt ' + str(i))
            # result stays None if all three attempts failed
            if result is not None and result.status_code == 200:
                # save result in the memcache for 10 minutes
                memcache.add(key=self.request.get('user_id')+'_'+self.request.get('album_id')+'_'+str(files_counter), value=result.content, time=600)
                # save description in the memcache
                memcache.add(key=self.request.get('user_id')+'_'+self.request.get('album_id')+'_'+str(files_counter)+'_desc', value=self.request.get('desc'+str(files_counter)), time=600)
                logging.info('(POST) ... result of photos upload ('+str(files_counter)+'): '+result.content)
                files = []      
    # save cookies
    cookies = self.request.headers.get('Cookie')
    logging.info(cookies)
    memcache.add(key=self.request.get('user_id')+'_'+self.request.get('album_id')+'_'+'cookies', value=cookies, time=600)
    logging.info('(POST) ... upload process finished')
    # return url for Picasa - SavePhotosHandler (/save) - it will be opened in user's default browser
    # so, we have to pass there key of data we saved in memcache
    self.response.out.write('http://picasa2vkontakte.appspot.com/save?'+self.request.get('user_id')+'_'+self.request.get('album_id'))

What is wrong here? I've also read that this error usually means there is a memory leak in the application. How can I find it?


This probably isn't a memory leak - you're simply using all the available memory to handle the uploaded files. The best option would be to use the Blobstore service so you don't have to handle the files yourself at all. Alternatively, examine your code carefully for any place where the contents of the uploaded files get copied (anything that manipulates the files, converts them to or from strings, etc.), and try to minimize that.

Alternatively, since it looks like you're just immediately uploading the files to another service, you could have your users upload directly there instead.
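
For reference, here is a minimal sketch of the Blobstore approach (assuming the webapp framework this app appears to use; the handler names and the /upload_done route are made up for illustration):

from google.appengine.ext import blobstore, webapp
from google.appengine.ext.webapp import blobstore_handlers

class UploadFormHandler(webapp.RequestHandler):
    def get(self):
        # Blobstore generates a one-time upload URL; the file bytes go
        # straight to the Blobstore service and never pass through your
        # instance's memory
        upload_url = blobstore.create_upload_url('/upload_done')
        self.response.out.write(
            '<form action="%s" method="POST" enctype="multipart/form-data">'
            '<input type="file" name="file"><input type="submit">'
            '</form>' % upload_url)

class UploadDoneHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # the handler receives lightweight BlobInfo records, not the
        # file contents themselves
        blob_info = self.get_uploads('file')[0]
        self.redirect('/serve/%s' % blob_info.key())

With this setup the instance's memory footprint stays flat regardless of file size, because the upload itself is handled by the Blobstore service.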


So the problem occurs before your script even executes: every uploaded file is held in the instance's memory, and the instance shares that memory across requests. If several users upload pictures that are too big, the instance can be killed with an "exceeded memory limit" error.

You can't fix this on the server side, but you can use a client-side plugin such as Flash to check file sizes before the upload.

Another option is to use the Blobstore service.
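
As for finding out whether memory is actually leaking: a minimal sketch, assuming the GAE Python runtime of that era, which exposed google.appengine.api.runtime.memory_usage() (the accessor below follows its proto-style API; verify it against your SDK), is to log the instance's footprint at the end of each request and watch whether it climbs steadily:

import logging
from google.appengine.api import runtime

def log_memory(tag):
    # memory_usage().current() reports this instance's footprint in MB;
    # a steady climb across requests suggests a leak, while a spike only
    # during upload handling means the files themselves are the cost
    logging.info('%s: %s MB', tag, runtime.memory_usage().current())

A flat baseline between requests with spikes only during /publish would confirm the answers above: it's the upload handling, not a leak.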
