
What's the most scalable way to handle somewhat large file uploads in a Python webapp?

We have a web application that accepts file uploads in some places. The uploads aren't terribly big (mostly Word documents and such), but they're much larger than your typical web request, and they tend to tie up our threaded servers (Zope 2 servers running behind an Apache proxy).

I'm mostly in the brainstorming phase right now and trying to figure out a general technique to use. Some ideas I have are:

  • Using an asynchronous Python server such as Tornado, Diesel, or Gunicorn (rough sketch at the end of this question).
  • Writing something in Twisted to handle it.
  • Just using nginx to handle the actual file uploads.

It's surprisingly difficult to find information on which approach I should be taking. I'm sure there are plenty of details that would be needed to make an actual decision, but I'm more worried about figuring out how to make this decision than anything else. Can anyone give me some advice about how to proceed with this?
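For concreteness, here's a rough sketch of what I mean by the Tornado option, assuming Tornado 4.0+ and its stream_request_body decorator (the destination path and port are placeholders, and it writes the raw request body, multipart framing and all, just to show the shape of the thing):

    import tornado.ioloop
    import tornado.web

    @tornado.web.stream_request_body
    class UploadHandler(tornado.web.RequestHandler):
        def prepare(self):
            # Open the destination once per request; chunks are appended
            # as they arrive, so the full body is never held in memory.
            self.destination = open("/tmp/upload.bin", "wb")  # placeholder path

        def data_received(self, chunk):
            # Called repeatedly as body bytes come off the socket, so a
            # slow upload never blocks a worker thread.
            self.destination.write(chunk)

        def post(self):
            # Runs once the whole body has been received.
            self.destination.close()
            self.write("upload complete")

    application = tornado.web.Application([(r"/upload", UploadHandler)])

    if __name__ == "__main__":
        application.listen(8888)
        tornado.ioloop.IOLoop.current().start()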


If you're open to adding Django to your environment, its file-upload machinery has built-in support for handling uploaded files in chunks rather than reading them into memory whole.

Take a look at the File Uploads overview in the Django documentation.
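As a rough illustration, a view can stream an upload to disk with UploadedFile.chunks(), which Django's upload handlers feed in 64 KB pieces by default (the view name, the "document" form field, and the destination directory are all hypothetical):

    import os

    from django.http import HttpResponse

    def upload_document(request):
        if request.method == "POST":
            uploaded = request.FILES["document"]
            # basename() strips path components from the client-supplied
            # name; a real view would sanitize more carefully than this sketch.
            path = os.path.join("/tmp", os.path.basename(uploaded.name))
            with open(path, "wb") as destination:
                # chunks() yields the file piece by piece, so even a large
                # upload never sits in memory in one lump.
                for chunk in uploaded.chunks():
                    destination.write(chunk)
            return HttpResponse("OK")
        return HttpResponse(status=405)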


See if you can push the files straight from the user's browser to a static file store, like Amazon S3 or Rackspace's Cloud Files service.

Then the upload bytes never touch your own servers at all, and you can handle a ton of parallel users.
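As a sketch of the signing step with boto3 (the bucket name and key prefix are placeholders, and configured AWS credentials are assumed), the server only produces a short-lived upload policy; the file itself goes straight from the browser to S3:

    import uuid

    import boto3

    def make_upload_form(filename):
        s3 = boto3.client("s3")
        key = "uploads/%s-%s" % (uuid.uuid4(), filename)  # placeholder prefix
        # generate_presigned_post returns the POST URL plus the hidden form
        # fields (policy, signature, key) the browser must submit alongside
        # the file itself.
        return s3.generate_presigned_post(
            Bucket="my-upload-bucket",  # placeholder bucket name
            Key=key,
            ExpiresIn=600,  # signature valid for ten minutes
        )

The browser then makes a standard multipart/form-data POST to the returned url, including every entry in fields as hidden inputs and the file as the last part.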
