
How do I serve a large file using Pylons?

I am writing a Pylons-based download gateway. The gateway's client will address files by ID:

/file_gw/download/1

Internally, the file itself is accessed via HTTP from an internal file server:

http://internal-srv/path/to/file_1.content

The files may be quite large, so I want to stream the content. I store metadata about the file in a StoredFile model object:

from sqlalchemy import Column, Integer, String

class StoredFile(Base):
    __tablename__ = 'stored_files'

    id = Column(Integer, primary_key=True)
    name = Column(String)
    size = Column(Integer)
    content_type = Column(String)
    url = Column(String)

Given this, what's the best (i.e., most architecturally sound, performant, etc.) way to write my file_gw controller?


One thing you'll want to avoid is loading the entire file into memory before returning the first byte to the client. In WSGI you can return an iterator for the response body, and the WebOb documentation has an example of this that you should be able to work into your controller. After all, Pylons is built on WebOb.

The overall effect is that the client starts receiving data as soon as the first chunk is ready, rather than waiting for the whole file to be read.
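As a minimal sketch of the iterator approach (the `stream_chunks` helper, the chunk size, and the commented-out controller body are my illustrations, not a specific Pylons API):

```python
import io
import urllib.request  # Pylons-era code would use urllib2 on Python 2


def stream_chunks(fileobj, chunk_size=16 * 1024):
    """Yield the file in fixed-size chunks so the WSGI server can flush
    each chunk to the client as it arrives, keeping memory use bounded."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk


# Hypothetical file_gw controller action (sketch only):
#
# def download(self, id):
#     stored = Session.query(StoredFile).get(int(id))
#     upstream = urllib.request.urlopen(stored.url)  # internal file server
#     response.content_type = stored.content_type
#     response.content_length = stored.size
#     return stream_chunks(upstream)
```

Because the action returns a generator rather than a string, the gateway never holds more than one chunk of the file in memory at a time.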

You may also want to look at the GridFS implementation for MongoDB; it's a pretty nice way to get a distributed filesystem going that is optimized for write-once, read-many file operations.

The combination of these two things would be a good start if you have to build it yourself.


I would consider using nginx and its X-Accel-Redirect feature (http://wiki.nginx.org/XSendfile), or an equivalent in your front-end server.
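With X-Accel-Redirect, the controller returns only headers and nginx streams the file body itself. A sketch of the idea (the `/protected_files/` prefix and the helper name are my assumptions; the prefix must match an `internal` location in your nginx config that proxies to the internal file server):

```python
def accel_redirect_headers(stored_name, content_type,
                           internal_prefix="/protected_files/"):
    """Build response headers that tell nginx to serve the file itself.

    nginx sees X-Accel-Redirect, discards the (empty) body from the app,
    and serves the named internal location instead -- so the Python
    process never touches the file bytes.
    """
    return [
        ("X-Accel-Redirect", internal_prefix + stored_name),
        ("Content-Type", content_type),
        ("Content-Disposition", 'attachment; filename="%s"' % stored_name),
    ]


# Corresponding nginx location (sketch):
#
#   location /protected_files/ {
#       internal;
#       proxy_pass http://internal-srv/path/to/;
#   }
```

This keeps the auth/lookup logic in Pylons while delegating the heavy I/O to nginx.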


The most architecturally sound way would be to store the files on Amazon S3 and have the controller redirect clients to S3 to download them.

