gzip - questions about performance

Firstly, I'm using Django. Django provides gzip middleware which works just fine. Nginx also provides a gzip module. Would it make more sense to just use Nginx's gzip module because it is implemented purely in C, or are there other performance considerations I'm missing?

Secondly, Django doesn't gzip anything under 200 bytes. Is this because gzipping is too expensive to have any value when compressing output smaller than that?

Thirdly, the API I'm building will be almost purely dynamic with very little caching. Is gzipping expensive enough to make it impractical in this situation (versus a situation where I could cache the gzipped output on the web server)?


1) Compressing once is enough, and nginx is likely the faster place to do it, although I haven't benchmarked it yet. Django's GzipMiddleware delegates to a few built-ins, which may be well optimized, too.
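If you let nginx do the compression instead, the relevant directives go in the http or server block. This is only a sketch; the exact gzip_types list depends on what your API actually serves:

```nginx
# Enable nginx's gzip module for responses proxied from Django.
gzip            on;
gzip_comp_level 6;      # same speed/size trade-off Django's middleware uses
gzip_min_length 200;    # skip tiny responses, mirroring Django's cutoff
gzip_proxied    any;    # also compress responses fetched from the upstream
gzip_types      application/json text/plain text/css application/javascript;
```

Remember to disable Django's GZipMiddleware in this setup so the response is not compressed twice.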

# From http://www.xhaus.com/alan/python/httpcomp.html#gzip
# Used with permission. (Updated for Python 3: cStringIO -> io.BytesIO,
# imports moved to module level.)
import gzip
import io

def compress_string(s):
    # Compress the byte string through an in-memory gzip stream at
    # level 6, the trade-off Django's middleware uses.
    zbuf = io.BytesIO()
    with gzip.GzipFile(mode='wb', compresslevel=6, fileobj=zbuf) as zfile:
        zfile.write(s)
    return zbuf.getvalue()
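On Python 3.2+ the same round trip is a built-in one-liner, gzip.compress, which makes the manual buffer management above unnecessary. A minimal sketch with a made-up payload:

```python
import gzip

# Hypothetical stand-in for an API response body; repetitive
# JSON-like bytes compress well.
payload = b'{"results": []}' * 50

# gzip.compress / gzip.decompress replace the BytesIO + GzipFile dance.
compressed = gzip.compress(payload, compresslevel=6)
assert gzip.decompress(compressed) == payload
print(len(payload), "->", len(compressed))
```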

2) Small responses gain little from compression; in fact, a very small payload can come out larger than it went in, because gzip adds a fixed header and trailer on top of the DEFLATE framing. Skipping those responses saves CPU for no loss.

3) You could build a test suite with representative sample data, then decide from those measurements what works best for your application.
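A minimal version of such a test could time compression of a representative body with timeit. The payload here is a hypothetical stand-in; substitute real output from your API:

```python
import gzip
import json
import timeit

# Hypothetical sample: a list of records resembling a dynamic API response.
sample = json.dumps([{"id": i, "name": f"item-{i}"} for i in range(200)]).encode()

# Compare Django's default level (6) against a cheaper and a pricier one.
for level in (1, 6, 9):
    seconds = timeit.timeit(
        lambda: gzip.compress(sample, compresslevel=level), number=1000
    ) / 1000
    ratio = len(gzip.compress(sample, compresslevel=level)) / len(sample)
    print(f"level {level}: {seconds * 1e6:.0f} us/response, ratio {ratio:.2f}")
```

If the per-response cost is small next to your view's own latency, gzip is cheap enough even without caching.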
