Why would HTTP transfer via wget be faster than lftp/pget?
I'm building software that needs to do massive amounts of file transfer over both HTTP and FTP. Often, I get faster HTTP downloads with a multi-connection download accelerator like axel or lftp's pget. In some cases I've seen 2x-3x faster transfers using something like:
axel http://example.com/somefile
or
lftp -e 'pget -n 5 http://example.com/somefile;quit'
vs. just using wget:
wget http://example.com/somefile
But other times, wget is significantly faster than lftp. Strangely, this is true even when I tell pget to use only a single connection, like so:
lftp -e 'pget -n 1 http://example.com/somefile;quit'
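For reference, here's roughly how I'm comparing the three; example.com/somefile is just a placeholder URL, and time only gives a crude wall-clock measurement:

time wget http://example.com/somefile
time axel http://example.com/somefile
time lftp -e 'pget -n 1 http://example.com/somefile;quit'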
I understand that downloading a file via multiple connections won't always result in a speedup, depending on how bandwidth is constrained. But why would it ever be slower, especially when calling lftp/pget with -n 1?
Is it possible that the HTTP server is compressing the stream with gzip? I can't remember whether wget handles gzip Content-Encoding, but if it does, that might explain the performance boost. Another possibility is an HTTP cache somewhere in the pipeline. You can try something like
wget --no-cache --header="Accept-Encoding: identity"
and compare this to your FTP-based transfer times.
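A quick way to check for both possibilities is to inspect the response headers directly. This is just a sketch against the placeholder URL from the question; note that curl -I sends a HEAD request, which some servers answer differently than a GET:

curl -sI -H 'Accept-Encoding: gzip' http://example.com/somefile

If the response includes Content-Encoding: gzip, the server compresses when asked for it; headers like Age, Via, or X-Cache usually point to an intermediate cache.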