Running out of socket connections with Net::HTTP

I have some fairly boilerplate Ruby code (running on Linux) which sends a GET request to a server ...

require 'net/http'
require 'benchmark'

req = Net::HTTP::Get.new(path)
req.content_type = 'text/plain; charset=utf-8'
req.body = ''

port = 443

res = Net::HTTP.new($host, port)
res.use_ssl = true

res.start do |http|
  t = Benchmark.measure do
    _return = http.request(req).body
  end

  $time = t.real
end

req = nil
res = nil

The problem I'm having is that when I call this code in a tight loop, I eventually fill the system up with sockets in the TIME_WAIT state (48687 at last count).

I assume there's nothing special about Ruby here and that I'd run into the same problem with C, but is there any GC-related issue at play? Any tips or tricks for preventing this from happening?


On Debian Linux, if I set /proc/sys/net/ipv4/tcp_max_syn_backlog to 128 from the default 1024, I can make 1000 connections per second and not run out of sockets.
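For reference, this is the kind of inspection and change the answer above describes. The `sysctl` path is from the answer itself; the exact default value varies by kernel, and lowering it system-wide affects all listening sockets, so treat this as a sketch rather than a recommendation:

```shell
# Inspect the current SYN backlog limit (often 1024 by default)
cat /proc/sys/net/ipv4/tcp_max_syn_backlog

# Lower it to 128, as described above (requires root; does not persist
# across reboots unless also set in /etc/sysctl.conf)
echo 128 | sudo tee /proc/sys/net/ipv4/tcp_max_syn_backlog
```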


There's not much you can do about it. Take a look at https://serverfault.com/questions/86550/apache-keep-alive-or-not-keep-alive/86565#86565. There is a way to reduce how long sockets stay in that state, but shortening it is "dangerous". The best you can do is reuse the connection if the server keeps it open.
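A minimal sketch of that reuse approach: instead of opening a new `Net::HTTP` session per iteration (as in the question's loop), issue every request inside a single `#start` block, which keeps one persistent connection open for all of them (server permitting). The `fetch_all` helper name and its arguments are illustrative, not from the original code:

```ruby
require 'net/http'

# Fetch several paths over ONE reused TLS connection instead of opening
# a fresh socket (and leaving a TIME_WAIT entry) for every request.
# `host` and `paths` are placeholders; substitute your own.
def fetch_all(host, paths)
  Net::HTTP.start(host, 443, use_ssl: true) do |http|
    # Within a single #start block, Net::HTTP keeps the connection
    # alive between requests, so all of them share one socket.
    paths.map { |path| http.request(Net::HTTP::Get.new(path)).body }
  end
end

# Usage: one socket serves all three requests.
# fetch_all('example.com', ['/a', '/b', '/c'])
```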
