
I am getting the error: Too many open files after writing a lot of files in Ruby [closed]

This question is unlikely to help any future visitors; it is only relevant to a small geographic area, a specific moment in time, or an extraordinarily narrow situation that is not generally applicable to the worldwide audience of the internet. For help making this question more broadly applicable, visit the help center. Closed 10 years ago.

I have a script that generates about 16,000 HTML pages and saves them to disk. After 1013 pages I get the error: Too many open files.

This is the Ruby code that generates the files:

FileUtils.mkdir_p "public/users_directory/#{DEFAULT_COUNTRY_CODE}/#{prefix}"
FileUtils.mkdir_p "public/users_directory/#{DEFAULT_COUNTRY_CODE}/#{prefix}/#{n/1000}"

html_file = File.new("public/users_directory/#{DEFAULT_COUNTRY_CODE}/#{prefix}/#{n/1000}/#{n}.html", "w")
html_file.write(html)
html_file.close

As you can see, I close the file in the last line...

Does somebody know what I am doing wrong here? I am on Ubuntu 8.04.4 LTS.

Thanks a lot

Edit:

This is the whole script

    def self.fetching_directory_page(n=1, letter = nil)
      id = letter == '' ? "" : "/#{letter.upcase}"
      url = "this is a valid url :)"
      agent = WWW::Mechanize.new
      page = agent.get(url)
      html = page.search('div#my_profile_body').to_html

      prefix = id == '' ? 'all' : letter
      FileUtils.mkdir_p "public/users_directory/#{DEFAULT_COUNTRY_CODE}/#{prefix}"
      FileUtils.mkdir_p "public/users_directory/#{DEFAULT_COUNTRY_CODE}/#{prefix}/#{n/1000}"

      html_file = File.new("public/users_directory/#{DEFAULT_COUNTRY_CODE}/#{prefix}/#{n/1000}/#{n}.html", "w")
      html_file.write(html)
      html_file.close

      puts "+ CREATED #{prefix}/#{n/1000}/#{n}.html" 

      new_url = page.parser.xpath("//a[@class='next_page']")[0]['href'] rescue nil

      if new_url.present?
        self.fetching_directory_page(n+1, letter)
      end
    end

It fetches all the users in my user directory and saves each page for caching reasons. It generates 16,000 files in total.

These are the results of ulimit -a:

    core file size          (blocks, -c) 0
    data seg size           (kbytes, -d) unlimited
    scheduling priority             (-e) 0
    file size               (blocks, -f) unlimited
    pending signals                 (-i) 24640
    max locked memory       (kbytes, -l) 32
    max memory size         (kbytes, -m) unlimited
    open files                      (-n) 24000
    pipe size            (512 bytes, -p) 8
    POSIX message queues     (bytes, -q) 819200
    real-time priority              (-r) 0
    stack size              (kbytes, -s) 8192
    cpu time               (seconds, -t) unlimited
    max user processes              (-u) 24640
    virtual memory          (kbytes, -v) unlimited
    file locks                      (-x) unlimited

After editing /etc/security/limits.conf I no longer get the error Too many open files, but the script just gets stuck.

lsof -u username returns a list of roughly 600 entries, and it doesn't change while the script is running.


I'm not certain if this is the best approach to your problem, but it may help:

Try commenting out half the code. If it still has the problem, then comment out half of the remainder. Keep on doing this until the problem goes away. Once the problem's gone away, try uncommenting some of the code. Keep on doing that until the problem returns. More likely than not, the line you've just uncommented is related to the bug. This approach to a problem is sometimes called a "binary chop".

With this particular case, you may want to make sure that whatever's calling fetching_directory_page isn't opening a new file each time without closing it.
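One quick way to check whether file handles are leaking on the Ruby side (as opposed to elsewhere in the process) is to ask ObjectSpace how many File objects are still open. This is a diagnostic sketch specific to MRI, not part of the original script:

```ruby
# Diagnostic sketch (MRI-specific): count File objects that are still
# open in the running process. If this number keeps growing as the
# script runs, file handles are leaking in Ruby code.
def open_file_count
  ObjectSpace.each_object(File).reject(&:closed?).size
end
```

Calling this once per iteration (or every few hundred pages) and printing the result would show immediately whether the count climbs with each page or stays flat.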


Open files weren't causing the problem. It was the recursive method. I changed that and things work great.
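For reference, a deep tail-recursive chain like fetching_directory_page can be flattened into a loop, so every page is handled in the same stack frame. The sketch below keeps the same fetch-and-save shape but is self-contained: the `pages` array and `save_pages` name are stand-ins for illustration, not the original Mechanize code:

```ruby
require 'fileutils'

# Iterative version of the pagination: follow "next page" links in a
# while loop instead of recursing once per page. Each entry in `pages`
# stands in for a fetched page: its HTML plus the index of the next
# page (nil when there is no next link).
def save_pages(pages, dir)
  FileUtils.mkdir_p(dir)
  i = 0
  count = 0
  while i
    entry = pages[i]
    count += 1
    # Block form guarantees the file is closed, even if write raises.
    File.open(File.join(dir, "#{count}.html"), "w") { |f| f.print entry[:html] }
    i = entry[:next]
  end
  count
end
```

With 16,000 pages, the recursive version builds 16,000 nested stack frames, each pinning its local `agent`, `page`, and `html` until the whole chain unwinds; the loop releases them every iteration.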


The problem seems to be in the operating system, not in the Ruby script itself.

Try this advice from an earlier SO question:

Check how many files your current user has permission to open: in a terminal, run ulimit -a and check the open files (-n) line. The default is 1024.

To raise the limit, you have to modify the following file: /etc/security/limits.conf
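For example, a per-user limit can be raised with lines like the following (the username and values here are placeholders; the change takes effect at the next login, via pam_limits):

```
# /etc/security/limits.conf
# <domain>  <type>  <item>   <value>
username    soft    nofile   10240
username    hard    nofile   24000
```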


It's a minor point, but Ruby supports using a block with File.open, which will automatically close the opened file. It's considered idiomatic to use that form with Ruby, instead of the way you're doing it:

html_file = File.new("public/users_directory/#{DEFAULT_COUNTRY_CODE}/#{prefix}/#{n/1000}/#{n}.html", "w")
html_file.write(html)
html_file.close

should be:

File.open("public/users_directory/#{DEFAULT_COUNTRY_CODE}/#{prefix}/#{n/1000}/#{n}.html", "w") do |html_file|
  html_file.print html
end

From the docs for IO.open, from which File.open is inherited:

With no associated block, IO.open is a synonym for ::new. If the optional code block is given, it will be passed io as an argument, and the IO object will automatically be closed when the block terminates. In this instance, ::open returns the value of the block.
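As a footnote to the block form: newer Rubies (1.9.3 and later, so not the 1.8 that shipped with Ubuntu 8.04) also offer File.write, which opens, writes, and closes in a single call:

```ruby
require 'tmpdir'

# File.write (Ruby 1.9.3+) opens the file, writes the string, and
# closes the file in one call; the temporary directory here is just
# to keep the example self-contained.
path = File.join(Dir.mktmpdir, "page.html")
File.write(path, "<p>hello</p>")
```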

