wget in Rails application - cannot allocate memory
I am using wget in my Rails application to fetch pages from websites and store some data extracted from them. It starts smoothly, but after a while it breaks with "cannot allocate memory".
Please let me know how to approach this issue.
UPDATE (Code added)
def self.crawl_my_links
  puts "------------------- looping -------------------------------"
  valid_domains = get_valid_domains

  # Fetch all links assigned to this server and process them in random order
  crawl_links = CrawlLink.all(:conditions => ["server_id = #{Monitoring::SERVER_ID} and crawl_status = 'Assigned'"], :order => "url").shuffle

  crawl_links.each do |crawl_link|
    # Skip blank or non-HTTP URLs before trying to parse them
    next if crawl_link.url.blank? or !crawl_link.url.starts_with?("http")
    url_host = URI.parse(URI.encode(crawl_link.url)).host

    # Only crawl domains registered in the system
    site_domain = Domainatrix.parse(URI.encode(crawl_link.url)).domain
    unless valid_domains.has_key?(site_domain)
      logger.info "Domain - '#{site_domain}' not registered in the system"
      next
    end

    url_protocol = crawl_link.url.split('://').first

    # Shell out to wget and capture the page body -- this is the call that
    # eventually fails with "cannot allocate memory"
    #if not crawl_link.recently_updated?
    html_content = `wget -qO- #{crawl_link.url}`
    update_result(crawl_link, html_content)
    site_url = "#{url_protocol}://#{url_host}"
    #end

    crawl_for_new_links(site_url, html_content)
    crawl_link.update_attribute(:crawl_status, "Available")
  end

  sleep(10)
  Monitoring.delay.assign_links
  puts "++++++++++++++++ DONE ++++++++++++++++++"
end
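For reference, a minimal sketch of doing the same fetch in-process with Ruby's standard Net::HTTP instead of shelling out to wget. Backticks fork the whole Rails process before exec'ing wget, which can fail with "Cannot allocate memory" when the parent process is large, so avoiding the child process is one thing to try. The helper name fetch_page, the redirect limit, and the error handling are assumptions of this sketch, not part of the app above.

require 'net/http'
require 'uri'

# Hypothetical replacement for the `wget -qO-` shell-out: fetch the page body
# in the current process so no child process has to be forked.
def self.fetch_page(url, limit = 3)
  return nil if limit.zero?
  uri = URI.parse(URI.encode(url))
  response = Net::HTTP.get_response(uri)
  case response
  when Net::HTTPSuccess
    response.body
  when Net::HTTPRedirection
    # Follow up to `limit` redirects, roughly mirroring wget's default behaviour
    fetch_page(response['location'], limit - 1)
  else
    nil
  end
rescue StandardError => e
  logger.info "Fetch failed for #{url}: #{e.message}"
  nil
end

The call site would then be html_content = fetch_page(crawl_link.url) in place of the backtick line; whether this actually removes the allocation error depends on where the memory is going.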