Scaling out an app that hits external APIs
I'm using beanstalkd to background-process calls to the Facebook Graph API, and I want the app to stay current, i.e. hit the Facebook API every 10 minutes and pull in fresh info. I thought about writing a simple script that loads the necessary info from the db (fb ids/urls), queues jobs in beanstalkd, and then sleeps for 9 minutes (a rough sketch follows below). Maybe use God to make sure the script keeps running and to restart it if memory consumption gets too big.
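Roughly what I had in mind for the looping script; this is only a sketch, assuming the beaneater gem for talking to beanstalkd, and FacebookPage / fb_id are placeholders for whatever model actually holds the ids:

# fb_queue_loop.rb -- run from the Rails root
require 'beaneater'
# boot the Rails app so the models and db connection are available
require File.expand_path('config/environment', File.dirname(__FILE__))

beanstalk = Beaneater.new('localhost:11300')
tube      = beanstalk.tubes['facebook-updates']

loop do
  # one job per Facebook id, so workers can process them in parallel
  FacebookPage.all.each do |page|
    tube.put(page.fb_id.to_s)
  end
  sleep 9 * 60   # wake up again in 9 minutes
end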
Then I started reading about DRb and wondered if there's a way/need to integrate the two.
I asked in #rubyonrails and got cron and a regular .rb script as the two options. Just wondering if there's a better way.
For simplicity of configuration, I would recommend delayed_job plus a cron job that calls a rake task to handle queueing the jobs (a sketch follows below).
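Something along these lines for the rake task; a minimal sketch where FacebookPage, FacebookSyncJob and refresh_from_facebook! are placeholders for your own model and Graph API code:

# lib/tasks/facebook.rake
namespace :facebook do
  desc "Queue a delayed_job per Facebook page that needs refreshing"
  task :queue_updates => :environment do
    FacebookPage.all.each do |page|
      # delayed_job picks this up in the background
      Delayed::Job.enqueue FacebookSyncJob.new(page.id)
    end
  end
end

# a delayed_job "custom job" is just an object that responds to #perform
class FacebookSyncJob < Struct.new(:page_id)
  def perform
    page = FacebookPage.find(page_id)
    # hit the Graph API and store the result (e.g. via the koala gem)
    page.refresh_from_facebook!
  end
end

Then the cron entry, every 10 minutes (paths adjusted to your server):

*/10 * * * * cd /var/www/myapp/current && RAILS_ENV=production bundle exec rake facebook:queue_updates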
Monit is also a good alternative to God for process monitoring, and it seems to be more stable and less memory-hungry.
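If you go with Monit, the check is only a few lines; a sketch assuming the default delayed_job pid file under shared/pids, with paths and the memory limit adjusted to your setup:

check process delayed_job with pidfile /var/www/myapp/shared/pids/delayed_job.pid
  start program = "/bin/bash -c 'cd /var/www/myapp/current && RAILS_ENV=production script/delayed_job start'"
  stop program  = "/bin/bash -c 'cd /var/www/myapp/current && RAILS_ENV=production script/delayed_job stop'"
  if totalmem > 200 MB for 2 cycles then restart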
For delayed_job you need to add the following to your deploy script (assuming you plan to deploy with Capistrano):
namespace :delayed_job do
  def rails_env
    fetch(:rails_env, false) ? "RAILS_ENV=#{fetch(:rails_env)}" : ''
  end

  desc "Stop the delayed_job process"
  task :stop, :roles => :app do
    run "cd #{current_path};#{rails_env} script/delayed_job stop"
  end

  desc "Start the delayed_job process"
  task :start, :roles => :app do
    run "cd #{current_path};#{rails_env} script/delayed_job start"
  end

  desc "Restart the delayed_job process"
  task :restart, :roles => :app do
    run "cd #{current_path};#{rails_env} script/delayed_job stop"
    run "cd #{current_path};#{rails_env} script/delayed_job start"
  end
end
I had to extract these recipes from the delayed_job gem to get them to run.
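To have Capistrano actually call them, hook the tasks into the deploy flow in deploy.rb; these are the hooks I use, adapt as needed:

after "deploy:stop",    "delayed_job:stop"
after "deploy:start",   "delayed_job:start"
after "deploy:restart", "delayed_job:restart"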