Log from within collectiveidea/delayed_job
I'm trying to log from within a delayed_job in Rails.
I configured it as follows:
Delayed::Worker.destroy_failed_jobs = false
Delayed::Worker.max_attempts = 3
Delayed::Worker.backend = :active_record
Delayed::Worker.logger = ActiveSupport::BufferedLogger.new("log/#{Rails.env}_delayed_jobs.log", Rails.logger.level)
Delayed::Worker.logger.auto_flushing = 1
Then I define my job:
class TestJob
  def initialize(user)
    @user = user
  end

  # called when the job is enqueued
  def enqueue(job)
    Delayed::Worker.logger.info("TestJob: enqueue was called")
  end

  def perform
    Delayed::Worker.logger.info("\n\n\n\nTestJob: in perform, for user #{@user.twitter_username}")
  end
end
But when I call enqueue on my job:
Delayed::Job.enqueue(TestJob.new(user), 2)
The log files remain empty, even though the delayed_jobs table shows that the job was performed.
Any ideas?
Try it without using DJ's namespace.
Here's code from a job I was using a while ago. Note that Logger.new will simply append to a .log file if it already exists.
class ScrapingJob < Struct.new(:keyword_id)
  def perform
    begin
      ...
    rescue => error
      # keyword and measured_at are set in the elided scraping code above
      log = Logger.new("#{Rails.root}/log/scraping_errors.log")
      log.debug "logger created for '#{keyword.text}' (#{keyword.id}) on '#{keyword.website.url}' (#{keyword.website.id})"
      log.debug "time: #{measured_at.to_s(:short)}"
      log.debug "error: #{error}"
      log.debug ""
    end
  end
end
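For completeness, enqueueing a Struct-based job like that looks roughly like this (a sketch assuming a Keyword model; the id is a plain integer):

# Pass only the keyword's id so DJ serializes just that value
# into the delayed_jobs table, not the whole record.
Delayed::Job.enqueue(ScrapingJob.new(keyword.id))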
My guess is that it's a serialization issue, with DJ trying to serialize the User
object. DJ does a bad job of surfacing those errors.
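One rough way to check this in a Rails console (a sketch, relying on the fact that DJ stores the job YAML-serialized in the handler column):

# If this raises or drops data, the job object isn't surviving
# serialization into the delayed_jobs table.
job = TestJob.new(User.first)
YAML.load(YAML.dump(job))

# With destroy_failed_jobs = false, it's also worth inspecting:
Delayed::Job.last.last_error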
Are you using Mongoid by any chance? I know there are issues there with JSON-izing a Mongoid object.
Does it work if you pass the user_id string to the job instead of the User object? Like so:
class TestJob
  def initialize(user_id)
    @user = User.find(user_id)
  end
  # ...
end
Even though it is not as pretty, I think it is best practice to pass literals (Strings, Floats, etc.) to Delayed Job or any background worker. This makes it easier to swap processors in case you want to migrate to Resque in the future.
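One caveat with the snippet above: initialize still runs at enqueue time, so the @user it sets would be serialized along with the job anyway. A safer sketch (assuming the same TestJob and User names) defers the lookup to perform:

class TestJob < Struct.new(:user_id)
  def perform
    # Only the plain user_id is serialized; the record is looked
    # up fresh when the worker actually runs the job.
    user = User.find(user_id)
    Delayed::Worker.logger.info("TestJob: in perform, for user #{user.twitter_username}")
  end
end

Delayed::Job.enqueue(TestJob.new(user.id), 2)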