
Connecting and Saving Data With Redis Inside Celery Task

I have an object that saves data to Redis. It needs to block as little as possible, so I've decided to use Celery to offload the work. When I call .save() on the object outside of Celery, it connects to Redis and stores the data just fine. However, when I do the exact same thing from a Celery task, it looks like it runs, but there is no connection to Redis, no exception, no error output, and nothing gets saved to the Redis server. I replicated the problem with the small bit of code below. test.py:

from celery.decorators import task
import redis

class A(object):
    def __init__(self):
        print "init"

    def save(self):
        self.r = self.connect()
        self.r.set('foo', 'bar')
        print "saved"

    def connect(self):
        return redis.Redis(host="localhost", port=6379)

a = A()

@task
def something(a):
    a.save()

Here is the Python console output:

>>> from test import *
init
>>> a
<test.A object at 0x1010e3c10>
>>> result = something.delay(a)
>>> result.ready()
True
>>> result.successful()
True

And here is the celeryd output:

[2010-11-15 12:05:33,672: INFO/MainProcess] Got task from broker: test.something[d1d71ee5-7206-4fa7-844c-04445fd8bead]
[2010-11-15 12:05:33,688: WARNING/PoolWorker-2] saved
[2010-11-15 12:05:33,694: INFO/MainProcess] Task test.something[d1d71ee5-7206-4fa7-844c-04445fd8bead] succeeded in 0.00637984275818s: None

Any help would be awesome! I've replicated the issue on multiple computers with multiple Python versions.


The problem was caused by a misconfiguration in celeryconfig.py: CELERY_IMPORTS needed to include the task module. With that change, the task connects to Redis and the data is saved as expected.
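For reference, a minimal celeryconfig.py along these lines resolves it, assuming the module above is importable as test; the broker and result-backend settings shown here are placeholders from a typical Celery 2.x setup, not from the original question, so adjust them to your environment:

from celery.decorators import task  # not needed in the config; shown in test.py above

# celeryconfig.py -- minimal sketch; broker/result settings are assumptions,
# adjust them to match your own setup.
BROKER_HOST = "localhost"
BROKER_PORT = 5672
BROKER_USER = "guest"
BROKER_PASSWORD = "guest"
BROKER_VHOST = "/"

CELERY_RESULT_BACKEND = "amqp"

# The actual fix: list the module that defines the task so celeryd imports it.
CELERY_IMPORTS = ("test",)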
