I'm using Celery to queue jobs from a CGI application I made. The way I've set it up, Celery makes each job run one or two at a time by setting CELERYD_CONCURRENCY = 1 or = 2 (so they don't crowd…
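A minimal sketch of that setting (pre-4.0 naming, matching the CELERYD_* style in the question; newer Celery spells it `worker_concurrency`):

```python
# celeryconfig.py -- limit the worker pool so queued jobs run
# one at a time (or two at a time with = 2)
CELERYD_CONCURRENCY = 1
```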
I'm attempting to implement the following scenario with Celery: two queues of (the same) long-running tasks, one for "normal" and the other for "idle" priority.
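One way to sketch this, under the assumption that the queue names are "normal" and "idle" and that priority is approximated by dedicating a worker to each queue (old-style settings, to match the django-celery era of these questions):

```python
# celeryconfig.py -- two queues for the same long-running task type
CELERY_CREATE_MISSING_QUEUES = True  # let workers declare queues on demand

# Pick the queue per job at call time rather than at task declaration:
#   some_long_task.apply_async(args=[...], queue='normal')
#   some_long_task.apply_async(args=[...], queue='idle')
#
# Then run one worker per queue, OS-nicing the idle one:
#   celery worker -Q normal
#   nice -n 10 celery worker -Q idle
```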
I use Celery, Django-Celery, and RabbitMQ. I can see all my tasks in the Django admin page, but at the moment it has just a few states, like:
I'm planning on deploying a dynamic site that needs certain tasks to be done periodically in the background, let's say every hour or two. The data that I need to output depends strictly on the…
I want to run a Django-Celery task with manual transaction management, but it seems that the decorators do not stack.
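Decorators do stack, but the order matters: the one written closest to the function wraps it first. A toy sketch with stand-ins for `@task` and `@transaction.commit_manually` (the real decorators are not used here, only the stacking mechanics):

```python
def task(fn):
    # stand-in for celery's @task
    def wrapper(*args, **kwargs):
        return ('task', fn(*args, **kwargs))
    return wrapper

def commit_manually(fn):
    # stand-in for Django's @transaction.commit_manually
    def wrapper(*args, **kwargs):
        return ('txn', fn(*args, **kwargs))
    return wrapper

@task
@commit_manually  # innermost: wraps the function body first
def job():
    return 'body'

# job() returns ('task', ('txn', 'body')) -- @task sees the
# transaction-wrapped callable, so keep @task outermost.
```

With the order reversed, the transaction decorator would wrap Celery's task object instead of the function body, which is the usual cause of the two "not stacking".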
I am trying to set up Django with Celery so I can send bulk emails in the background. I am a little confused about how the different components play into Celery. Do I need to use RabbitMQ? Can I just…
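RabbitMQ is the usual broker but not strictly required; with django-celery you can point the broker at the Django database instead. A sketch, fine for low-volume background email, assuming the `django://` transport (django-kombu era) is available:

```python
# settings.py -- broker choice sketch
import djcelery
djcelery.setup_loader()

BROKER_URL = 'django://'   # database-backed broker, no RabbitMQ needed
# BROKER_URL = 'amqp://guest:guest@localhost:5672//'   # or RabbitMQ
```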
I started playing around with Celery and RabbitMQ this morning and defined some basic tasks to see how performance would improve on my server.
I found that I can set the task to run at a specific interval at specific times from here, but that was only done during task declaration. How do I set a task to run periodically, dynamically?
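With django-celery you can move scheduling out of the task declaration and into the database, then add or change schedules at runtime. A sketch; the task path and name below are made up, and it assumes `CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'` is set and celerybeat is running:

```python
# Run from a view or a Django shell to add a schedule on the fly
from djcelery.models import IntervalSchedule, PeriodicTask

schedule = IntervalSchedule.objects.create(every=2, period='hours')
PeriodicTask.objects.create(
    name='refresh-report',        # hypothetical label
    task='myapp.tasks.refresh',   # hypothetical task path
    interval=schedule,
)
```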
I have two different Django projects, say projA and projB; each has its own Celery daemon running on a separate queue but the same vhost. projA has a task taskA and projB has a task taskB. I try to run t…
I am trying to do a transactional task whereby the task will roll back the database updates if it fails to send the email.
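A library-agnostic sketch of that all-or-nothing shape: apply the updates, try to send the email, and undo the updates if sending fails. In Django the same shape comes from wrapping the task body in `transaction.atomic()` (raise inside the block and the writes are rolled back); the helper below only illustrates the control flow:

```python
def run_with_rollback(apply_updates, undo_updates, send_email):
    """Apply DB updates, then send; undo the updates if sending fails."""
    apply_updates()
    try:
        send_email()
    except Exception:
        undo_updates()  # put the database back the way it was
        raise
```

The key point is that the send happens inside the same try/except (or atomic block) as the writes, so a mail failure propagates before anything is committed.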