
celerybeat

Correct setup of Django, Redis, Celery and Celery Beat

I have been trying to set up django + celery + redis + celery_beats but it is giving me trouble. The documentation is quite straightforward, but when I run the Django server, redis, celery and celery beats, nothing gets printed or logged (all my test task does is log something). This is my folder structure: - aenima - aenima - __init__.py - celery.py - criptoball - tasks.py celery.py looks like this: from __future__ import absolute_import, unicode_literals import os from django.conf import settings from celery import Celery # set the default Django settings module for the 'celery' program. os
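For comparison, here is a minimal sketch of the Django + Celery wiring the docs describe, assuming the project package is named aenima as in the excerpt (this is not the asker's actual file):

    # aenima/celery.py
    from __future__ import absolute_import, unicode_literals
    import os
    from django.conf import settings
    from celery import Celery

    # Point Celery at the Django settings before creating the app.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'aenima.settings')

    app = Celery('aenima')
    # Read CELERY* settings from django.conf.settings.
    app.config_from_object('django.conf:settings')
    # Find tasks.py modules in installed apps (e.g. criptoball/tasks.py).
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

The matching aenima/__init__.py normally imports this app (from .celery import app as celery_app) so it is loaded whenever Django starts.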

2021-06-13 14:39:44    Category: Q&A    python   django   celery   celerybeat

Django/Celery multiple queues on localhost - routing not working

Problem: I followed the celery docs to define 2 queues on my dev machine. My celery settings: CELERY_ALWAYS_EAGER = True CELERY_TASK_RESULT_EXPIRES = 60 # 1 mins CELERYD_CONCURRENCY = 2 CELERYD_MAX_TASKS_PER_CHILD = 4 CELERYD_PREFETCH_MULTIPLIER = 1 CELERY_CREATE_MISSING_QUEUES = True CELERY_QUEUES = ( Queue('default', Exchange('default'), routing_key='default'), Queue('feeds', Exchange('feeds'), routing_key='arena.social.tasks.#'), ) CELERY_ROUTES = { 'arena.social.tasks.Update': { 'queue': 'fs_feeds', }, } I opened two terminal windows in my project's virtualenv and ran the following commands: terminal_1$ celery -A arena worker -Q default -B -l debug --purge -n deafult_worker terminal_2$
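For readability, the settings quoted above would normally sit in the Django settings module along these lines (a sketch only; the mismatch between the routed queue 'fs_feeds' and the declared queue 'feeds' is reproduced exactly as it appears in the question and is the likely source of the routing problem):

    # settings.py (relevant excerpt)
    from kombu import Exchange, Queue

    CELERY_CREATE_MISSING_QUEUES = True
    CELERY_QUEUES = (
        Queue('default', Exchange('default'), routing_key='default'),
        # Tasks whose routing key matches arena.social.tasks.# land here.
        Queue('feeds', Exchange('feeds'), routing_key='arena.social.tasks.#'),
    )
    CELERY_ROUTES = {
        # Routes to a queue name that is not declared above.
        'arena.social.tasks.Update': {'queue': 'fs_feeds'},
    }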

2021-06-11 20:21:26    Category: Tech Share    python   django   celery   celerybeat

Django Celery Scheduling a manage.py command

I need to update the solr index on a schedule with the command: (env)$ ./manage.py update_index I've looked through the Celery docs and found info on scheduling, but haven't been able to find a way to run a Django management command on a schedule and inside a virtualenv. Would this be better run on a normal cron? And if so how would I run it inside the virtualenv? Anyone have experience with this? Thanks for the help!
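One widely used approach (a sketch, not from the question; the module, task and schedule names here are made up) is to wrap the management command in a Celery task and let celery beat schedule it, so it runs inside whatever virtualenv the worker already runs in:

    # myapp/tasks.py (hypothetical module)
    from celery import shared_task
    from django.core.management import call_command

    @shared_task
    def update_search_index():
        # Equivalent to running "./manage.py update_index" in the worker's environment.
        call_command('update_index')

    # settings.py
    from datetime import timedelta
    CELERYBEAT_SCHEDULE = {
        'update-index-nightly': {
            'task': 'myapp.tasks.update_search_index',
            'schedule': timedelta(hours=24),
        },
    }

The other common answer is a plain cron entry that invokes the virtualenv's python or manage.py by its full path.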

2021-06-10 14:31:59    Category: Q&A    django   celery   django-celery   celerybeat

celery beat schedule: run task instantly when celery beat starts?

If I create a celery beat schedule using timedelta(days=1), the first task will be carried out after 24 hours. To quote the celery beat documentation: Using a timedelta for the schedule means the task will be sent in 30 second intervals (the first task will be sent 30 seconds after celery beat starts, and then every 30 seconds after the last run). But the fact is that in a lot of situations it's actually important that the scheduler run the task at launch. I didn't find an option that allows me to run the task immediately after celery starts; am I not reading carefully, or is celery missing
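A workaround that is often suggested for this (a sketch under the assumption that an immediate first run is acceptable; the task import is hypothetical) is to fire the task once from the beat_init signal and let the timedelta schedule handle every run after that:

    from celery.signals import beat_init
    from myapp.tasks import my_daily_task  # hypothetical periodic task

    @beat_init.connect
    def kick_off_first_run(sender=None, **kwargs):
        # Runs once when celery beat starts, before the first scheduled interval elapses.
        my_daily_task.delay()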

2021-06-01 23:05:04    Category: Q&A    python   celery   celerybeat

celerybeat - multiple instances & monitoring

I have an application built using celery, and recently we got a requirement to run certain tasks on a schedule. I think celerybeat is perfect for this, but I have a few questions: Is it possible to run multiple celerybeat instances, so that tasks are not duplicated? How to make sure that celerybeat is always up & running? So far I read this: https://github.com/celery/celery/issues/251 and https://github.com/ybrs/single-beat It looks like a single instance of celerybeat should be running. I'm running the application inside AWS Elastic Beanstalk docker containers and celery workers are also docker

2021-05-24 23:37:24    Category: Q&A    python   python-2.7   celery   celerybeat

Django/Celery multiple queues on localhost - routing not working

I followed the celery docs to define 2 queues on my dev machine. My celery settings: CELERY_ALWAYS_EAGER = True CELERY_TASK_RESULT_EXPIRES = 60 # 1 mins CELERYD_CONCURRENCY = 2 CELERYD_MAX_TASKS_PER_CHILD = 4 CELERYD_PREFETCH_MULTIPLIER = 1 CELERY_CREATE_MISSING_QUEUES = True CELERY_QUEUES = ( Queue('default', Exchange('default'), routing_key='default'), Queue('feeds', Exchange('feeds'), routing_key='arena.social.tasks.#'), ) CELERY_ROUTES = { 'arena.social.tasks.Update': { 'queue': 'fs_feeds', }, } I opened two terminal windows in my project's virtualenv and ran the following commands: terminal_1$

2021-05-24 20:39:32    Category: Q&A    python   django   celery   celerybeat

Celery Beat: Limit to single task instance at a time

Problem: I have celery beat and celery (four workers) doing some processing steps. One of those tasks is roughly along the lines of "for each X that hasn't had a Y created, create a Y." The task runs periodically at a semi-rapid rate (10 seconds) and completes quickly. There are other tasks going on as well. I've run into the issue multiple times where the beat tasks apparently become backlogged, so the same task (from different beat times) executes simultaneously, causing incorrectly duplicated work. It also appears that the tasks are executed out of order. Is it possible to limit celery beat to ensure only one outstanding instance of a task at a time? Is setting something like rate_limit=5 on the task the "correct" way to do this? Is it possible to ensure that beat tasks are executed in order, e.g. instead of dispatching a task, beat adds it to a task chain? What's the best way of handling this, short of making those tasks themselves execute atomically and be safe to run concurrently? That is not a restriction I would have expected of beat tasks... The task itself is defined naively: @periodic_task(run_every=timedelta(seconds=10)) def add_y_to_xs(): # Do things in a database return Here is an actual (cleaned) log: [00:00.000] foocorp.tasks.add_y_to_xs sent. id->#1 [00:00.001] Task received: foocorp.tasks.add_y_to_xs [#1] [00:10.009] foocorp.tasks.add_y_to_xs sent. id->#2
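A pattern that comes up a lot for this situation (a sketch only, assuming Celery 3.x with the @periodic_task decorator as in the excerpt, and a cache or other shared store with an atomic add/SETNX available to all workers; django.core.cache here is an assumption, not something the question mentions) is to guard the task body with a short-lived lock so an overlapping dispatch becomes a no-op instead of duplicated work:

    from datetime import timedelta
    from django.core.cache import cache  # assumed shared cache backend (memcached/redis)
    from celery.task import periodic_task  # Celery 3.x-style decorator, as in the question

    LOCK_EXPIRE = 60  # seconds; should comfortably exceed the task's longest run time

    @periodic_task(run_every=timedelta(seconds=10))
    def add_y_to_xs():
        # cache.add only succeeds if the key does not already exist, so only one
        # concurrently running instance gets past this check.
        if not cache.add('lock:add_y_to_xs', 'locked', LOCK_EXPIRE):
            return
        try:
            pass  # Do things in a database
        finally:
            cache.delete('lock:add_y_to_xs')

Whether cache.add is truly atomic depends on the backend, so this is a mitigation rather than a hard guarantee.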

2021-05-24 16:44:32    Category: Tech Share    python   concurrency   rabbitmq   celery   celerybeat

Celery beat with method tasks not working

Problem: I am trying to run celerybeat on method tasks, but it doesn't work. Here is a sample setup: from celery.contrib.methods import task_method from celery import Celery, current_app celery = Celery('tasks', broker='amqp://guest@localhost//') celery.config_from_object("celeryconfig") class X(object): @celery.task(filter=task_method, name="X.ppp") def ppp(self): print "ppp" And my celeryconfig.py file is: from datetime import timedelta CELERYBEAT_SCHEDULE = { 'test' : { 'task' : 'X.ppp', 'schedule' : timedelta(seconds=5) }, } When I run celery beat, I get an error like: task X.ppp raised exception, TypeError('ppp() takes exactly 1 argument, (0 given) When I convert the method to a normal function and decorate it with @celery
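The TypeError likely comes from the scheduler invoking the method with no instance to bind self to. One commonly suggested workaround (a sketch, not from the question) is to schedule a plain module-level task that constructs the object and calls the method itself, so beat never has to supply self:

    from celery import Celery

    celery = Celery('tasks', broker='amqp://guest@localhost//')
    celery.config_from_object('celeryconfig')

    class X(object):
        def ppp(self):
            print("ppp")

    @celery.task(name='X.ppp')
    def ppp_task():
        # Beat schedules this wrapper under the name 'X.ppp'; the wrapper provides the instance.
        X().ppp()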

2021-05-18 02:57:40    Category: Tech Share    python   celery   celerybeat

How to programmatically generate celerybeat entries with celery and Django

I am hoping to be able to programmatically generate celerybeat entries and resync celerybeat when entries are added. The docs here state: By default the entries are taken from the CELERYBEAT_SCHEDULE setting, but custom stores can also be used, like storing the entries in an SQL database. So I am trying to figure out which classes I need to extend to be able to do this. I have been looking at the celery scheduler docs and the djcelery API docs, but the documentation on what some of these methods do is non-existent, so I'm about to dive into some source and was just hoping someone could point me in the right
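For reference, the custom store the docs allude to is usually django-celery's database scheduler, where beat entries are ordinary Django model rows you can create from code (a sketch assuming the djcelery package; the entry and task names are illustrative):

    from djcelery.models import IntervalSchedule, PeriodicTask

    # Create (or reuse) a 10-second interval and attach a new beat entry to it.
    schedule, _ = IntervalSchedule.objects.get_or_create(every=10, period='seconds')
    PeriodicTask.objects.create(
        name='sync-feed-123',            # must be unique per entry
        task='myapp.tasks.sync_feed',    # hypothetical task path
        interval=schedule,
    )

Running beat with CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler' makes the scheduler read these rows instead of a static CELERYBEAT_SCHEDULE, and it is designed to pick up changes to them.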

2021-05-16 14:50:55    Category: Q&A    django   celery   django-celery   celerybeat

Work around celerybeat being a single point of failure

I'm looking for a recommended solution to work around celerybeat being a single point of failure for a celery/rabbitmq deployment. I didn't find anything that made sense so far by searching the web. In my case, once a day a timed scheduler kicks off a series of jobs that could run for half a day or longer. Since there can only be one celerybeat instance, if something happens to it or to the server it's running on, critical jobs will not be run. I'm hoping there is already a working solution for this, as I can't be the only one who needs a reliable (clustered or the like) scheduler. I don't want to

2021-05-16 13:11:16    Category: Q&A    celery   celerybeat