Django, Celery and RabbitMQ
12th September, 2017
Running asynchronous tasks using Django and Celery with RabbitMQ
- Install RabbitMQ
# For Mac
$ brew install rabbitmq
$ export PATH=$PATH:/usr/local/sbin

# Start server (Mac); the -detached flag runs the server in the background
$ sudo rabbitmq-server -detached

# To stop the server
$ sudo rabbitmqctl stop

# Add user settings (optional)
$ sudo rabbitmq-server -detached
$ sudo rabbitmqctl add_user myuser mypassword
$ sudo rabbitmqctl add_vhost myvhost
$ sudo rabbitmqctl set_permissions -p myvhost myuser ".*" ".*" ".*"

# For Ubuntu
$ apt-get install -y erlang
$ apt-get install rabbitmq-server

# Then enable and start the RabbitMQ service:
$ systemctl enable rabbitmq-server
$ systemctl start rabbitmq-server

# Check RabbitMQ server status
$ systemctl status rabbitmq-server
- Assuming you have a Django application set up, install Celery and django-celery-beat:
$ pip install celery django-celery-beat
- In your Django settings.py file, add the following
CELERY_BROKER_URL = "amqp://myuser:mypassword@localhost:5672/myvhost"
- Create a Celery instance file called celery.py. Create this file at the same level as the settings.py file and add the following:
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
- Now you need to ensure that this app is loaded when Django starts. To do so, add the following to the project package's __init__.py file (the one in the same directory as settings.py).
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ['celery_app']
- Add 'django_celery_beat' to INSTALLED_APPS in the Django settings.py.
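For reference, INSTALLED_APPS might then look roughly like this (the other entries are just the default Django apps and will differ in your project):

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    # ... your own apps ...
    'django_celery_beat',
]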
- Add the following settings for Celery in the settings.py file.
# Celery settings
# Use the RabbitMQ user and vhost created earlier
CELERY_BROKER_URL = "amqp://myuser:mypassword@localhost:5672/myvhost"

# for security reasons, mention the list of accepted content-types (in this case json)
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

CELERY_TIMEZONE = 'Europe/Berlin'

CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
- Run the migrations for the django_celery_beat app:
$ python manage.py migrate
- Create a periodic task in a tasks.py file in the respective Django app, e.g.:
from celery.task.schedules import crontab
from celery.decorators import periodic_task
from celery import shared_task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)


@shared_task
def test_celery_worker():
    print("Celery Workers are working fine.")


# A periodic task that will run every minute (the symbol "*" means every)
@periodic_task(run_every=(crontab(hour="*", minute="*", day_of_week="*")))
def task_example():
    logger.info("Task started")
    # add code
    logger.info("Task finished")
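Because CELERY_BEAT_SCHEDULER is set to the DatabaseScheduler above, schedules can also be created and managed in the database (for example through the Django admin). A minimal sketch of registering the same one-minute schedule through django_celery_beat's models; the app name myapp is a placeholder for wherever your tasks.py lives:

# Run once, e.g. in the Django shell or a data migration
from django_celery_beat.models import CrontabSchedule, PeriodicTask

# A crontab entry equivalent to "every minute"
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='*', hour='*', day_of_week='*',
    day_of_month='*', month_of_year='*',
)

# Point the schedule at the task by its dotted path
PeriodicTask.objects.get_or_create(
    crontab=schedule,
    name='Run task_example every minute',  # any unique, human-readable name
    task='myapp.tasks.task_example',
)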
- Finally, check that the RabbitMQ server is running and start the Celery worker and beat processes.
# Start the RabbitMQ server (if it is not already running)
$ sudo rabbitmq-server -detached

# Optionally reset the last-run timestamps of the periodic tasks
$ python manage.py shell
>>> from django_celery_beat.models import PeriodicTask
>>> PeriodicTask.objects.update(last_run_at=None)

# Initiating a worker
$ celery -A proj worker -l info -n worker

# Initiating the scheduler
$ celery -A proj beat -l info
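To quickly check that the worker actually receives tasks, you can also trigger the shared task asynchronously from the Django shell. A minimal sketch, assuming the tasks.py above lives in an app called myapp (a placeholder name):

$ python manage.py shell
>>> from myapp.tasks import test_celery_worker
>>> result = test_celery_worker.delay()  # send the task to RabbitMQ; the worker executes it
>>> result.id  # the task id; the worker's log should show the print output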
- At the same time, run the Django development server in another terminal.
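That is, the standard runserver command:

$ python manage.py runserver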