Picking up from Django app template w/ Docker, here are the steps to add Celery to the Django app.
Add RabbitMQ as the message queue
Modify `docker-compose.yml` to include:
```yaml
services:
  ...
  rabbitmq:
    image: rabbitmq:3-management
    ports:
      - "15672:15672"
    expose:
      - "15672"
```
I use the “3-management” tag so that it includes the management plugin (accessible at http://localhost:15672/). However, simpler tags (e.g. “3” or “latest”) can be used if the management UI is not needed.
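One thing worth noting (an aside, not part of the original setup): Celery talks to RabbitMQ on the AMQP port, 5672 by default, not the management port. Within the compose network, other containers can already reach `rabbitmq:5672` without any `ports`/`expose` entries; publishing 5672 is only needed if a client on the host machine should connect:

```yaml
rabbitmq:
  image: rabbitmq:3-management
  ports:
    - "15672:15672"  # management UI on the host
    - "5672:5672"    # optional: AMQP access from the host only
```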
Install Celery in the Django project
```shell
docker-compose run --rm app /bin/bash
...
pip install celery
pip freeze > requirements.txt
exit
```

(Note: `pip freeze -r requirements.txt > requirements.txt` would not work as intended, because the shell truncates `requirements.txt` before `pip` gets to read it.)
Rebuild the container (`docker-compose build`) so the image includes Celery and the packages it pulls in.
Add a couple of files to set up Celery in our Django project
The Celery pieces will be added into the `myapp` Django app.
`myapp/celery.py`
```python
from celery import Celery

app = Celery(
    'celery_app',
    broker='amqp://rabbitmq',
    # backend='rpc://',
    # This should include modules/files that define tasks. This is a list of strs
    # to be evaluated later in order to get around circular dependencies, I suspect.
    include=[
        'myapp.tasks',  # This is our file containing our task
    ]
)

# Optional configuration, see the application user guide.
app.conf.update(result_expires=3600)

if __name__ == '__main__':
    app.start()
```
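As the comment above suggests, the `include` entries are plain strings: Celery imports them only when the worker starts, after the `app` object already exists, which is what sidesteps the circular import (`myapp.tasks` itself imports the app). A minimal sketch of the deferred import-by-name idea, using a stdlib module as a stand-in:

```python
import importlib

# Module paths are kept as strings (like Celery's include=[...]) and
# resolved only later, once the importing side is fully initialized.
include = ["json"]  # stand-in for ["myapp.tasks"]

modules = [importlib.import_module(name) for name in include]
print(modules[0].__name__)  # json
```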
`myapp/tasks.py`
```python
import logging

from myapp.celery import app

logger = logging.getLogger(__name__)

# This is where Celery tasks are defined
@app.task
def add(x: int, y: int) -> int:
    logger.info(f"add({x}, {y}) called.")
    return x + y
```
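Once everything is up, the task can be dispatched from Django code (a view, or `manage.py shell`) with `add.delay(...)`; calling the task directly runs it synchronously as a plain function, which is handy for quick checks. A sketch — the `.delay()`/`.get()` lines assume the worker and RabbitMQ containers from this post are running:

```python
# Broker-backed path (requires the stack to be up, so commented out here):
#   from myapp.tasks import add
#   async_result = add.delay(2, 3)   # enqueues a message on RabbitMQ
#   async_result.get(timeout=10)     # blocks until a worker returns the result
#
# Direct call: a Celery task is still callable like an ordinary function,
# bypassing the broker entirely.
def add(x: int, y: int) -> int:  # same body as the task above
    return x + y

print(add(2, 3))  # 5
```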
Add a Celery service to docker-compose.yml
Modify `docker-compose.yml` again to add a Celery service. This can be done together with the RabbitMQ service above, but it is shown here separately for readability.
```yaml
services:
  ...
  rabbitmq:
    ...
  app-celery:
    build: .
    environment:
      - DJANGO_SETTINGS_MODULE=myapp.settings
    command: >
      sh -c "celery -A myapp.celery worker --loglevel=INFO"
    volumes:
      - ./:/code
    depends_on:
      rabbitmq:
        condition: service_started
```
Things to watch out for
A bunch of things to highlight to show where the connection points are:
- The broker URL when instantiating the Celery app is `amqp://rabbitmq` (not `amqp://localhost`) because that’s how networking in Docker works. The “rabbitmq” in this case is the name of the service we use for the RabbitMQ container. So if a different service name is used, this AMQP URL needs to use that corresponding name.
- The Celery app parameter (`-A myapp.celery`) is the path to the `myapp/celery.py` file where the Celery app (`app = Celery('celery_app', ...)`) is created.
- Speaking of which, when defining the Celery app, its `include=[ ... ]` should contain `str` values that point to the modules where Celery tasks are defined.
- And the task files that define the Celery tasks need to import the Celery app and use its `@app.task` decorator for the task functions.
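To make the first point concrete: the host part of the broker URL is exactly the compose service name, which Docker’s embedded DNS resolves to the container’s address. A quick look at what gets parsed out of the URL:

```python
from urllib.parse import urlparse

# The hostname must match the compose service name ("rabbitmq"),
# since that is the name Docker's network resolves.
broker = "amqp://rabbitmq"
parsed = urlparse(broker)
print(parsed.scheme)    # amqp
print(parsed.hostname)  # rabbitmq
```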
Complete docker-compose.yml
The entire file looks like:
```yaml
services:
  app:
    build: .
    command: >
      sh -c "python manage.py migrate &&
             python manage.py runserver 0.0.0.0:8000"
    ports:
      - "8000:8000"
    expose:
      - "8000"
    volumes:
      - ./:/code
    depends_on:
      rabbitmq:
        condition: service_started
    tty: true
    stdin_open: true
  app-celery:
    build: .
    environment:
      - DJANGO_SETTINGS_MODULE=myapp.settings
    command: >
      sh -c "celery -A myapp.celery worker --loglevel=INFO"
    volumes:
      - ./:/code
    depends_on:
      rabbitmq:
        condition: service_started
  rabbitmq:
    image: rabbitmq:3-management
    ports:
      - "15672:15672"
    expose:
      - "15672"
```