How to run Django, Celery, and Flower in a single Docker Compose file and CD to AWS

Jan 30, 2026

In this guide, we’ll walk through setting up a robust Django application environment with Docker Compose. We’ll include Celery for background tasks and Flower for monitoring, all running from a single orchestration file.

Why Docker Compose for Django?

Docker Compose simplifies the management of multi-container applications. Instead of installing and wiring up each service by hand, you declare your entire stack in a single docker-compose.yml file.

The Stack

  • Django: Our core web application.
  • PostgreSQL: The database.
  • Redis: The message broker for Celery.
  • Celery: Distributed task queue.
  • Flower: Real-time monitoring tool for Celery.

Docker Compose Configuration

Here is a snippet of how your docker-compose.yml might look:

services:
  db:
    image: postgres:15
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres

  redis:
    image: redis:7-alpine

  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=postgres://postgres:postgres@db:5432/postgres
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - db
      - redis

  worker:
    build: .
    command: celery -A myproject worker -l info
    environment:
      - DATABASE_URL=postgres://postgres:postgres@db:5432/postgres
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - db
      - redis

  flower:
    image: mher/flower
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
    ports:
      - "5555:5555"
    depends_on:
      - redis

Continuous Deployment to AWS

Once your containers are ready, you can automate deployment with GitHub Actions, targeting AWS Elastic Beanstalk or ECS.

  1. Build Images: Build your Docker images in the CI pipeline.
  2. Push to ECR: Push images to Amazon Elastic Container Registry (ECR).
  3. Deploy: Update your ECS task definition or Beanstalk environment.
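The steps above can be sketched as a GitHub Actions workflow. The repository, cluster, service names, and region below are placeholders for your own values, and the example takes the ECS path (Beanstalk would swap the last step for an `eb deploy` or an application-version update):

```yaml
# .github/workflows/deploy.yml — names and region are placeholders.
name: Deploy
on:
  push:
    branches: [main]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - id: login-ecr
        uses: aws-actions/amazon-ecr-login@v2

      - name: Build and push image
        env:
          REGISTRY: ${{ steps.login-ecr.outputs.registry }}
        run: |
          docker build -t "$REGISTRY/myproject:${{ github.sha }}" .
          docker push "$REGISTRY/myproject:${{ github.sha }}"

      - name: Redeploy ECS service
        run: |
          aws ecs update-service --cluster myproject-cluster \
            --service myproject-service --force-new-deployment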

Stay tuned for more deep dives into AWS infrastructure!