Celery + Docker | Dagster Integrations
Scale up execution with the power of Docker

Launches Celery-based tasks in Docker containers.

About this integration

This integration provides a Celery-based executor that launches each Dagster task in its own Docker container.

Installation

pip install dagster-celery-docker

Example

# Materialize your assets with Docker and Celery
# Read the docs on Executors to learn more: https://docs.dagster.io/deployment/executors

import os

from dagster import define_asset_job
from dagster_celery_docker.executor import celery_docker_executor

executor = celery_docker_executor.configured({
  'docker': {
    'image': 'my_repo.com/image_name:latest',
    'registry': {
      'url': 'my_repo.com',
      'username': 'my_user',
      'password': os.environ['DOCKER_REGISTRY_PASSWORD'],
    },
    'env_vars': ['DAGSTER_HOME'], # environment variables forwarded from the Celery worker to the Docker container
    'container_kwargs': { # additional keyword arguments passed to the container, for example volume mounts:
      'volumes': ['/home/user1/:/mnt/vol2', '/var/www:/mnt/vol1'],
    },
  },
  'broker': 'pyamqp://guest@localhost//', # The URL of the Celery broker
  'backend': 'rpc://', # The URL of the Celery results backend
  'include': ['my_module'], # Modules every worker should import
})

celery_enabled_job = define_asset_job("celery_enabled_job", executor_def=executor)
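
To make the job loadable by Dagster tools such as dagster dev and the Dagster UI, include it in your code location's Definitions object. A minimal sketch, assuming a placeholder asset named my_asset defined alongside the job:

from dagster import Definitions, asset

@asset
def my_asset():
    # Placeholder asset; replace with your own asset definitions
    ...

defs = Definitions(
    assets=[my_asset],
    jobs=[celery_enabled_job],
)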

About Celery + Docker

Using the Celery executor together with Docker makes it easier to scale an application by distributing its tasks across multiple machines: Celery workers are packaged and run via a Docker image, and each process is isolated in its own container.
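
Note that tasks only run once Celery workers are available to pick them up. As a sketch, assuming the dagster-celery CLI installed by the dagster-celery dependency, a worker can be started against the dagster_celery_docker Celery app like so:

dagster-celery worker start -A dagster_celery_docker.app

The worker must be able to reach the same broker and backend URLs configured on the executor.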