Using Airflow with Dagster

Looking to move off Apache Airflow? Looking to run both platforms and incrementally adopt Dagster? We have you covered.

About this integration

Because Airflow is often embedded within the data platform, migrating off it is not a straightforward affair. For many organizations, it is preferable to accept the technical debt and let legacy pipelines run on Airflow.

But those same teams want to protect their future investments by building any new pipelines on a more contemporary platform like Dagster, where they can enjoy the full-development-lifecycle benefits that Dagster provides.

To facilitate this, the dagster-airflow package allows you to pick your preferred approach:

Run Dagster Assets from Airflow

  • Orchestrate Dagster from Airflow
  • Export Dagster jobs as Airflow DAGs

Run Airflow DAGs from Dagster

  • Import Airflow DAGs into Dagster jobs
  • Port Airflow datasets to Dagster assets

With this flexibility, teams can migrate to Dagster over time, prioritizing critical parts of the data platform without having to do an entire lift-and-shift migration.


pip install dagster-airflow


from dagster_airflow import make_dagster_job_from_airflow_dag

# Import an Airflow DAG from your existing project
from my_airflow_project import my_airflow_dag

# Run your Airflow DAG on Dagster
my_job = make_dagster_job_from_airflow_dag(my_airflow_dag)

About Airflow

Airflow is an open-source platform used to programmatically author, schedule, and monitor workflows. Many organizations are now looking to migrate off Airflow due to its intrinsic shortcomings.