Dagster Deep Dives


The Dagster team walks us through key capabilities of the Dagster framework, with guidance on best practices and hands-on coding examples.

Building a True Data Platform: Beyond the Modern Data Stack

The Modern Data Stack offers countless tools but often creates hard-to-manage pipelines. This talk shows how to transform disjointed tools into a unified, observable data platform with high-quality data by default.

Dagster, SDF, & the Evolution of the Data Platform

The Dagster Labs and SDF teams show how the combined strengths of the two solutions can enhance your developer experience, streamline your data pipelines, reduce costs, and improve data quality and reliability.

Data Quality: Building Reliable Data Platforms

Colton Padden walks us through why, and how, to implement data quality checks in the orchestrator. We discuss implementing data quality checks and building orchestration logic around their outcomes.

Flexible Scheduling with Automation

Pedram Navid shares insights into how to use Dagster's flexible scheduling options to automate and optimize your data platform.

Configuration & Resources

Colton Padden guides us through Dagster's configuration and resource systems, which allow you to develop locally and ship confidently.

Thinking in Partitions

By embracing partitioning, you can speed up execution and greatly reduce the cost of running pipelines, gain fine-grained control over backfills, and observe your data assets at a higher level of granularity.

Dagster and the Data Mesh

Effective collaboration around data requires the right tooling for balancing local autonomy and unified orchestration. Tim walks us through how to set Dagster up to support a ‘Data Mesh’ and other approaches to empowering local data teams.
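One common way to balance local autonomy with unified orchestration is to give each team its own code location and load them all into a single Dagster deployment. A sketch of a `workspace.yaml` (the module names are hypothetical):

```yaml
# workspace.yaml: each team owns and deploys its own code location,
# while one Dagster UI and one asset graph span all of them.
load_from:
  - python_module: analytics_team.definitions
  - python_module: marketing_team.definitions
```

Teams version and release their definitions independently, yet cross-team asset dependencies remain visible and orchestrated in one place.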