Bridging Business Intelligence and Data Orchestration with Dagster + Sigma

November 14, 2024
Break down the silos between data engineering and BI tools

Traditionally, a wall has separated data engineering tools from BI tools, leaving teams with little visibility into the dependencies and lineage that connect upstream data sources to downstream BI dashboards and views. The result is stale insights, inaccurate reports, and redundant work, which lowers the quality of decision-making and slows down business productivity.

What's causing it?

The traditional silos between data engineering and BI tools make it difficult to trace data from raw sources to the BI dashboards that drive business decisions.

Announcing the New Dagster + Sigma Integration

To break down the traditional barriers between data engineering and BI tools, Dagster Labs and Sigma Computing have launched a powerful new integration that enables data teams to easily track, orchestrate, and analyze their data flows end-to-end.

Now generally available (GA), this integration creates a unified lineage from upstream data managed in Dagster to BI insights generated in Sigma, ensuring data flows smoothly across all stages of the pipeline.

With this integration, your Sigma datasets and workbooks become assets in the Dagster asset graph. Data teams can easily track dependencies and lineage between Sigma and upstream Dagster-managed data, gaining a comprehensive view of data pipelines that now include BI assets - and giving data platform teams the visibility and control to understand, monitor, and manage their data flows.

The Dagster asset graph with Sigma datasets and workbooks

Here's what you can expect:

  • Improved Data Lineage and Visibility: By representing Sigma objects in Dagster's asset graph, data teams gain a comprehensive view of their data flows. This makes it easy for data engineers and business analysts to track how data moves from its source through transformation stages to the dashboards and reports that drive business decisions.
  • End-to-End Data Pipeline Orchestration: With Sigma integrated into the Dagster environment, users can build complete end-to-end pipelines encompassing data processing and analytics. This means upstream data changes can automatically trigger updates in Sigma, ensuring that BI insights remain accurate and timely without manual intervention.
  • Enhanced Collaboration Across Teams: The integration fosters collaboration between data analysts and upstream practitioners by providing a shared view of data lineage and dependencies. When changes occur upstream, analysts can quickly assess potential impacts on Sigma reports and dashboards.

Key Benefits

By connecting Dagster's orchestration power with Sigma's user-friendly analytics and BI platform, data teams can unlock a range of benefits:

  • Operational Efficiency: Automating the refresh and orchestration of Sigma BI assets reduces manual processes, leading to more efficient workflows and up-to-date analytics.
  • Greater Scalability: The integration's end-to-end pipeline visibility and automated updates support data platform scalability, allowing teams to build a unified data ecosystem that grows with their needs.
  • Centralized Data Management: For data platform teams, Dagster becomes a one-stop shop for managing, monitoring, and orchestrating data assets, including those used in Sigma's analytics and BI platform.

Practical Use Cases

  • For Data Engineers: Data engineers can use this integration to create pipelines that cover the entire data journey, from ingestion to transformation to Sigma dashboards. This setup allows them to trigger automatic Sigma dashboard refreshes in response to upstream changes.
  • For Analysts and Business Users: With data flowing smoothly from Dagster's pipelines into Sigma's workbooks and dashboards, analysts can be confident that their reports are always based on the latest data. This minimizes the risk of stale insights and ensures business users make decisions based on accurate, up-to-date information.
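To make the refresh-on-upstream-change idea concrete, here is a minimal, illustrative sketch in plain Python (not Dagster or Sigma APIs; the dependency map and names are hypothetical) of how lineage information can be used to decide which workbooks need refreshing when upstream tables change:

```python
# Hypothetical dependency map: each Sigma workbook lists the upstream
# tables it is built on. In practice, Dagster derives this lineage for you.
WORKBOOK_DEPS = {
    "sales_dashboard": {"orders", "customers"},
    "inventory_report": {"stock_levels"},
}

def workbooks_to_refresh(changed_tables):
    """Return the workbooks whose upstream tables have changed."""
    changed = set(changed_tables)
    return {
        workbook
        for workbook, deps in WORKBOOK_DEPS.items()
        if deps & changed
    }

print(workbooks_to_refresh(["orders"]))  # {'sales_dashboard'}
```

With the dependency map tracked in one place, an upstream change only triggers the workbooks that actually depend on it, rather than refreshing everything.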

How It Works

The Dagster + Sigma integration is built to fit easily into existing workflows by representing Sigma data models and workbooks as assets in the Dagster asset graph. This setup provides a live view of data dependencies and relationships across the entire data lifecycle, allowing easier tracking and management.

Dagster uses assets to represent Sigma data models and workbooks, linking them to upstream data pipelines managed in Dagster. As a result, changes in data sources automatically flow through to Sigma, allowing BI dashboards to stay in sync without manual updates or redundant processes. The only prerequisite for this automated end-to-end experience is that materializations are first created in the Sigma workbook.

Sigma data asset information in Dagster

How to Get Started

Getting started with the Dagster + Sigma integration is straightforward.

Simply install the dagster-sigma package, configure it to connect with your Sigma organization, and you're ready to start orchestrating Sigma assets alongside the rest of your data platform.

  import dagster as dg
  from dagster_sigma import SigmaBaseUrl, SigmaOrganization, load_sigma_asset_specs

  # Connect to your Sigma organization; credentials are read from environment variables.
  sigma_organization = SigmaOrganization(
      base_url=SigmaBaseUrl.AWS_US,
      client_id=dg.EnvVar("SIGMA_CLIENT_ID"),
      client_secret=dg.EnvVar("SIGMA_CLIENT_SECRET"),
  )

  # Load Sigma datasets and workbooks as asset specs in the Dagster asset graph.
  sigma_specs = load_sigma_asset_specs(sigma_organization)

  defs = dg.Definitions(assets=[*sigma_specs], resources={"sigma": sigma_organization})

For more details, refer to the setup doc or the Dagster Sigma API documentation.

Try it out today, experience the power of orchestrated business intelligence, and tell us what you think!

Have feedback or questions? Start a discussion in Slack or GitHub.
