


In this Deep Dive, we are joined by Dagster Data Engineer Nick Roach, who showcases how we run event-driven pipelines with Apache Kafka and Flink.
At Dagster, we process millions of events a day from Dagster+. This event data tracks credit usage, powers tools like Insights, and gives us a view into how organizations use our platform. It arrives at such velocity that we rely on stream processing tools like Apache Kafka and Apache Flink to capture every event efficiently. In this webinar, we'll show you how we integrate these tools into our own data platform, use Dagster's observability features to monitor the streaming pipeline, and ingest the data into our internal data warehouse.
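
As a taste of what the webinar covers, here is a minimal sketch of what a Kafka-backed Dagster asset can look like. The broker address, topic name, consumer group, and batching logic below are illustrative placeholders, not our production setup, and the Flink processing step is omitted for brevity.

```python
import json

from confluent_kafka import Consumer
from dagster import AssetExecutionContext, MaterializeResult, asset


@asset
def dagster_plus_events(context: AssetExecutionContext) -> MaterializeResult:
    """Drain a batch of usage events from Kafka for warehouse ingestion."""
    consumer = Consumer(
        {
            # Placeholder broker and consumer group -- not our production config.
            "bootstrap.servers": "localhost:9092",
            "group.id": "dagster-plus-ingest",
            "auto.offset.reset": "earliest",
        }
    )
    consumer.subscribe(["dagster-plus-events"])  # hypothetical topic name

    events = []
    try:
        # Poll until the topic goes quiet; a real pipeline would bound
        # this by time or message count instead.
        while (msg := consumer.poll(timeout=5.0)) is not None:
            if msg.error():
                context.log.warning(f"Kafka error: {msg.error()}")
                continue
            events.append(json.loads(msg.value()))
    finally:
        consumer.close()

    # A warehouse writer would go here, e.g. a bulk insert of `events`.
    context.log.info(f"Consumed {len(events)} events")
    return MaterializeResult(metadata={"num_events": len(events)})
```

Wrapping the consumer in an asset like this is what lets Dagster's observability features (run logs, materialization metadata, Insights) monitor the stream alongside the rest of the platform.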



