Orchestrating data in the mesh of the fragmented modern data stack
The fragmented modern data stack has emerged as the unbundling of Airflow: various tools now operate in silos. Dagster, a next-generation data orchestrator, lets you clearly see the data dependencies between the individual pipelines on your data factory floor. In this blog post series about Dagster I will cover:
- Getting started with Dagster and building simple data pipelines
- How software-defined assets turn data pipelines around and lead to higher quality, by integrating data quality tests straight into the pipelines and by separating business logic from infrastructure for better testability (a minimal code sketch follows this list)
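As a teaser for the posts on software-defined assets, here is a minimal sketch of how asset dependencies become explicit in Dagster. It assumes a recent Dagster release with the `@asset` decorator and the `materialize` helper; the asset names and the toy data are made up for illustration:

```python
from dagster import asset, materialize


@asset
def raw_orders() -> list[dict]:
    # In a real pipeline this would pull from an API, a file, or a warehouse.
    return [{"id": 1, "amount": 42.0}, {"id": 2, "amount": -5.0}]


@asset
def cleaned_orders(raw_orders: list[dict]) -> list[dict]:
    # Declaring raw_orders as a parameter makes the data dependency explicit;
    # Dagster renders it as an edge in the asset graph in the UI.
    return [order for order in raw_orders if order["amount"] > 0]


if __name__ == "__main__":
    # Materialize both assets in-process for quick local testing.
    result = materialize([raw_orders, cleaned_orders])
    assert result.success
```

Because the assets are plain Python functions, the business logic can be unit-tested by calling them directly, without any orchestration infrastructure running.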
URLs for reference: