Fair point. Databricks owns a scheduling DAG (Workflows, DLT). What I meant by "owns the DAG" is the semantic DAG: model-to-model dependencies with column-level types that the compiler builds.
Workflows knows task A runs before task B. Rocky knows `dim_customer.email` flows from `raw_users.email_address` through three CTEs in `stg_customers`. Different layer, same word.
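To make the "different layer" point concrete, here's a minimal sketch of the two graphs side by side. All names here (`task_dag`, `column_lineage`, `trace`) are illustrative assumptions, not Rocky's actual API:

```python
# Hypothetical sketch -- illustrative names, not Rocky's real data structures.

# Task-level DAG: what a scheduler like Databricks Workflows sees.
# It only knows stg_customers must run before dim_customer.
task_dag = {
    "stg_customers": ["raw_users"],
    "dim_customer": ["stg_customers"],
}

# Column-level DAG: what a semantic compiler sees.
# Each (model, column) maps to the (model, column) pairs it derives from.
column_lineage = {
    ("dim_customer", "email"): [("stg_customers", "email")],
    ("stg_customers", "email"): [("raw_users", "email_address")],
}

def trace(col, lineage):
    """Walk column lineage back to its ultimate source columns."""
    parents = lineage.get(col)
    if not parents:
        return [col]  # no parents recorded: this is a source column
    sources = []
    for parent in parents:
        sources.extend(trace(parent, lineage))
    return sources

print(trace(("dim_customer", "email"), column_lineage))
# [('raw_users', 'email_address')]
```

The task DAG answers "what runs when"; the column DAG answers "where did this value come from", which is what makes things like type checking and impact analysis possible.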
I'll be more careful with that framing.
> I'll be more careful with that framing.
I think you should also do a better job of selling the benefit of this. As a data engineer, I can see why it might be useful, but glancing through your README, the dots weren't completely connected for me.
Makes sense. Reviewing the README is on my TODO list. Thanks for the heads-up!