My prediction a few years back was that Materialize (or similar tech) would magically solve this: data teams could operate in terms of pure views and let the database engine differentiate their SQL and work out how to apply incremental (ideally streaming) updates through the view stack. While I'm in an adjacent space, I don't do this day-to-day, so I'm not quite sure what's holding back adoption here. Maybe in a few more years we'll get there.
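For what it's worth, that model looks roughly like this in SQL: you declare an ordinary view over a source table and the engine keeps the result incrementally up to date as rows arrive, instead of recomputing the aggregate on every query. A sketch (table and column names here are made up for illustration):

```sql
-- Hypothetical source table of raw order events.
-- With incremental view maintenance, the engine maintains this
-- aggregate as new rows land, rather than re-running the full
-- GROUP BY each time the view is read.
CREATE MATERIALIZED VIEW daily_revenue AS
SELECT
    order_date,
    sum(amount) AS revenue
FROM orders
GROUP BY order_date;
```

The promise is that you stack views like this on top of each other and never hand-write incremental update logic yourself.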
You can read some case studies here: https://tobikodata.com/harness.html, or join their Slack to meet folks and learn more about their experiences.
Dbt introduced a language for managing these “metrics” at scale, including the ability to use variables and more complex templates (Jinja).
Then you do dbt run (https://docs.getdbt.com/reference/commands/run) and, kapow, the metric is populated in your database.
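As a sketch, a dbt model for such a metric is just a SQL file with Jinja in it (the model, table, and variable names here are invented for illustration):

```sql
-- models/monthly_active_users.sql (hypothetical model name)
-- {{ ref(...) }} and {{ var(...) }} are dbt's Jinja helpers:
-- ref() resolves another model to its warehouse table name and
-- records the dependency; var() injects a project variable at
-- compile time.
SELECT
    date_trunc('month', event_time) AS month,
    count(DISTINCT user_id)         AS monthly_active_users
FROM {{ ref('events') }}
WHERE event_time >= '{{ var("start_date") }}'
GROUP BY 1
```

dbt run compiles the Jinja down to plain SQL and materializes the result as a table or view in your warehouse.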
More broadly, dbt did two other things: 1. It pushed the paradigm from ETL to ELT (so you stick all the data in your warehouse and then transform it there, rather than transforming it at extraction time). 2. It created the concept of an “analytics engineer” (previously known as the guy who knows SQL, or the business analyst).
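Concretely, ELT means landing the source data untouched and doing all the reshaping inside the warehouse afterwards. A sketch, with made-up table and schema names:

```sql
-- Extract + Load: dump the source rows as-is into a raw schema,
-- with no cleanup at ingestion time.
CREATE TABLE raw.orders AS
SELECT * FROM external_source_orders;

-- Transform: typing and cleaning happen later, in the warehouse,
-- as plain SQL. This is the layer dbt manages.
CREATE VIEW analytics.orders AS
SELECT
    cast(id AS integer)      AS order_id,
    cast(amount AS numeric)  AS amount,
    cast(created_at AS date) AS order_date
FROM raw.orders
WHERE id IS NOT NULL;
```

The shift matters because the transform step becomes versionable SQL in the warehouse instead of logic buried in an extraction pipeline.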
Then it provides additional tooling around that: GUIs, governance, everything your average large corporate asks for.