First, if you want to follow along, we'll need to clone the example project. You will need access to a Snowflake, BigQuery, Databricks, or Postgres warehouse for this, for the time being. The project is our classic Jaffle Shop, a simulated chain restaurant serving jaffles and tasty beverages.
```shell
git clone git@github.com:dbt-labs/jaffle-sl-template.git
```
Next, before you start writing code, you need to install MetricFlow as an extension of a dbt adapter from PyPI (dbt Core users only). MetricFlow is compatible with Python versions 3.8 through 3.11.
We'll use pip to install MetricFlow and our dbt adapter:
```shell
# activate a virtual environment for your project,
# if you don't have a name you like to use we suggest .venv
python -m venv [virtual environment name]
source [virtual environment name]/bin/activate

# install dbt and MetricFlow
python -m pip install "dbt-metricflow[adapter name]"
# e.g. python -m pip install "dbt-metricflow[snowflake]"
```
Lastly, to get to the pre-Semantic Layer starting state, check out the `start-here` branch:

```shell
git checkout start-here
```
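Before moving on, it can help to confirm the tooling installed correctly and that dbt can reach your warehouse. A minimal sanity check, assuming you've already configured a `profiles.yml` connection for this project, might look like:

```shell
# confirm the CLIs are available inside the virtual environment
dbt --version
mf --help

# verify your project setup and warehouse connection
# (assumes a configured profiles.yml for this project)
dbt debug
```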
- 💻 This package will install both `dbt` and `mf` as CLIs in our virtual environment. All the regular `dbt` commands are available.
- 🔍 A less common one that will come in handy with the Semantic Layer is `dbt parse`. This will parse your project and generate a **semantic manifest**, a representation of meaningful connections described by your project. This file gives MetricFlow a state of the world from which to generate queries.
- 🧰 In addition to `dbt`, you'll have access to `mf` commands like `query` and `validate-configs`, which operate based on that semantic manifest. We'll dig more into all of these as we go along.
- 🛠️ Let's start off by running a `dbt build` to get the starting state of our project built.
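As a sketch of how the commands above fit together once the project is built (the metric and dimension names here are illustrative, not part of the starting project):

```shell
# build the starting state of the project
dbt build

# regenerate the semantic manifest after changing semantic models or metrics
dbt parse

# check your semantic configurations against that manifest
mf validate-configs

# query a metric grouped by a dimension
# (metric and dimension names are hypothetical examples)
mf query --metrics revenue --group-by metric_time
```

The `mf` commands read the semantic manifest that `dbt parse` writes, which is why re-parsing after semantic changes matters before validating or querying.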