Set up MetricFlow

Getting started

First, if you want to follow along, clone the example project. For now, you'll need access to a Snowflake, BigQuery, Databricks, or Postgres warehouse. The project is our classic Jaffle Shop, a simulated chain restaurant serving jaffles and tasty beverages.

```shell
git clone
cd path/to/project
```

Next, before you start writing code, you need to install MetricFlow:

  • dbt Cloud CLI: MetricFlow commands are embedded in the dbt Cloud CLI, so you can run them as soon as you install it. Using dbt Cloud also means you won't need to manage versioning — your dbt Cloud account handles versioning automatically.

  • dbt Cloud IDE: You can create metrics using MetricFlow in the dbt Cloud IDE; however, support for running MetricFlow commands in the IDE is coming soon.

Now that you're ready to use MetricFlow, get to the pre-Semantic Layer starting state by checking out the start-here branch:

```shell
git checkout start-here
```

For more information, refer to the MetricFlow commands or the quickstart guides to get more familiar with setting up a dbt project.

Basic commands

  • 💻 Installing MetricFlow makes both the dbt and mf CLIs available in your virtual environment. All the regular dbt commands like run, build, and test are available.
  • 🔍 A less common one that will come in handy with the Semantic Layer is dbt parse. This parses your project and generates a semantic manifest, a representation of the meaningful connections described by your project. This file gives MetricFlow a state of the world from which to generate queries.
  • 🧰 In addition to dbt, you'll have access to mf commands like query and validate-configs, which operate based on that semantic manifest. We'll dig more into all of these as we go along.
  • 🛠️ Let's start off by running dbt build to get the starting state of our project built.
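To make the semantic-manifest idea above concrete, here is a small, self-contained Python sketch. The inline JSON is a drastically simplified, hypothetical slice of what a semantic manifest might contain — the real file that `dbt parse` writes (typically `target/semantic_manifest.json`) is much larger and its exact schema may differ, and the `orders`/`order_total` names are made up for illustration.

```python
import json

# Hypothetical, heavily simplified semantic-manifest fragment.
# The real manifest produced by `dbt parse` has a richer schema.
manifest_text = """
{
  "semantic_models": [
    {"name": "orders", "entities": [{"name": "order_id", "type": "primary"}]}
  ],
  "metrics": [
    {"name": "order_total", "type": "simple"}
  ]
}
"""

manifest = json.loads(manifest_text)

# MetricFlow uses this state of the world to resolve a query such as
# `mf query --metrics order_total` into SQL against your warehouse.
metric_names = [m["name"] for m in manifest["metrics"]]
print(metric_names)  # ['order_total']
```

The point is simply that mf commands don't read your models directly; they work from this parsed, structured description of your project, which is why dbt parse must run before MetricFlow can answer queries.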