About running dbt in production

Running dbt in production means setting up a system to run a dbt job on a schedule, rather than running dbt commands manually from the command line. These production dbt jobs should create the tables and views that your business intelligence tools and end users query. Before continuing, make sure you understand dbt's approach to managing environments.

dbt commands in production

We've written a guide for the dbt commands we run in production, over on Discourse.

In addition to setting up a schedule, there are other considerations when setting up dbt to run in production:

  • The complexity involved in creating a new dbt job, or editing an existing one.
  • Setting up notifications if a step within your job returns an error code (e.g. a model cannot be built, or a test fails).
  • Accessing logs to help debug any issues.
  • Pulling the latest version of your git repo before running dbt (i.e. continuous deployment).
  • Running your dbt project before merging code into master (i.e. continuous integration).
  • Allowing access for team members who need to collaborate on your dbt project.

Ways to run dbt in production

If you don't want to run dbt commands manually on the command line, you can use dbt Cloud, Airflow, Prefect, Dagster, an automation server, or cron to run dbt jobs on a schedule.

dbt Cloud

We've built dbt Cloud to empower data teams to easily run dbt in production. If you're interested in trying out dbt Cloud, you can sign up for an account.

dbt Cloud enables you to:

  • run your jobs on a schedule
  • view logs for any historical invocation of dbt
  • configure error notifications
  • render your project's documentation

Airflow

If your organization is using Airflow, there are a number of ways you can run your dbt jobs, including:

  • Installing the dbt Cloud Provider to orchestrate dbt Cloud jobs. This package contains multiple Hooks, Operators, and Sensors to complete various actions within dbt Cloud. See an example Airflow DAG to get started!
[Image: Airflow DAG using DbtCloudRunJobOperator]

[Image: dbt Cloud job triggered by Airflow]

  • Invoking dbt Core jobs through the BashOperator. In this case, be sure to install dbt into a virtual environment to avoid issues with conflicting dependencies between Airflow and dbt.

For more details on both of these methods, including example implementations, check out this guide.
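
As a rough sketch of what these two approaches can look like side by side (not taken from the guide), assuming Airflow 2.x with the apache-airflow-providers-dbt-cloud package installed; the connection ID, job ID, schedule, and paths below are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator

with DAG(
    dag_id="dbt_production_run",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Option 1: trigger an existing dbt Cloud job.
    # Assumes an Airflow connection named "dbt_cloud_default" and a job ID of 12345.
    trigger_dbt_cloud_job = DbtCloudRunJobOperator(
        task_id="trigger_dbt_cloud_job",
        dbt_cloud_conn_id="dbt_cloud_default",
        job_id=12345,
        wait_for_termination=True,
        check_interval=60,
    )

    # Option 2: invoke dbt Core directly; dbt is installed in its own
    # virtual environment so its dependencies don't conflict with Airflow's.
    run_dbt_core = BashOperator(
        task_id="run_dbt_core",
        bash_command="/opt/dbt-venv/bin/dbt run --project-dir /opt/analytics/dbt",
    )
```

Note that the DbtCloudRunJobOperator task only triggers and monitors a job you have already defined in dbt Cloud, while the BashOperator approach runs dbt Core on the Airflow worker itself.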

Prefect

If your organization is using Prefect, use the DbtShellTask to schedule, execute and monitor your dbt runs.

Alternatively, you can use the supported ShellTask to execute dbt commands through the shell.

You can also trigger dbt Cloud jobs with the DbtCloudRunJob task. Running this task will generate a markdown artifact viewable in the Prefect UI. The artifact will contain links to the dbt artifacts generated as a result of the job run.
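
A minimal sketch of the DbtShellTask approach, assuming Prefect 1.x (where the task library lives at prefect.tasks.dbt); the profile name, target, and project path are placeholders:

```python
from prefect import Flow
from prefect.tasks.dbt import DbtShellTask

# Shells out to dbt; expects an existing profiles.yml in profiles_dir.
dbt_task = DbtShellTask(
    profile_name="analytics",
    environment="prod",
    profiles_dir=".",
    overwrite_profiles=False,
    helper_script="cd analytics",  # cd into the dbt project before each command
)

with Flow("dbt-nightly-run") as flow:
    run = dbt_task(command="dbt run")
    test = dbt_task(command="dbt test")
    test.set_upstream(run)  # only test after the run completes

# flow.run()  # or register the flow with a Prefect backend to run it on a schedule
```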

Dagster

If your organization is using Dagster, you can use the dagster_dbt library to integrate dbt commands into your pipelines. This library supports the execution of dbt through dbt Cloud, the dbt CLI, and the dbt RPC server. Running dbt from Dagster automatically aggregates metadata about your dbt runs. Check out the example pipeline for details.
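
A minimal sketch of the dbt CLI path, assuming a dagster_dbt version that exposes dbt_cli_resource and dbt_run_op; the project and profiles paths are placeholders:

```python
from dagster import job
from dagster_dbt import dbt_cli_resource, dbt_run_op

# Point the dbt CLI resource at your project (paths are placeholders).
my_dbt = dbt_cli_resource.configured(
    {"project_dir": "path/to/dbt_project", "profiles_dir": "path/to/profiles"}
)

@job(resource_defs={"dbt": my_dbt})
def nightly_dbt_job():
    # Runs `dbt run` via the configured resource; Dagster records metadata
    # about the invocation in its event log.
    dbt_run_op()

# nightly_dbt_job.execute_in_process()  # local execution; attach a schedule in production
```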

Automation servers

Automation servers, like CodeDeploy, GitLab CI/CD (video), Bamboo, and Jenkins, can be used to schedule bash commands for dbt. They also provide a UI to view command-line logs and integrate with your git repository.

Cron

Cron is a decent way to schedule bash commands. However, while it may seem like an easy route to schedule a job, writing the code to handle all of the additional features associated with a production deployment (pulling the latest code, logging, and failure notifications) often makes this route more complex than the other options listed here.
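
To make that concrete, here's a rough sketch (not from the dbt docs) of the kind of wrapper script a cron entry might invoke; it has to reimplement the git pull, logging, and failure notifications that the options above provide out of the box. Every path, the branch name, and the notification hook are placeholders:

```python
#!/usr/bin/env python3
"""Wrapper script for a cron entry such as:

    0 6 * * * /opt/dbt-venv/bin/python /opt/analytics/run_dbt.py >> /var/log/dbt/cron.log 2>&1

All paths, the branch name, and the notification hook below are placeholders.
"""
import subprocess
import sys

PROJECT_DIR = "/opt/analytics/dbt"
DBT = "/opt/dbt-venv/bin/dbt"


def run_step(name, cmd):
    print(f"Running {name}: {' '.join(cmd)}", flush=True)
    return subprocess.run(cmd, cwd=PROJECT_DIR).returncode


def notify_failure(name):
    # Placeholder: post to Slack, send an email, page someone, etc.
    print(f"dbt production job failed at step: {name}", file=sys.stderr)


def main():
    steps = [
        ("git pull", ["git", "pull", "origin", "main"]),  # continuous deployment
        ("dbt deps", [DBT, "deps"]),
        ("dbt run", [DBT, "run"]),
        ("dbt test", [DBT, "test"]),
    ]
    for name, cmd in steps:
        if run_step(name, cmd) != 0:
            notify_failure(name)
            sys.exit(1)


if __name__ == "__main__":
    main()
```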