
Service Testing

Overview

Once you're able to manually test that your code change is working as expected, it's important to run existing automated tests, as well as add some new ones. These tests will ensure that:

  • Your code changes do not unexpectedly break other established functionality
  • Your code changes can handle all known edge cases
  • The functionality you're adding will keep working in the future

Although dbt-core works with a number of different databases, you won't need to supply credentials for every one of these databases in your test environment. Instead, you can test most dbt-core code changes with Python and Postgres.

Initial setup

Postgres offers the easiest way to test most dbt-core functionality today: the Postgres tests are the fastest to run and the easiest to set up. To run the Postgres integration tests, you'll need to complete one extra step to set up the test database:

make setup-db

or, alternatively:

docker-compose up -d database
PGHOST=localhost PGUSER=root PGPASSWORD=password PGDATABASE=postgres bash test/setup_db.sh
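
To confirm that the container is up and the test databases were created, you can connect with psql. This is just a sanity check; it assumes the Postgres client is installed locally and that the credentials above are unchanged:

# list the databases created by test/setup_db.sh
PGPASSWORD=password psql -h localhost -U root -d postgres -c '\l'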

Test commands

There are a few methods for running tests locally.

Makefile

There are multiple targets in the Makefile to run common test suites and code checks, most notably:

# Runs unit tests with py38 and code checks in parallel.
make test
# Runs postgres integration tests with py38 in "fail fast" mode.
make integration

These make targets assume you have a local installation of a recent version of tox for unit/integration testing and pre-commit for code quality checks, unless you choose to use a Docker container to run tests. Run make help for more info.
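
If you don't have tox installed yet, a minimal setup looks like the following (assuming an active Python virtual environment; pre-commit itself is installed by make dev, described below):

# install tox into the active virtual environment
python3 -m pip install --upgrade pip
python3 -m pip install tox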

Check out the other targets in the Makefile to see other commonly used test suites.
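
If you'd rather not install these tools locally, the same targets can run inside a Docker container. The USE_DOCKER variable below is an assumption based on recent dbt-core Makefiles; confirm it with make help before relying on it:

# run unit tests and code checks inside Docker (USE_DOCKER is assumed; verify with make help)
make test USE_DOCKER=true
# run Postgres integration tests inside Docker
make integration USE_DOCKER=true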

pre-commit

pre-commit takes care of running all code checks for formatting and linting. Run make dev to install pre-commit in your local environment (we recommend running this command with a Python virtual environment active). This command installs several pip executables, including black, mypy, and flake8. Once this is done, you can use any of the linter-based make targets, as well as a git pre-commit hook that ensures proper formatting and linting.
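
You can also invoke the hooks by hand, without making a commit. These are standard pre-commit commands; the hook id in the second example assumes the repository's .pre-commit-config.yaml names its hooks after the executables listed above:

# run every configured hook against all files in the repo
pre-commit run --all-files
# run a single hook (here, black) against all files
pre-commit run black --all-files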

pytest

Finally, you can also run a specific test or group of tests using pytest directly. With a virtualenv active and dev dependencies installed, you can do things like:

# run all unit tests in a file
python3 -m pytest test/unit/test_graph.py
# run a specific unit test
python3 -m pytest test/unit/test_graph.py::GraphTest::test__dependency_list
# run specific Postgres integration tests (old way)
python3 -m pytest -m profile_postgres test/integration/074_postgres_unlogged_table_tests
# run specific Postgres integration tests (new way)
python3 -m pytest tests/functional/sources

See pytest usage docs for an overview of useful command-line options.
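
For example, a few standard pytest flags that come in handy during development:

# stop at the first failure and show verbose output
python3 -m pytest -x -v tests/functional/sources
# run only tests whose names match an expression
python3 -m pytest -k "dependency" test/unit/test_graph.py
# drop into the debugger when a test fails
python3 -m pytest --pdb test/unit/test_graph.py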