Commit 67ef54b

Fix make test for running integration tests locally (#591)

* Remove venv from CircleCI
* Utilize latest minor version of Python 3.9 for CircleCI (rather than pinned patch version)
* Align local environment variables with CircleCI
* Ignore changes related to running integration tests
* Move the make file to the project root
* Refactor make commands to run integration tests
* Update instructions for running tests
* Implementation guidelines
* Switch order of testing all models vs. a single model in the instructions

1 parent 471a838 · commit 67ef54b

13 files changed: +175 −112 lines

.circleci/config.yml

+8 −4

```diff
@@ -5,7 +5,7 @@ jobs:
 
   integration-postgres:
     docker:
-      - image: cimg/python:3.9.9
+      - image: cimg/python:3.9
       - image: cimg/postgres:9.6
         environment:
           POSTGRES_USER: root
@@ -18,6 +18,7 @@ jobs:
 
     steps:
       - checkout
+      - run: pip install --pre dbt-postgres -r dev-requirements.txt
       - run:
           name: "Run Functional Tests - Postgres"
           command: ./run_functional_test.sh postgres
@@ -29,9 +30,10 @@ jobs:
 
   integration-redshift:
     docker:
-      - image: cimg/python:3.9.9
+      - image: cimg/python:3.9
     steps:
       - checkout
+      - run: pip install --pre dbt-redshift -r dev-requirements.txt
       - run:
           name: "Run Functional Tests - Redshift"
           command: ./run_functional_test.sh redshift
@@ -43,9 +45,10 @@ jobs:
 
   integration-snowflake:
     docker:
-      - image: cimg/python:3.9.9
+      - image: cimg/python:3.9
     steps:
       - checkout
+      - run: pip install --pre dbt-snowflake -r dev-requirements.txt
       - run:
           name: "Run Functional Tests - Snowflake"
           command: ./run_functional_test.sh snowflake
@@ -59,9 +62,10 @@ jobs:
     environment:
       BIGQUERY_SERVICE_KEY_PATH: "/home/circleci/bigquery-service-key.json"
     docker:
-      - image: cimg/python:3.9.9
+      - image: cimg/python:3.9
     steps:
       - checkout
+      - run: pip install --pre dbt-bigquery -r dev-requirements.txt
       - run:
           name: "Set up credentials"
           command: echo $BIGQUERY_SERVICE_ACCOUNT_JSON > ${HOME}/bigquery-service-key.json
```
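The commit moves adapter installation out of `run_functional_test.sh` and into an explicit per-job CI step, one dbt adapter package per integration job. A minimal sketch of that naming pattern — the loop below mirrors the four CI jobs, but only echoes the install command, since the real `pip install` needs network access and is not assumed to run here:

```shell
#!/bin/bash
# Sketch: each CI job installs exactly one adapter package, dbt-<target>.
# The pip command is echoed rather than executed in this illustration.
for target in postgres redshift snowflake bigquery; do
  echo "pip install --pre dbt-${target} -r dev-requirements.txt"
done
```

Running the same command locally (with a real `pip`) reproduces what CI does before invoking `./run_functional_test.sh <target>`.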

CONTRIBUTING.md

+11 −14

````diff
@@ -3,11 +3,12 @@
 `dbt-utils` is open source software. It is what it is today because community members have opened issues, provided feedback, and [contributed to the knowledge loop](https://www.getdbt.com/dbt-labs/values/). Whether you are a seasoned open source contributor or a first-time committer, we welcome and encourage you to contribute code, documentation, ideas, or problem statements to this project.
 
 1. [About this document](#about-this-document)
-2. [Getting the code](#getting-the-code)
-3. [Setting up an environment](#setting-up-an-environment)
-4. [Testing dbt-utils](#testing)
-5. [Adding CHANGELOG Entry](#adding-changelog-entry)
-6. [Submitting a Pull Request](#submitting-a-pull-request)
+1. [Getting the code](#getting-the-code)
+1. [Setting up an environment](#setting-up-an-environment)
+1. [Implementation guidelines](#implementation-guidelines)
+1. [Testing dbt-utils](#testing)
+1. [Adding CHANGELOG Entry](#adding-changelog-entry)
+1. [Submitting a Pull Request](#submitting-a-pull-request)
 
 ## About this document
 
@@ -52,16 +53,12 @@ These are the tools used in `dbt-utils` development and testing:
 
 A deep understanding of these tools is not required to effectively contribute to `dbt-utils`, but we recommend checking out the attached documentation if you're interested in learning more about each one.
 
-#### Virtual environments
+## Implementation guidelines
 
-We strongly recommend using virtual environments when developing code in `dbt-utils`. We recommend creating this virtualenv
-in the root of the `dbt-utils` repository. To create a new virtualenv, run:
-```sh
-python3 -m venv env
-source env/bin/activate
-```
-
-This will create and activate a new Python virtual environment.
+Ensure that changes will work on "non-core" adapters by:
+- dispatching any new macro(s) so non-core adapters can also use them (e.g. [the `star()` source](https://github.com/fishtown-analytics/dbt-utils/blob/master/macros/sql/star.sql))
+- using the `limit_zero()` macro in place of the literal string: `limit 0`
+- using `dbt_utils.type_*` macros instead of explicit datatypes (e.g. `dbt_utils.type_timestamp()` instead of `TIMESTAMP`)
 
 ## Testing
````
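The first guideline refers to dbt's `adapter.dispatch` mechanism, which lets adapter packages override a macro's default implementation. A minimal sketch of the pattern, modeled on the linked `star()` source — the macro name `my_macro` is purely illustrative, and the exact dispatch signature depends on your dbt version:

```sql
{# Entry point: resolves to an adapter-specific implementation
   (e.g. spark__my_macro) if one exists, otherwise falls back to
   default__my_macro below. #}
{% macro my_macro(column) %}
  {{ return(adapter.dispatch('my_macro', 'dbt_utils')(column)) }}
{% endmacro %}

{# Default implementation, used on core adapters and any adapter
   that does not ship an override. Note the type_* macro instead of
   a hard-coded TIMESTAMP, per the guidelines above. #}
{% macro default__my_macro(column) %}
    cast({{ column }} as {{ dbt_utils.type_timestamp() }})
{% endmacro %}
```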

Makefile

+24

```diff
@@ -0,0 +1,24 @@
+.DEFAULT_GOAL:=help
+
+.PHONY: test
+test: ## Run the integration tests.
+	@./run_test.sh $(target) $(models) $(seeds)
+
+.PHONY: dev
+dev: ## Installs dbt-* packages in develop mode along with development dependencies.
+	@\
+	echo "Install dbt-$(target)..."; \
+	pip install --upgrade pip setuptools; \
+	pip install --pre "dbt-$(target)" -r dev-requirements.txt;
+
+.PHONY: setup-db
+setup-db: ## Setup Postgres database with docker-compose for system testing.
+	@\
+	docker-compose up --detach postgres
+
+.PHONY: help
+help: ## Show this help message.
+	@echo 'usage: make [target]'
+	@echo
+	@echo 'targets:'
+	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
```
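The `help` target is self-documenting: it scans `$(MAKEFILE_LIST)` for lines shaped like `target: ## description` and prints the two halves. A rough illustration of that filter-and-split idea in plain shell — the sample text stands in for the Makefile's contents, and `sed` stands in for the awk formatting:

```shell
#!/bin/bash
# Stand-in for the contents of $(MAKEFILE_LIST): only lines shaped like
# "target: ## description" should survive the filter (.PHONY does not).
sample='test: ## Run the integration tests.
dev: ## Installs dbt-* packages in develop mode along with development dependencies.
.PHONY: test'

# Same filtering idea as the help target, with sed replacing awk's printf.
printf '%s\n' "$sample" | grep -E '^[a-zA-Z_-]+:.*## ' | sed -E 's/:[^#]*## /: /'
# prints:
#   test: Run the integration tests.
#   dev: Installs dbt-* packages in develop mode along with development dependencies.
```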

docker-compose.yml

+3 −27

```diff
@@ -1,32 +1,8 @@
 version: "3.7"
 services:
-
-  dbt:
-    image: circleci/python:3.6.3-stretch
-    depends_on:
-      - ${TARGET}
-    env_file: "./integration_tests/.env/${TARGET}.env"
-    entrypoint: "/repo/run_test.sh ${TARGET} ${MODELS} ${SEEDS}"
-    working_dir: /repo
-    volumes:
-      - ".:/repo"
-
   postgres:
-    image: circleci/postgres:9.6.5-alpine-ram
+    image: cimg/postgres:9.6
+    environment:
+      - POSTGRES_USER=root
     ports:
       - "5432:5432"
-
-  # dummy container, since snowflake is a managed service
-  snowflake:
-    image: circleci/python:3.6.3-stretch
-    entrypoint: "/bin/true"
-
-  # dummy container, since bigquery is a managed service
-  bigquery:
-    image: circleci/python:3.6.3-stretch
-    entrypoint: "/bin/true"
-
-  # dummy container, since redshift is a managed service
-  redshift:
-    image: circleci/python:3.6.3-stretch
-    entrypoint: "/bin/true"
```
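The slimmed-down compose file starts Postgres detached but does not wait for it to accept connections before tests run. If startup races become a problem locally, a healthcheck could be added to the `postgres` service — a sketch only, not part of this commit, with arbitrary interval values:

```yaml
  postgres:
    image: cimg/postgres:9.6
    environment:
      - POSTGRES_USER=root
    ports:
      - "5432:5432"
    # Hypothetical addition: lets tooling (or a manual loop polling
    # `docker inspect`) detect when the database is actually ready.
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U root"]
      interval: 5s
      timeout: 3s
      retries: 10
```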

integration_tests/.env/bigquery.env

+2 −1

```diff
@@ -1 +1,2 @@
-GCLOUD_SERVICE_KEY_PATH=
+BIGQUERY_SERVICE_KEY_PATH=
+BIGQUERY_TEST_DATABASE=
```

integration_tests/.env/postgres.env

+5 −5

```diff
@@ -1,5 +1,5 @@
-CI_DBT_HOST=postgres
-CI_DBT_USER=root
-CI_DBT_PASS=''
-CI_DBT_PORT=5432
-CI_DBT_DBNAME=circle_test
+POSTGRES_TEST_HOST=localhost
+POSTGRES_TEST_USER=root
+POSTGRES_TEST_PASS=''
+POSTGRES_TEST_PORT=5432
+POSTGRES_TEST_DBNAME=circle_test
```

integration_tests/.env/redshift.env

+5 −4

```diff
@@ -1,4 +1,5 @@
-CI_REDSHIFT_DBT_HOST=
-CI_REDSHIFT_DBT_USER=
-CI_REDSHIFT_DBT_PASS=
-CI_REDSHIFT_DBT_DBNAME=
+REDSHIFT_TEST_HOST=
+REDSHIFT_TEST_USER=
+REDSHIFT_TEST_PASS=
+REDSHIFT_TEST_DBNAME=
+REDSHIFT_TEST_PORT=
```

integration_tests/.env/snowflake.env

+6 −6

```diff
@@ -1,6 +1,6 @@
-CI_SNOWFLAKE_DBT_ACCOUNT=
-CI_SNOWFLAKE_DBT_USER=
-CI_SNOWFLAKE_DBT_PASS=
-CI_SNOWFLAKE_DBT_ROLE=
-CI_SNOWFLAKE_DBT_DATABASE=
-CI_SNOWFLAKE_DBT_WAREHOUSE=
+SNOWFLAKE_TEST_ACCOUNT=
+SNOWFLAKE_TEST_USER=
+SNOWFLAKE_TEST_PASSWORD=
+SNOWFLAKE_TEST_ROLE=
+SNOWFLAKE_TEST_DATABASE=
+SNOWFLAKE_TEST_WAREHOUSE=
```
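The renamed `*_TEST_*` variables are what the functional tests read from the environment, which is why the instructions load them with `set -a; source …; set +a` rather than a plain `source`: `set -a` auto-exports everything assigned while it is in effect, so child processes (like pytest) inherit the values. A small self-contained demonstration — the temp file stands in for `integration_tests/.env/postgres.env`:

```shell
#!/bin/bash
# Stand-in env file for integration_tests/.env/postgres.env.
envfile=$(mktemp)
cat > "$envfile" <<'EOF'
POSTGRES_TEST_HOST=localhost
POSTGRES_TEST_PORT=5432
EOF

# `set -a` marks every variable assigned during the source for export.
set -a
source "$envfile"
set +a

# Because of `set -a`, a child process sees the variables too.
sh -c 'echo "host=$POSTGRES_TEST_HOST port=$POSTGRES_TEST_PORT"'
# prints: host=localhost port=5432

rm -f "$envfile"
```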

integration_tests/.gitignore

+2

```diff
@@ -2,3 +2,5 @@
 target/
 dbt_modules/
 logs/
+.env/
+profiles.yml
```

integration_tests/Makefile

-14
This file was deleted.

integration_tests/README.md

+100 −15

````diff
@@ -1,23 +1,90 @@
-### Run the integration tests
+### Overview
+1. Prerequisites
+1. Configure credentials
+1. Setup Postgres (optional)
+1. Setup virtual environment
+1. Installation for development
+1. Run the integration tests
+1. Run tests
+1. Creating a new integration test
+
+### Prerequisites
+- python3
+- Docker
+
+### Configure credentials
+Edit the env file for your TARGET in `integration_tests/.env/[TARGET].env`.
+
+Load the environment variables:
+```shell
+set -a; source integration_tests/.env/[TARGET].env; set +a
+```
 
-To run the integration tests on your local machine, like they will get run in the CI (using CircleCI), you can do the following:
+or, more specifically:
+```shell
+set -a; source integration_tests/.env/postgres.env; set +a
+```
 
-Assuming you are in the `integration_tests` folder,
+#### Setup Postgres (optional)
 
-```bash
-make test target=[postgres|redshift|...] [models=...] [seeds=...]
+Docker and `docker-compose` are both used in testing. Specific instructions for your OS can be found [here](https://docs.docker.com/get-docker/).
+
+Postgres offers the easiest way to test most `dbt-utils` functionality today. Its tests are the fastest to run, and the easiest to set up. To run the Postgres integration tests, you'll have to do one extra step of setting up the test database:
+
+```shell
+make setup-db
+```
+or, alternatively:
+```shell
+docker-compose up --detach postgres
+```
+
+### Setup virtual environment
+
+We strongly recommend using virtual environments when developing code in `dbt-utils`. We recommend creating this virtualenv
+in the root of the `dbt-utils` repository. To create a new virtualenv, run:
+```shell
+python3 -m venv env
+source env/bin/activate
+```
+
+This will create and activate a new Python virtual environment.
+
+### Installation for development
+
+First make sure that you set up your virtual environment as described above. Also ensure you have the latest version of pip installed with `pip install --upgrade pip`. Next, install `dbt-core` (and its dependencies) with:
+
+```shell
+make dev target=[postgres|redshift|...]
+# or
+pip install --pre dbt-[postgres|redshift|...] -r dev-requirements.txt
 ```
 
-or more specific:
+or, more specifically:
 
-```bash
-make test target=postgres models=sql.test_star seeds=sql.data_star
+```shell
+make dev target=postgres
+# or
+pip install --pre dbt-postgres -r dev-requirements.txt
 ```
 
-or, to test against all targets:
+### Run the integration tests
+
+To run all the integration tests on your local machine like they will get run in the CI (using CircleCI):
+
+```shell
+make test target=postgres
+```
 
-```bash
-make test-all [models=...] [seeds=...]
+or, to run tests for a single model:
+```shell
+make test target=[postgres|redshift|...] [models=...] [seeds=...]
+```
+
+or, more specifically:
+
+```shell
+make test target=postgres models=sql.test_star seeds=sql.data_star
 ```
 
 Specifying `models=` and `seeds=` is optional, however _if_ you specify `seeds`, you have to specify `models` too.
@@ -26,6 +93,17 @@ Where possible, targets are being run in docker containers (this works for Postg
 
 ### Creating a new integration test
 
+#### Set up profiles
+Do either one of the following:
+1. Use `DBT_PROFILES_DIR`
+```shell
+cp integration_tests/ci/sample.profiles.yml integration_tests/profiles.yml
+export DBT_PROFILES_DIR=$(cd integration_tests && pwd)
+```
+2. Use `~/.dbt/profiles.yml`
+   - Copy contents from `integration_tests/ci/sample.profiles.yml` into `~/.dbt/profiles.yml`.
+
+#### Add your integration test
 This directory contains an example dbt project which tests the macros in the `dbt-utils` package. An integration test typically involves making 1) a new seed file 2) a new model file 3) a generic test to assert anticipated behaviour.
 
 For an example integration test, check out the tests for the `get_url_parameter` macro:
@@ -35,13 +113,20 @@ For an example integration test, check out the tests for the `get_url_parameter`
 3. [Model to test the macro](https://github.com/fishtown-analytics/dbt-utils/blob/master/integration_tests/models/web/test_urls.sql)
 4. [A generic test to assert the macro works as expected](https://github.com/fishtown-analytics/dbt-utils/blob/master/integration_tests/models/web/schema.yml#L2)
 
-
 Once you've added all of these files, you should be able to run:
+
+Assuming you are in the `integration_tests` folder,
+```shell
+dbt deps --target {your_target}
+dbt seed --target {your_target}
+dbt run --target {your_target} --model {your_model_name}
+dbt test --target {your_target} --model {your_model_name}
 ```
-$ dbt deps
-$ dbt seed
-$ dbt run --model {your_model_name}
-$ dbt test --model {your_model_name}
+
+Alternatively:
+```shell
+dbt deps --target {your_target}
+dbt build --target {your_target} --select +{your_model_name}
 ```
 
 If the tests all pass, then you're good to go! All tests will be run automatically when you create a PR against this repo.
````

run_functional_test.sh

−10

```diff
@@ -1,13 +1,3 @@
 #!/bin/bash
-VENV="venv/bin/activate"
 
-if [[ ! -f $VENV ]]; then
-  python3 -m venv venv
-  . $VENV
-
-  pip install --upgrade pip setuptools
-  pip install --pre "dbt-$1" -r dev-requirements.txt
-fi
-
-. $VENV
 python3 -m pytest tests/functional --profile $1
```
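With the venv bootstrap removed, the script reduces to a one-line pytest wrapper; installing the adapter is now the caller's job (`make dev` locally, or the new install step in CI). A sketch of the resulting call shape — the function below only echoes the command, since pytest and the dbt adapters are not assumed to be installed here:

```shell
#!/bin/bash
# Echo (rather than run) the command run_functional_test.sh now executes.
# "$1" in the real script is the adapter/profile name passed by CI.
run_functional_test() {
  echo "python3 -m pytest tests/functional --profile $1"
}

run_functional_test postgres
# prints: python3 -m pytest tests/functional --profile postgres
run_functional_test redshift
# prints: python3 -m pytest tests/functional --profile redshift
```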
