
Fix make test for running integration tests locally #591

Merged
merged 9 commits into from
Jun 13, 2022
Update instructions for running tests
dbeatty10 committed May 16, 2022
commit a965f72c2b45c8a727de19e64adc9c0a3688e6ea
11 changes: 0 additions & 11 deletions CONTRIBUTING.md
@@ -52,17 +52,6 @@ These are the tools used in `dbt-utils` development and testing:

A deep understanding of these tools is not required to contribute effectively to `dbt-utils`, but we recommend checking out the linked documentation if you're interested in learning more about each one.

#### Virtual environments

We strongly recommend using virtual environments when developing code in `dbt-utils`. We recommend creating this virtualenv
in the root of the `dbt-utils` repository. To create a new virtualenv, run:
```sh
python3 -m venv env
source env/bin/activate
```

This will create and activate a new Python virtual environment.

## Testing

Once you're able to manually test that your code change is working as expected, it's important to run existing automated tests, as well as adding some new ones. These tests will ensure that:
113 changes: 99 additions & 14 deletions integration_tests/README.md
@@ -1,23 +1,90 @@
### Overview
1. Prerequisites
1. Configure credentials
1. Setup Postgres (optional)
1. Setup virtual environment
1. Installation for development
1. Run the integration tests
1. Creating a new integration test

### Prerequisites
- python3
- Docker

### Configure credentials
Edit the env file for your TARGET in `integration_tests/.env/[TARGET].env`.
> **Review comment (dbeatty10, Contributor Author) on lines +15 to +16:** Do we want to consolidate all the `integration_tests/.env/[TARGET].env` files into a single `test.env` file instead?


Load the environment variables:
```shell
set -a; source integration_tests/.env/[TARGET].env; set +a
```

or, more specifically:
```shell
set -a; source integration_tests/.env/postgres.env; set +a
```

> **Review comment (dbeatty10, Contributor Author) on lines +18 to +25:** Is there any more preferable way to store and load environment variables containing secrets? For non-secret project-specific environment variables, I like using direnv.

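To illustrate how the `set -a` trick works, here is a self-contained sketch using a throwaway file; the variable names are illustrative assumptions, not the actual contents of the target env files:

```shell
# Hypothetical env file for illustration only -- the real variable names live
# in integration_tests/.env/[TARGET].env.
cat > /tmp/example.env <<'EOF'
POSTGRES_TEST_HOST=localhost
POSTGRES_TEST_PORT=5432
EOF

# `set -a` marks every variable assigned afterwards for export, so plain
# KEY=VALUE lines in the sourced file become real environment variables
# visible to child processes (like dbt); `set +a` turns that back off.
set -a; source /tmp/example.env; set +a

# The variable is now visible to child processes, not just the current shell:
python3 -c 'import os; print(os.environ["POSTGRES_TEST_HOST"])'  # prints: localhost
```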
### Setup Postgres (optional)

Docker and `docker-compose` are both used in testing. Specific instructions for your OS can be found [here](https://docs.docker.com/get-docker/).

Postgres offers the easiest way to test most `dbt-utils` functionality today. Its tests are the fastest to run, and the easiest to set up. To run the Postgres integration tests, you'll have to do one extra step of setting up the test database:

```shell
make setup-db
```
or, alternatively:
```shell
docker-compose up --detach postgres
```

### Setup virtual environment

We strongly recommend using a virtual environment when developing code in `dbt-utils`, and creating it in the root of the `dbt-utils` repository. To create a new virtualenv, run:
```shell
python3 -m venv env
source env/bin/activate
```

This will create and activate a new Python virtual environment.
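As a quick sanity check that activation worked, you can ask Python which installation it resolves to; `/tmp/demo-env` below is a throwaway path for illustration (the instructions above create one named `env` in the repo root instead):

```shell
# Create and activate a throwaway virtual environment (illustrative path).
python3 -m venv /tmp/demo-env
source /tmp/demo-env/bin/activate

# While a venv is active, sys.prefix points at the venv directory instead of
# the system Python installation, and the activate script sets $VIRTUAL_ENV.
python -c 'import sys; print(sys.prefix)'  # prints the venv path, e.g. /tmp/demo-env
echo "$VIRTUAL_ENV"                        # prints: /tmp/demo-env
```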

### Installation for development

First, make sure you have set up your virtual environment as described above, and ensure you have the latest version of pip installed (`pip install --upgrade pip`). Next, install `dbt-core` (and its dependencies) with:

```shell
make dev target=[postgres|redshift|...]
# or
pip install --pre dbt-[postgres|redshift|...] -r dev-requirements.txt
```

or, more specifically:

```shell
make dev target=postgres
# or
pip install --pre dbt-postgres -r dev-requirements.txt
```

### Run the integration tests

To run the integration tests on your local machine, the same way they will run in CI (via CircleCI), do the following:

```shell
make test target=[postgres|redshift|...] [models=...] [seeds=...]
```

or, more specifically:

```shell
make test target=postgres models=sql.test_star seeds=sql.data_star
```

or, to run all tests:

```shell
make test target=postgres
```

Specifying `models=` and `seeds=` is optional; however, _if_ you specify `seeds`, you must specify `models` too.
@@ -26,6 +93,17 @@ Where possible, targets are being run in docker containers (this works for Postg

### Creating a new integration test

#### Set up profiles
Do one of the following:
1. Use `DBT_PROFILES_DIR`
```shell
cp integration_tests/ci/sample.profiles.yml integration_tests/profiles.yml
export DBT_PROFILES_DIR=$(cd integration_tests && pwd)
```
2. Use `~/.dbt/profiles.yml`
- Copy contents from `integration_tests/ci/sample.profiles.yml` into `~/.dbt/profiles.yml`.
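For reference, a Postgres entry in that profiles file might look roughly like the following; the profile name, environment variable names, and settings are illustrative assumptions, not the actual contents of `sample.profiles.yml`:

```yaml
# Hypothetical sketch only -- check integration_tests/ci/sample.profiles.yml
# for the real values. Secrets are pulled from the environment via env_var(),
# so the file itself contains no credentials.
integration_tests:
  target: postgres
  outputs:
    postgres:
      type: postgres
      host: "{{ env_var('POSTGRES_TEST_HOST') }}"
      user: "{{ env_var('POSTGRES_TEST_USER') }}"
      pass: "{{ env_var('POSTGRES_TEST_PASS') }}"
      port: "{{ env_var('POSTGRES_TEST_PORT') | as_number }}"
      dbname: "{{ env_var('POSTGRES_TEST_DBNAME') }}"
      schema: dbt_utils_integration_tests
      threads: 1
```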
> **Review comment (dbeatty10, Contributor Author) on lines +96 to +104:** When given the choice, I prefer including a `profiles.yml` file within each dbt project and then setting `DBT_PROFILES_DIR`. See the example within `run_test.sh`. Of course, this relies on `profiles.yml` being completely devoid of actual secrets (which seems eminently doable since we can use environment variables instead). Any cautions for me before I rename `integration_tests/ci/sample.profiles.yml` to `integration_tests/profiles.yml`?

#### Add your integration test
This directory contains an example dbt project which tests the macros in the `dbt-utils` package. An integration test typically involves making: 1) a new seed file, 2) a new model file, and 3) a generic test to assert the anticipated behaviour.

For an example integration test, check out the tests for the `get_url_parameter` macro:
@@ -35,13 +113,20 @@ For an example integration tests, check out the tests for the `get_url_parameter
3. [Model to test the macro](https://github.com/fishtown-analytics/dbt-utils/blob/master/integration_tests/models/web/test_urls.sql)
4. [A generic test to assert the macro works as expected](https://github.com/fishtown-analytics/dbt-utils/blob/master/integration_tests/models/web/schema.yml#L2)


Once you've added all of these files, you should be able to run the following (assuming you are in the `integration_tests` folder):

```shell
dbt deps --target {your_target}
dbt seed --target {your_target}
dbt run --target {your_target} --model {your_model_name}
dbt test --target {your_target} --model {your_model_name}
```

Alternatively, `dbt build` runs and tests everything in one pass; the `+` prefix selects the model plus all of its upstream dependencies (such as its seed):
```shell
dbt deps --target {your_target}
dbt build --target {your_target} --select +{your_model_name}
```

If the tests all pass, then you're good to go! All tests will be run automatically when you create a PR against this repo.