👷 Setup GitHub Actions Continuous Integration tests #5

Merged 5 commits on May 11, 2023
44 changes: 44 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,44 @@
# This workflow will install Python dependencies and run tests on a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: Test
on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]
  workflow_dispatch:

permissions:
  contents: read

jobs:
  test:
    name: ${{ matrix.os }} - Python ${{ matrix.python-version }}
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.11"]
        os: [ubuntu-22.04]
    defaults:
      run:
        shell: bash -l {0}

    steps:
      # Checkout current git repository
      - name: Checkout
        uses: actions/checkout@8e5e7e5ab8b370d6c329ec480221332ada57f0ab # v3.5.2

      # Install Micromamba with conda-forge dependencies
      - name: Setup Micromamba
        uses: mamba-org/setup-micromamba@1887e3afc05fd11eb40c9542c2939ac04234546e # v1.2.1
        with:
          environment-name: chabud
          environment-file: conda-lock.yml

      # Run the unit tests
      - name: Test with pytest
        run: |
          micromamba install python=${{ matrix.python-version }} pytest
          python -m pytest --verbose chabud/tests/
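The `strategy.matrix` block above expands into one job per combination of its axes. Currently that is just ubuntu-22.04 × Python 3.11, but adding entries to either list grows the job count multiplicatively. A minimal sketch of how the expansion works (the `matrix` dict mirrors the YAML above; the expansion code itself is illustrative, not GitHub's implementation):

```python
from itertools import product

# Same axes as the workflow's strategy.matrix block
matrix = {"python-version": ["3.11"], "os": ["ubuntu-22.04"]}

# One job per element of the cartesian product of all axes
jobs = [dict(zip(matrix, combo)) for combo in product(*matrix.values())]
print(jobs)  # [{'python-version': '3.11', 'os': 'ubuntu-22.04'}]
```

Adding, say, `"3.12"` to `python-version` and `macos-13` to `os` would yield four jobs instead of one, each running the same checkout/setup/test steps.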
6 changes: 6 additions & 0 deletions .gitignore
@@ -1,3 +1,9 @@
# Byte-compiled / optimized / DLL files
__pycache__/

# Data files and folders
data/**
!data/**/

# Unit test / coverage reports
.pytest_cache/
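The two `data/` rules above pair a broad ignore with a negation: `data/**` ignores everything under `data/`, while `!data/**/` re-includes the directories themselves, so the folder tree (but not the data files inside it) can be tracked. A simplified, illustrative sketch of the last-match-wins evaluation git uses for such rules (real gitignore matching is more involved; `is_ignored` and `RULES` are names invented here):

```python
from fnmatch import fnmatch

# The two data/ rules from the .gitignore above
RULES = ["data/**", "!data/**/"]

def is_ignored(path: str, is_dir: bool) -> bool:
    """Simplified gitignore check: last matching rule wins,
    '!' negates, a trailing '/' restricts the rule to directories."""
    ignored = False
    for rule in RULES:
        negate = rule.startswith("!")
        pattern = rule.lstrip("!")
        dir_only = pattern.endswith("/")
        if dir_only and not is_dir:
            continue  # directory-only rules never match plain files
        if fnmatch(path + ("/" if dir_only else ""), pattern):
            ignored = not negate
    return ignored

print(is_ignored("data/raw/scene.hdf5", is_dir=False))  # True: file is ignored
print(is_ignored("data/raw", is_dir=True))              # False: directory kept
```

The net effect is that downloaded datasets stay out of version control while the expected directory layout remains visible in the repository.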
33 changes: 33 additions & 0 deletions chabud/tests/test_datapipe.py
@@ -0,0 +1,33 @@
import lightning as L
import torch

from chabud.datapipe import ChaBuDDataPipeModule


# %%
def test_datapipemodule():
    """
    Ensure that ChaBuDDataPipeModule works to load data from a nested HDF5 file
    into torch.Tensor and list objects.
    """
    datamodule: L.LightningDataModule = ChaBuDDataPipeModule(
        hdf5_urls=[
            "https://huggingface.co/datasets/chabud-team/chabud-extra/resolve/main/california_2.hdf5"
        ]
    )
    datamodule.setup()

    it = iter(datamodule.train_dataloader())
    pre_image, post_image, mask, metadata = next(it)

    assert pre_image.shape == (32, 512, 512, 12)
    assert pre_image.dtype == torch.int16

    assert post_image.shape == (32, 512, 512, 12)
    assert post_image.dtype == torch.int16

    assert mask.shape == (32, 512, 512)
    assert mask.dtype == torch.uint8

    assert len(metadata) == 32
    assert set(metadata[0].keys()) == {"filename", "uuid"}
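The assertions above pin down the batch contract the datamodule is expected to honor. Summarized as a sketch (the `TileMetadata` name and the sample values are illustrative, not from the chabud codebase; the interpretation of the trailing 12 as spectral bands is an assumption):

```python
from typing import TypedDict

class TileMetadata(TypedDict):
    # The two keys asserted by the test above
    filename: str
    uuid: str

# Each training batch is a 4-tuple (pre_image, post_image, mask, metadata):
#   pre_image / post_image: int16, shape (32, 512, 512, 12)
#     -> 32 chips of 512x512 pixels, 12 channels (channels-last layout)
#   mask: uint8, shape (32, 512, 512) -> one label map per chip
#   metadata: list of 32 TileMetadata dicts

meta: TileMetadata = {"filename": "california_2.hdf5", "uuid": "0"}
print(sorted(meta.keys()))  # ['filename', 'uuid']
```

Keeping the contract explicit like this makes it easy to spot breaking changes (e.g. a switch to channels-first layout) when the test fails.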