Add real world test suite #27

Merged: valeriupredoi merged 13 commits from add_realworld_test into main on Oct 24, 2022

Conversation

@valeriupredoi (Collaborator)

Hey guys @bnlawrence @davidhassell here's a pretty serious test module that looks at:

  • actual complex data with variable groups in one file
  • a netCDF3 (or CLASSIC) file that cannot be opened by h5py (see the sketch at the end of this comment)
  • a native model dataset that has both _FillValue and missing_value
  • I have also added test data to the repo, under tests/test_data

Active() does silly well in all these cases; it's brutal how well it performs! 🍺
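
For anyone trying this locally, here's a minimal sketch of the netCDF3/CLASSIC point above. The file path is hypothetical, but any netCDF3 file behaves the same way, since h5py only understands the HDF5 container format:

    import h5py

    # hypothetical path to a netCDF3 / CLASSIC file in the test data directory
    classic_file = "tests/test_data/classic.nc"

    # a netCDF3 file is not an HDF5 container, so h5py does not recognise it
    print(h5py.is_hdf5(classic_file))  # -> False

    try:
        h5py.File(classic_file, "r")
    except OSError as err:
        # h5py raises OSError, typically because the HDF5 file signature is not found
        print("h5py cannot open it:", err)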

@bnlawrence (Collaborator)

In the process of reviewing this, I realised we needed some really clean test data files for us and those responsible for the storage systems themselves, hence the last commit. Will review the rest of it properly tomorrow.

@bnlawrence (Collaborator)

@valeriupredoi Why do we get warnings from the test_native_emac_model_fails test? E.g.

tests/test_bigger_data.py::test_native_emac_model_fails
  /Users/BNL28/miniconda3/envs/pymb22a/lib/python3.9/site-packages/numpy/ma/core.py:5288: RuntimeWarning: Mean of empty slice.
    result = super().mean(axis=axis, dtype=dtype, **kwargs)[()]

tests/test_bigger_data.py::test_native_emac_model_fails
  /Users/BNL28/miniconda3/envs/pymb22a/lib/python3.9/site-packages/numpy/core/_methods.py:182: RuntimeWarning: invalid value encountered in divide
    ret = um.true_divide(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

Surely the test should fail when it can't open the file, before we get to that point? Or is the data empty anyway and it's failing first?

@valeriupredoi (Collaborator, Author)

@bnlawrence those are warnings that don't trigger a test failure unless I set up pytest to fail on warnings; I don't think there is an actual issue coming from them, to be honest - I think it's just that the data is fully masked in that particular slice. Note that the warnings come from the pure-NumPy block

    ncfile = str(test_data_path / "emac.nc")
    active = Active(ncfile, "aps_ave")
    active._version = 0
    d = active[4:5, 1:2]
    mean_result = np.mean(d)

which we can simply remove, since the scope of the test is to show the failure on a CLASSIC or netCDF3 file - what do you reckon?
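
Just to illustrate where those two RuntimeWarnings come from (a standalone sketch, not part of the test suite): averaging a slice with nothing in it to average - an empty array here; a fully masked slice can end up in the same NumPy code path - warns but does not raise, so pytest only reports it:

    import warnings
    import numpy as np

    # capture the warnings that np.mean emits for an empty slice
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        result = np.mean(np.array([]))  # -> nan, plus two RuntimeWarnings

    for w in caught:
        print(w.category.__name__, w.message)
    # RuntimeWarning Mean of empty slice.
    # RuntimeWarning invalid value encountered in divide
    #   (or "double_scalars", depending on the numpy version)

For reference, pytest can be told to escalate these into failures with, e.g., `filterwarnings = error::RuntimeWarning` in its ini options.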

@valeriupredoi (Collaborator, Author)

> In the process of reviewing this, I realised we needed some really clean test data files for us and those responsible for the storage systems themselves, hence the last commit. Will review the rest of it properly tomorrow.

Very good point, cheers!

@bnlawrence (Collaborator)

I figured the easiest way to handle this was in the test, rather than removing it - although, as you say, the scope of the test doesn't cover it. But there is a bigger problem to be handled in Active itself, which needs to cope with this without errors. I've dealt with it in ee26bd.

> @bnlawrence those are warnings that don't trigger a test failure unless I set up pytest to fail on warnings; I don't think there is an actual issue coming from them, to be honest - I think it's just that the data is fully masked in that particular slice. Note that the warnings come from the pure-NumPy block
>
>     ncfile = str(test_data_path / "emac.nc")
>     active = Active(ncfile, "aps_ave")
>     active._version = 0
>     d = active[4:5, 1:2]
>     mean_result = np.mean(d)
>
> which we can simply remove, since the scope of the test is to show the failure on a CLASSIC or netCDF3 file - what do you reckon?

@bnlawrence (Collaborator) left a review comment:

I'm happy that this can be merged now; the outstanding bits are in other tickets.

On this part of the new test code:

    # as it happens it is possible for a slice to be
    # all missing, so for the purpose of this test we
    # ignore it, but the general case should not.
    pass
@valeriupredoi (Collaborator, Author) replied:

Ouch! Haha, this is a proper hack, but heyho - this bit of the test is not important at all. I'll have a proper fix in the future; cheers for it for now @bnlawrence.
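
For reference, a minimal sketch of the kind of guard being discussed - this is an assumption about the shape of the change (a count check before taking the mean), not the actual code from commit ee26bd:

    import numpy as np

    # setup as in the snippet quoted above; Active and test_data_path
    # come from the test module
    ncfile = str(test_data_path / "emac.nc")
    active = Active(ncfile, "aps_ave")
    active._version = 0

    d = active[4:5, 1:2]
    if np.ma.count(d) == 0:
        # as it happens it is possible for a slice to be
        # all missing, so for the purpose of this test we
        # ignore it, but the general case should not.
        pass
    else:
        # at least one valid value, so the mean is well defined
        # and the empty-slice warnings never fire
        mean_result = np.mean(d)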

valeriupredoi merged commit 6d90d9a into main on Oct 24, 2022
valeriupredoi deleted the add_realworld_test branch on Oct 24, 2022 at 10:33
Labels: testing