Dict dataset_version params #652
Conversation
I like this implementation much better. The dataset and the version are clearly associated now. 💯
Codecov Report

Attention: Patch coverage is

@@             Coverage Diff             @@
##           develop     #652      +/-   ##
===========================================
- Coverage    77.70%   77.57%    -0.13%
===========================================
  Files          142      142
  Lines         6516     6528       +12
===========================================
+ Hits          5063     5064        +1
- Misses        1453     1464       +11
* Use OTF backend for drivers instead of precanned response
* Use mocks for tree cover loss by drivers tests
* Use pydantic for tcl by driver compute task
* Use more consistent naming
* Deal with error from OTF in drivers background task
* Add logging to background task
* Check geostore is valid on POST
* Fix error writing
* Fix test
* Add version overrides option for OTF query
* Add tests for dataset overrides
* Remove deprecation flag
* Upgrade cryptography for security; improve README; bump uv
* Add decorator to make dict hashable in _get_data_environment
* Commit decorator module
* WIP: API key required for datamart endpoints
* WIP: add analysis results table
* Fix broken cache_clear() due to new decorator on _get_data_environment
* Truly use pydantic models everywhere
* Datamart ORM CRUD operations
* snake_case
* Change details field to message for errors to conform to JSEND
* Store error message as error in db
* Add db index on data mart analysis id
* Return resource status
* Pass geostore_id from the request directly
* Rename error to message so it can be used for pending messages as well
* Require GFW Data API key for country query, but send our key to RW
* json.loads doesn't like type Secret
* Just always send Data API's API key when proxying to RW
* Various fixes for drivers endpoints:
  - Snake case for everything, including the API path, to be consistent.
  - Missed that the call in the compute task to _query_dataset_json always used the default version of umd_tree_cover_loss as the default asset, but it should use the version override if one exists. While tracking this down, ended up adding other unit tests for the queries module to make sure everything is covered.
* Require API key on all new geostore endpoints
* Replace insanely incorrect test (what was I thinking?)
* De-monkeypatch/improve another test
* Remove unnecessary x-api-key param
* De-monkeypatch another test
* De-monkeypatch and rename another test
* Check status code too
* Remove the remaining uses of Monkeypatch
* Update unit tests
* snake_case endpoint
* Add rw_api_key_arn for staging and prod
* Make API keys optional for RW geostore endpoints (revert later)
* Fix providing an API key when adding a RW geostore
* Remove default for RW_API_KEY (will always be set)
* Create new GW API
* Prevent data mart proxy collision with the catch-all root one
* Minor fixes/clean-up
* Point to new API GW resource in dev
* Set RW API key to None for local envs; only send RW API key when it's not None
* Make the export_ndjson job run on the on-demand queue because it's non-resumable
* Fix pinning UV version, URL
* D'oh. Fix UV version
* Dict dataset_version params (#652): For the first iteration of pinning datasets and versions, I couldn't figure out how to send a dictionary as a FastAPI parameter, so I made two list parameters, `dataset` and `version`. The code just zipped these together in the order provided, which could lead to all kinds of mistakes. I figured out how we can send a dictionary parameter like `dataset_version[dataset]=version` by using the `Request` object in FastAPI. This is definitely more ideal and what we originally agreed on.
* Give ECS permission to read RW key secret
* Fix key of RW API key secret
* GTC-3172: support GADM areas in drivers endpoints (#656)
* feat(DataMart): add `AdminAreaOfInterest` to TCL by driver
* 📝 docs(DataMart): update docs to enable Swagger usage
* 🎨 refactor(DataMart): create `AreaOfInterest` abstraction
* feat(Geostore): add ability to retrieve geostore ID only
* feat(DataMart): `AdminAreaOfInterest` only retrieves geostore ID
* feat(AreaOfInterest): add discriminator param for better docs
* Flatten response from datamart (#660)
* Return 409 for POST on existing resource, don't convert UUID to UUID (#662)

Co-authored-by: Daniel Mannarino <daniel.mannarino@gmail.com>
Co-authored-by: Solomon Negusse <solomon.negusse@wri.org>
Co-authored-by: Gary Tempus <gtempus@users.noreply.github.com>
For the first iteration of pinning datasets and versions, I couldn't figure out how to send a dictionary as a FastAPI parameter, so I made two list parameters, `dataset` and `version`. The code just zipped these together in the order provided, which could lead to all kinds of mistakes. I figured out how we can send a dictionary parameter like `dataset_version[dataset]=version` by using the `Request` object in FastAPI. This is definitely more ideal and what we originally agreed on.
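The core of the technique described above is collecting query-string keys shaped like `dataset_version[<dataset>]=<version>` into a dict. A minimal sketch of that parsing, using only the standard library so it runs standalone (inside FastAPI you would iterate `request.query_params.multi_items()` instead of a raw string; the function name and dataset values here are illustrative, not the PR's actual code):

```python
import re
from urllib.parse import parse_qsl

# Matches keys like dataset_version[umd_tree_cover_loss] and captures the
# dataset name between the brackets.
_PARAM_RE = re.compile(r"^dataset_version\[(?P<dataset>[^\]]+)\]$")


def parse_dataset_versions(query_string: str) -> dict:
    """Collect dataset -> version overrides from a query string."""
    overrides = {}
    for key, value in parse_qsl(query_string):
        match = _PARAM_RE.match(key)
        if match:
            overrides[match.group("dataset")] = value
    return overrides


print(parse_dataset_versions(
    "dataset_version[umd_tree_cover_loss]=v1.8&geostore_id=abc"
))
# {'umd_tree_cover_loss': 'v1.8'}
```

Because the dataset name is the dictionary key, each version is unambiguously tied to its dataset, which is what removes the ordering mistakes the zipped `dataset`/`version` lists allowed. `parse_qsl` also percent-decodes keys, so clients that encode the brackets as `%5B`/`%5D` still work.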