build_all.sh fails when gdas.cd is present #1043
Comments
I've got this bug fixed, but in trying to confirm the fix I get another error during the GDASApp build:

…

The GDASApp hash in `checkout.sh` can be updated to a newer one, but this alone is not sufficient for GDASApp to build. The ioda-converters in GDASApp require …; until that is updated, the compilation will still fail.

Does GDASApp no longer build using the hash checked out as part of the develop branch of global-workflow?
No, the build fails as @WalterKolczynski-NOAA documented. This is the jedicmake problem, which we fixed in GDASApp develop, so the GDASApp hash in `checkout.sh` needs to be updated. While updating the GDASApp develop hash will get us through the configure part of the GDASApp build, the compilation will still fail: the ioda-converters extracted with the GDASApp require …
I'm going to submit the fix for the bug that is ours, since I'm confident I have fixed that. I dislike that GDAS is using non-static versions of components by checking out develop instead of specific commits. It makes it impossible to guarantee that any specific version at our level will work in the same way (or at all, as seen here). We have a policy against using dynamic version pointers in the workflow, and would hope that constituent components also follow it. It can be relaxed since GDAS is still in a preliminary stage, but as soon as people start really using it, this will be a problem.
@WalterKolczynski-NOAA agreed, NOAA-EMC/GDASApp#143 is already created. Soon we will switch to checking out specific tags/commits of JEDI repos.
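For reference, a minimal sketch of what a statically pinned checkout could look like in a script such as `checkout.sh`; the hash below is a placeholder and the submodule step is an assumption, not the actual fix:

```sh
# Sketch: pin GDASApp to a fixed commit instead of tracking develop, so the
# workflow builds reproducibly. The hash is a placeholder, not a real commit.
git clone https://github.com/NOAA-EMC/GDASApp.git gdas.cd
cd gdas.cd
git checkout 0123abcd                     # placeholder for a vetted commit/tag
git submodule update --init --recursive   # pin submodules to recorded commits
```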
When the GDAS app was added to the workflow, the corresponding build setting was not added to partial_build and the build configuration file. This means that after `build_all.sh` was updated to correct syntax issues, the build would fail because `$Build_gdas` was undefined. Note: the GDAS app still does not build currently due to unrelated problems within the gdas repo. Refs #1043
g-w issue #1067 updates the GDASApp hash. g-w successfully builds the GDASApp (gdas.cd) with the updated GDASApp hash.
Updates the GDASApp version to a newer one that builds correctly. The former version no longer builds because its submodules point to develop branches that are no longer compatible. Moves module loads out of the j-jobs and into their appropriate place in the rocoto job. A new load-module script is added to handle the different module set needed for UFSDA. Also temporarily turns off strict mode for the UFSDA jobs to avoid a PS1 unbound-variable error in conda. Fixes #1043 Fixes #1067
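As a rough illustration of the strict-mode workaround described in that PR (the script name below is hypothetical; only the PS1/conda interaction comes from the description above):

```sh
# Sketch: relax "nounset" around environment setup for the UFSDA jobs, because
# conda's activation scripts reference $PS1, which may be unset in batch jobs.
set +u                                          # temporarily allow unset variables
source "${HOMEgfs}/ush/load_ufsda_modules.sh"   # hypothetical module-load script
set -u                                          # restore strict mode afterwards
```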
Expected behavior

`./build_all.sh` on Orion from develop @ 833b8f4 should run to completion when `sorc/gdas.cd` is present.

Current behavior

`./build_all.sh` fails on Orion when it gets to gdas.cd.

Machines affected

Only tested on Orion.
To Reproduce

1. Check out `develop` at 833b8f4
2. `cd sorc`
3. `./checkout.sh -g -u` to clone both DA engines for comparative studies (`./checkout.sh -u` is sufficient to see the failure)
4. `./build_all.sh` (the full sequence is sketched below)
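Put together, the reproduction sequence looks roughly like this (the clone step and URL are assumed from context; the hash is the one cited above):

```sh
git clone https://github.com/NOAA-EMC/global-workflow.git
cd global-workflow
git checkout 833b8f4     # the develop commit cited above
cd sorc
./checkout.sh -u         # -g -u also works; -u alone is enough to see the failure
./build_all.sh           # fails when the build reaches gdas.cd
```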
Possible Implementation

As a test, I made two additions (sketched in the snippet below):

- added `Build_gdas` to the `Build_prg` list in `partial_build.sh`
- added `Building gdas (gdas) .................................. yes` to `gfs_build.cfg`

With those changes, `./build_all.sh` runs to completion.
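A sketch of the `partial_build.sh` side of that test; the variable's exact syntax and the surrounding entries are assumptions, only `Build_gdas` itself comes from the test above:

```sh
# partial_build.sh (sketch): include gdas in the set of buildable programs.
# The other entries shown here are illustrative, not the real list contents.
Build_prg="Build_libs \
           Build_ufs_model \
           Build_gsi_enkf \
           Build_gdas"
```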
Since `./checkout.sh` supports both GSI (`-g`) and UFS (`-u`) DA builds, should `partial_build.sh` and `gfs_build.cfg` include both gsi_enkf and gdas?
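If both engines were covered, `gfs_build.cfg` might carry one line per DA engine, along the lines of this sketch (the gsi_enkf line is hypothetical, patterned on the gdas line quoted above):

```
 Building gsi_enkf (gsi_enkf) .......................... yes
 Building gdas (gdas) .................................. yes
```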