Reading hdf5 Dataset with two equal-sized unlimited dimensions results in only one dimension #945
I see, and the netcdf4-python output is consistent with that. The fact that the variable has the same dimension associated with it twice can't be changed without re-creating the file.
@jswhit Thanks for looking into this. Do you happen to know how the hdf5 file should be created so that netcdf is able to detect two different dimensions?
I can tell you how to create it with netcdf (netcdf4-python), but not with hdf5 (h5py). How are you creating the file now?
@jswhit With netcdf4-python I know how, too 😀. IIUC, when reading with netcdf, the dimension mapping is done via the netcdf-c library. If the array were 360x361 I would get two dimensions (phony_dim0, phony_dim1), so there must be some logic to detect this. Maybe the current behaviour could be overridden by introducing a switch for loading (e.g. squeeze_dims=False)?
There's only one dimension in the h5 file - so the netcdf library doesn't have much choice in this case. I think the dimensions are associated with variables in hdf5 using the "dimension scales" API (http://docs.h5py.org/en/stable/high/dims.html).
@jswhit Yes, there is a good chance that the problem is at creation time. Using h5py to retrieve the dimensions, I get two different objects out. I'll need to investigate a bit more to track this down. Thanks for the pointer! I'll close the issue for now. Would you be happy if I reopen it when I have more information?
Sure - but you may need to open it under the netcdf-c project if ncdump is not showing the extra dimension.
I bet that the C code is not creating any dimensions for the variables, so the netcdf lib is having to guess (or create its own 'phony' dimensions based upon the shape of the variable).
@jswhit OK, I'll try at netcdf-c next time. Thanks for the help so far.
@jswhit, FYI, I created an issue outlining the problem at netcdf-c Unidata/netcdf-c#1484
…hunking, georeferencing), introduce two classes for holding open netcdf-filehandles (also for properly closing), only hold sweep-data in Dataset-dict, workaround Unidata/netcdf4-python#945, properly load multiple OdimH5 files into one volume (DWD one sweep one moment files), several simplifications (wradlib#367)
I'm reading an hdf5 file like this:
Result:
h5dump -H
Is there any way to get two separate dimensions? I did not find anything related via an internet search. A test file is attached.
test.zip