nibabel can't read my data and heudiconv doesn't explain why #455
Comments
I am away from my laptop ATM, but if you try with --minmeta (check --help for the correct spelling), would it work?
I tried again with the --minmeta flag. Interestingly, my DICOM directories contain files prefixed with either RAW, PSg, or MRe. I instead get an error about conflicting study identifiers, though this is an error I've actually seen before and believe I know how to solve. When I convert just the RAW-prefixed files, it actually works, but then the generated …
@Terf nibabel's MultiframeWrapper implementation currently doesn't support multiple DICOM stacks within the same multiframe image. If you haven't already, I would try converting the DICOMs with …
Is there any way I can convert my data so nibabel will read it and I can use heudiconv? Or is there a way to get heudiconv to skip over files it isn't able to read and convert the rest? When I use …, I'm confused whether this means some of the DICOMs were not actually converted. nitrc.org lists that exact transfer syntax and links to a tool (dcmconv) that can apparently convert DICOM file encoding. If I were to get rid of the …
Unfortunately separate. BTW, what scanner/sequences are those?
I will check in on Monday to see what more about my data I'm allowed to share, but in short these particular DICOMs are from a Philips scanner. However, my dataset as a whole contains scans from multiple sites that each use a different scanner -- a focus of my lab's research is actually to quantify differences between sites/scanners. Is it possible to combine all my data into a single BIDS dataset, or would I be better off creating a distinct BIDS dataset for every site?
As I remember it - dcmstack can read multiframe DICOMs at least - will it read your DICOMs?
@matthew-brett dcmstack does not seem to be able to read the entire directory of DICOMs.
@yarikoptic @matthew-brett @mgxd It seems Philips scanners have a variety of output options, but one of them is a sort of "enhanced" DICOM, which causes this issue. My enhanced DICOMs were additionally compressed (hence the unsupported transfer syntax), so I first used dcmconv to decompress, then used the emf2sf tool from dcm4che to convert to standard DICOMs:

    apt-get update && apt-get install dcmtk
    # only the MRe-prefixed dicoms are necessary; the PSg and RAW-prefixed files
    # are Philips-specific files that are kept to recreate the scanning environment
    dcmconv /base/Data/01-001/Hopkins/dicom/std_redacted/MRe.redacted.dcm ./out.dcm
    docker run -ti -v $PWD:/base --entrypoint="" dcm4che/dcm4che-tools \
        emf2sf --out-file test.dcm /base/out.dcm

I can then run heudiconv on these converted DICOMs. So this issue is data-specific and somewhat separate from heudiconv's functionality; however, it would be nice if the nibabel multi-frame exception could be caught and a more explanatory exception raised, describing how to convert enhanced DICOMs to standard ones. Also, why does heudiconv not report the same unsupported transfer syntax warning that dcm2niix does? At the very least, I would like to submit a PR that makes the unsupported transfer syntax more explicit and gives a more verbose explanation of the multi-frame/enhanced DICOM issue. Depending on how big a priority container size is for you, I think it would be very useful to include at least dcmconv, and perhaps the larger dcm4che library, in the heudiconv container so the solution can be fully automated. If you're on board with that idea, I'd love to submit a PR. My PI says I can't share these particular data, but if it would help test an automated solution, I have other scans of an inanimate object (a water bottle, actually) that I can publicly share; they came from the same scanner, so they will have the same issues.
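The error-handling idea proposed in this comment could look something like the following sketch. It is stdlib-only and the names (`StackIDError`, `wrap_read`, `ADVICE`) are hypothetical, not actual heudiconv or nibabel API; heudiconv's real call sites may differ.

```python
# Hedged sketch of the proposed PR: intercept the "more than one StackID"
# failure and re-raise with advice pointing at the dcmconv + emf2sf
# workaround described above. All names here are hypothetical.
class StackIDError(RuntimeError):
    """Raised when a multiframe DICOM contains more than one StackID."""

ADVICE = (
    "This looks like a Philips enhanced multiframe DICOM. Try decompressing "
    "it with dcmconv (dcmtk) and splitting it into standard single-frame "
    "DICOMs with dcm4che's emf2sf before re-running heudiconv."
)

def wrap_read(read_fn, path):
    """Call read_fn(path), translating the multi-stack error into advice."""
    try:
        return read_fn(path)
    except Exception as exc:
        # Matching on the message keeps this sketch independent of nibabel's
        # exception class; a real PR would catch WrapperError directly.
        if "more than one StackID" in str(exc):
            raise StackIDError(f"{exc} {ADVICE}") from exc
        raise
```

In heudiconv itself this would wrap whatever call currently lets nibabel's `WrapperError` propagate, so users see the conversion hint instead of a bare stack trace.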
I love bottles!
Summary
When I run heudiconv on my data, I get the error:

    nibabel.nicom.dicomwrappers.WrapperError: File contains more than one StackID. Cannot handle multi-stack files

Would anyone know what a StackID even is? I can't find much about it either in the heudiconv docs or elsewhere online, but the stack trace indicates the problem occurs on line 105 of the heudiconv source, where you can see a try-catch block, but only KeyErrors are caught. So I tried catching all errors by using sed to remove KeyError, which does bypass the error in that particular spot but moves the problem downstream, so it simply fails a bit later in the run. I've never seen this issue with other datasets I've converted, so I assume the issue is actually with the data, not the code, but I have no idea what it could be. Does anyone else? Ultimately I hope to submit a PR that catches the nibabel exception and raises our own exception with more of an explanation.