TypingError Decoupler Import #34
Hi @jvogenstahl, unfortunately this looks like a dependency issue with Google Colab. Could you try to redo the environment without upgrading numba?
Hi Pau, thanks for your help! Best, Jo
Hi @jvogenstahl,
The second one is about the numba compilation error. I also saw that you are using Python 3.7; maybe update it to 3.10, just in case. @pablormier, did you encounter any of these issues?
I think the numba compilation error is independent and will stay around regardless. I would also highly recommend trying a more up-to-date Python (3.10 or 3.11) and making sure all packages are up to date.
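As a quick sanity check in that direction, a script can gate itself on the interpreter version before importing anything heavy. This is a small illustrative helper, not part of decoupler:

```python
import sys

def meets_minimum(version_info, minimum=(3, 10)):
    """Return True if `version_info` is at least `minimum` (major, minor)."""
    return tuple(version_info[:2]) >= minimum

if not meets_minimum(sys.version_info):
    print("Python %d.%d is older than the recommended 3.10+" % sys.version_info[:2])
```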
Is this problem resolved? As @deeenes said, pypath-omnipath is not required. It is also recommended to install the dependencies using the correct version of each tool, as in the original notebook. The issue seems to be caused by incompatible versions between packages.
Nudge to the devs: this issue is still happening, at least for me.
---------------------------------------------------------------------------
TypingError Traceback (most recent call last)
Cell In[18], line 1
----> 1 import decoupler
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/decoupler/__init__.py:17
15 from .method_udt import run_udt # noqa: F401
16 from .method_ora import run_ora, test1r, get_ora_df # noqa: F401
---> 17 from .method_gsva import run_gsva # noqa: F401
18 from .method_gsea import run_gsea, get_gsea_df # noqa: F401
19 from .method_viper import run_viper # noqa: F401
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/decoupler/method_gsva.py:83
79 mat = mat_ecdf(mat)
80 return mat
---> 83 @nb.njit(nb.types.Tuple((nb.f4[:, :], nb.i8[:, :]))(nb.f4[:, :]), parallel=True, cache=True)
84 def nb_get_D_I(mat):
85 n = mat.shape[1]
86 rev_idx = np.abs(np.arange(start=n, stop=0, step=-1, dtype=nb.f4) - n / 2)
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/decorators.py:232, in _jit.<locals>.wrapper(func)
230 with typeinfer.register_dispatcher(disp):
231 for sig in sigs:
--> 232 disp.compile(sig)
233 disp.disable_compile()
234 return disp
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/dispatcher.py:905, in Dispatcher.compile(self, sig)
903 with ev.trigger_event("numba:compile", data=ev_details):
904 try:
--> 905 cres = self._compiler.compile(args, return_type)
906 except errors.ForceLiteralArg as e:
907 def folded(args, kws):
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/dispatcher.py:84, in _FunctionCompiler.compile(self, args, return_type)
82 return retval
83 else:
---> 84 raise retval
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/dispatcher.py:94, in _FunctionCompiler._compile_cached(self, args, return_type)
91 pass
93 try:
---> 94 retval = self._compile_core(args, return_type)
95 except errors.TypingError as e:
96 self._failed_cache[key] = e
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/dispatcher.py:107, in _FunctionCompiler._compile_core(self, args, return_type)
104 flags = self._customize_flags(flags)
106 impl = self._get_implementation(args, {})
--> 107 cres = compiler.compile_extra(self.targetdescr.typing_context,
108 self.targetdescr.target_context,
109 impl,
110 args=args, return_type=return_type,
111 flags=flags, locals=self.locals,
112 pipeline_class=self.pipeline_class)
113 # Check typing error if object mode is used
114 if cres.typing_error is not None and not flags.enable_pyobject:
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler.py:744, in compile_extra(typingctx, targetctx, func, args, return_type, flags, locals, library, pipeline_class)
720 """Compiler entry point
721
722 Parameter
(...)
740 compiler pipeline
741 """
742 pipeline = pipeline_class(typingctx, targetctx, library,
743 args, return_type, flags, locals)
--> 744 return pipeline.compile_extra(func)
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler.py:438, in CompilerBase.compile_extra(self, func)
436 self.state.lifted = ()
437 self.state.lifted_from = None
--> 438 return self._compile_bytecode()
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler.py:506, in CompilerBase._compile_bytecode(self)
502 """
503 Populate and run pipeline for bytecode input
504 """
505 assert self.state.func_ir is None
--> 506 return self._compile_core()
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler.py:485, in CompilerBase._compile_core(self)
483 self.state.status.fail_reason = e
484 if is_final_pipeline:
--> 485 raise e
486 else:
487 raise CompilerError("All available pipelines exhausted")
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler.py:472, in CompilerBase._compile_core(self)
470 res = None
471 try:
--> 472 pm.run(self.state)
473 if self.state.cr is not None:
474 break
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler_machinery.py:368, in PassManager.run(self, state)
365 msg = "Failed in %s mode pipeline (step: %s)" % \
366 (self.pipeline_name, pass_desc)
367 patched_exception = self._patch_error(msg, e)
--> 368 raise patched_exception
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler_machinery.py:356, in PassManager.run(self, state)
354 pass_inst = _pass_registry.get(pss).pass_inst
355 if isinstance(pass_inst, CompilerPass):
--> 356 self._runPass(idx, pass_inst, state)
357 else:
358 raise BaseException("Legacy pass in use")
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler_lock.py:35, in _CompilerLock.__call__.<locals>._acquire_compile_lock(*args, **kwargs)
32 @functools.wraps(func)
33 def _acquire_compile_lock(*args, **kwargs):
34 with self:
---> 35 return func(*args, **kwargs)
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler_machinery.py:311, in PassManager._runPass(self, index, pss, internal_state)
309 mutated |= check(pss.run_initialization, internal_state)
310 with SimpleTimer() as pass_time:
--> 311 mutated |= check(pss.run_pass, internal_state)
312 with SimpleTimer() as finalize_time:
313 mutated |= check(pss.run_finalizer, internal_state)
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/compiler_machinery.py:273, in PassManager._runPass.<locals>.check(func, compiler_state)
272 def check(func, compiler_state):
--> 273 mangled = func(compiler_state)
274 if mangled not in (True, False):
275 msg = ("CompilerPass implementations should return True/False. "
276 "CompilerPass with name '%s' did not.")
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/typed_passes.py:112, in BaseTypeInference.run_pass(self, state)
106 """
107 Type inference and legalization
108 """
109 with fallback_context(state, 'Function "%s" failed type inference'
110 % (state.func_id.func_name,)):
111 # Type inference
--> 112 typemap, return_type, calltypes, errs = type_inference_stage(
113 state.typingctx,
114 state.targetctx,
115 state.func_ir,
116 state.args,
117 state.return_type,
118 state.locals,
119 raise_errors=self._raise_errors)
120 state.typemap = typemap
121 # save errors in case of partial typing
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/typed_passes.py:93, in type_inference_stage(typingctx, targetctx, interp, args, return_type, locals, raise_errors)
91 infer.build_constraint()
92 # return errors in case of partial typing
---> 93 errs = infer.propagate(raise_errors=raise_errors)
94 typemap, restype, calltypes = infer.unify(raise_errors=raise_errors)
96 return _TypingResults(typemap, restype, calltypes, errs)
File ~/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/typeinfer.py:1091, in TypeInferer.propagate(self, raise_errors)
1088 force_lit_args = [e for e in errors
1089 if isinstance(e, ForceLiteralArg)]
1090 if not force_lit_args:
-> 1091 raise errors[0]
1092 else:
1093 raise reduce(operator.or_, force_lit_args)
TypingError: Failed in nopython mode pipeline (step: nopython frontend)
No implementation of function Function(<built-in function arange>) found for signature:
>>> arange(start=int64, stop=Literal[int](0), step=Literal[int](-1), dtype=class(float32))
There are 2 candidate implementations:
- Of which 2 did not match due to:
Overload in function 'np_arange': File: numba/np/arrayobj.py: Line 4760.
With argument(s): '(start=int64, stop=int64, step=int64, dtype=class(float32))':
Rejected as the implementation raised a specific error:
TypingError: got some positional-only arguments passed as keyword arguments: 'start'
raised from /Users/jnimoca/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/numba/core/typing/templates.py:783
During: resolving callee type: Function(<built-in function arange>)
During: typing of call at /Users/jnimoca/Jose_BI/4_Projects/P26_BCCN/P26E18_ProteomicAnalysis/.pixi/envs/default/lib/python3.12/site-packages/decoupler/method_gsva.py (86)
File "../.pixi/envs/default/lib/python3.12/site-packages/decoupler/method_gsva.py", line 86:
def nb_get_D_I(mat):
<source elided>
n = mat.shape[1]
rev_idx = np.abs(np.arange(start=n, stop=0, step=-1, dtype=nb.f4) - n / 2)
    ^

Pixi Environment
Package Version Build Size Kind Source
adjusttext 1.3.0 pyhd8ed1ab_0 17.9 KiB conda adjusttext
anndata 0.11.2 pyhd8ed1ab_0 110.4 KiB conda anndata
appnope 0.1.4 pyhd8ed1ab_1 9.8 KiB conda appnope
array-api-compat 1.10.0 pyhd8ed1ab_0 37.5 KiB conda array-api-compat
asttokens 3.0.0 pyhd8ed1ab_1 27.5 KiB conda asttokens
brotli 1.1.0 hd74edd7_2 19.1 KiB conda brotli
brotli-bin 1.1.0 hd74edd7_2 16.4 KiB conda brotli-bin
bzip2 1.0.8 h99b78c6_7 120 KiB conda bzip2
c-ares 1.34.4 h5505292_0 175.3 KiB conda c-ares
ca-certificates 2024.12.14 hf0a4a13_0 153.4 KiB conda ca-certificates
cached-property 1.5.2 hd8ed1ab_1 4 KiB conda cached-property
cached_property 1.5.2 pyha770c72_1 10.8 KiB conda cached_property
colorama 0.4.6 pyhd8ed1ab_1 26.4 KiB conda colorama
comm 0.2.2 pyhd8ed1ab_1 11.8 KiB conda comm
contourpy 1.3.1 py312hb23fbb9_0 239.9 KiB conda contourpy
cycler 0.12.1 pyhd8ed1ab_1 13.1 KiB conda cycler
debugpy 1.8.11 py312hd8f9ff3_0 2.4 MiB conda debugpy
decorator 5.1.1 pyhd8ed1ab_1 13.7 KiB conda decorator
decoupler 1.6.0 301.7 KiB pypi decoupler-1.6.0-py3-none-any.whl
exceptiongroup 1.2.2 pyhd8ed1ab_1 20 KiB conda exceptiongroup
executing 2.1.0 pyhd8ed1ab_1 27.7 KiB conda executing
fonttools 4.55.3 py312h998013c_1 2.6 MiB conda fonttools
freetype 2.12.1 hadb7bae_2 582.5 KiB conda freetype
h5py 3.12.1 nompi_py312h34530d4_103 1.1 MiB conda h5py
hdf5 1.14.4 nompi_ha698983_105 3.3 MiB conda hdf5
icu 75.1 hfee45f7_0 11.3 MiB conda icu
importlib-metadata 8.5.0 pyha770c72_1 28 KiB conda importlib-metadata
ipykernel 6.29.5 pyh57ce528_0 116.8 KiB conda ipykernel
ipython 8.31.0 pyh707e725_0 586.7 KiB conda ipython
jedi 0.19.2 pyhd8ed1ab_1 823.9 KiB conda jedi
joblib 1.4.2 pyhd8ed1ab_1 215.1 KiB conda joblib
jupyter_client 8.6.3 pyhd8ed1ab_1 103.8 KiB conda jupyter_client
jupyter_core 5.7.2 pyh31011fe_1 56.3 KiB conda jupyter_core
kiwisolver 1.4.7 py312h6142ec9_0 59.5 KiB conda kiwisolver
krb5 1.21.3 h237132a_0 1.1 MiB conda krb5
lcms2 2.16 ha0e7c42_0 207 KiB conda lcms2
legacy-api-wrap 1.4.1 pyhd8ed1ab_0 14.8 KiB conda legacy-api-wrap
lerc 4.0.0 h9a09cb3_0 210.7 KiB conda lerc
libaec 1.1.3 hebf3989_0 27.8 KiB conda libaec
libblas 3.9.0 26_osxarm64_openblas 16.3 KiB conda libblas
libbrotlicommon 1.1.0 hd74edd7_2 66.8 KiB conda libbrotlicommon
libbrotlidec 1.1.0 hd74edd7_2 27.7 KiB conda libbrotlidec
libbrotlienc 1.1.0 hd74edd7_2 273.1 KiB conda libbrotlienc
libcblas 3.9.0 26_osxarm64_openblas 16.2 KiB conda libcblas
libcurl 8.11.1 h73640d1_0 376.1 KiB conda libcurl
libcxx 19.1.6 ha82da77_1 508.8 KiB conda libcxx
libdeflate 1.23 hec38601_0 52.9 KiB conda libdeflate
libedit 3.1.20240808 pl5321hafb1f1b_0 105.1 KiB conda libedit
libev 4.33 h93a5062_2 104.9 KiB conda libev
libexpat 2.6.4 h286801f_0 63.2 KiB conda libexpat
libffi 3.4.2 h3422bc3_5 38.1 KiB conda libffi
libgfortran 5.0.0 13_2_0_hd922786_3 107.6 KiB conda libgfortran
libgfortran5 13.2.0 hf226fd6_3 974 KiB conda libgfortran5
libhwloc 2.11.2 default_hbce5d74_1001 2.2 MiB conda libhwloc
libiconv 1.17 h0d3ecfb_2 660.6 KiB conda libiconv
libjpeg-turbo 3.0.0 hb547adb_1 534.7 KiB conda libjpeg-turbo
liblapack 3.9.0 26_osxarm64_openblas 16.2 KiB conda liblapack
libllvm14 14.0.6 hd1a9a77_4 19.6 MiB conda libllvm14
liblzma 5.6.3 h39f12f2_1 96.8 KiB conda liblzma
libnghttp2 1.64.0 h6d7220d_0 553.4 KiB conda libnghttp2
libopenblas 0.3.28 openmp_hf332438_1 4 MiB conda libopenblas
libpng 1.6.45 h3783ad8_0 257 KiB conda libpng
libsodium 1.0.20 h99b78c6_0 161.1 KiB conda libsodium
libsqlite 3.47.2 h3f77e49_0 830.6 KiB conda libsqlite
libssh2 1.11.1 h9cc3647_0 272.5 KiB conda libssh2
libtiff 4.7.0 h551f018_3 361.9 KiB conda libtiff
libwebp-base 1.5.0 h2471fea_0 283.2 KiB conda libwebp-base
libxcb 1.17.0 hdb1d25a_0 316.1 KiB conda libxcb
libxml2 2.13.5 h178c5d8_1 569.2 KiB conda libxml2
libzlib 1.3.1 h8359307_2 45.3 KiB conda libzlib
llvm-openmp 19.1.6 hdb05f8b_0 274.7 KiB conda llvm-openmp
llvmlite 0.43.0 py312ha9ca408_1 361.4 KiB conda llvmlite
loguru 0.7.2 py312h81bd7bf_2 120.5 KiB conda loguru
markdown-it-py 3.0.0 pyhd8ed1ab_1 62.9 KiB conda markdown-it-py
matplotlib 3.10.0 py312h1f38498_0 16.7 KiB conda matplotlib
matplotlib-base 3.10.0 py312hdbc7e53_0 7.6 MiB conda matplotlib-base
matplotlib-inline 0.1.7 pyhd8ed1ab_1 14.1 KiB conda matplotlib-inline
mdurl 0.1.2 pyhd8ed1ab_1 14.1 KiB conda mdurl
munkres 1.1.4 pyh9f0ad1d_0 12.2 KiB conda munkres
natsort 8.4.0 pyh29332c3_1 38.1 KiB conda natsort
ncurses 6.5 h7bae524_1 783.5 KiB conda ncurses
nest-asyncio 1.6.0 pyhd8ed1ab_1 11.3 KiB conda nest-asyncio
networkx 3.4.2 pyh267e887_2 1.2 MiB conda networkx
numba 0.60.0 py312h41cea2d_0 5.4 MiB conda numba
numpy 2.0.2 py312h94ee1e1_1 6.1 MiB conda numpy
openjpeg 2.5.3 h8a3d83b_0 311.9 KiB conda openjpeg
openssl 3.4.0 h81ee809_1 2.8 MiB conda openssl
p26e18_proteomicanalysis 0.1.0 pypi (editable)
packaging 24.2 pyhd8ed1ab_2 58.8 KiB conda packaging
pandas 2.2.3 py312hcd31e36_1 13.8 MiB conda pandas
parso 0.8.4 pyhd8ed1ab_1 73.5 KiB conda parso
patsy 1.0.1 pyhd8ed1ab_1 182.2 KiB conda patsy
pexpect 4.9.0 pyhd8ed1ab_1 52.3 KiB conda pexpect
pickleshare 0.7.5 pyhd8ed1ab_1004 11.5 KiB conda pickleshare
pillow 11.1.0 py312h50aef2c_0 40.9 MiB conda pillow
platformdirs 4.3.6 pyhd8ed1ab_1 20 KiB conda platformdirs
plotly 5.24.1 pyhd8ed1ab_1 7.7 MiB conda plotly
prompt-toolkit 3.0.48 pyha770c72_1 263.5 KiB conda prompt-toolkit
psutil 6.1.1 py312hea69d52_0 483.8 KiB conda psutil
pthread-stubs 0.4 hd74edd7_1002 8.2 KiB conda pthread-stubs
ptyprocess 0.7.0 pyhd8ed1ab_1 19 KiB conda ptyprocess
pure_eval 0.2.3 pyhd8ed1ab_1 16.3 KiB conda pure_eval
pygments 2.19.1 pyhd8ed1ab_0 867.8 KiB conda pygments
pynndescent 0.5.13 pyhd8ed1ab_1 48.5 KiB conda pynndescent
pyparsing 3.2.1 pyhd8ed1ab_0 90.9 KiB conda pyparsing
python 3.12.8 hc22306f_1_cpython 12.4 MiB conda python
python-dateutil 2.9.0.post0 pyhff2d567_1 217.3 KiB conda python-dateutil
python-tzdata 2024.2 pyhd8ed1ab_1 138.9 KiB conda python-tzdata
python_abi 3.12 5_cp312 6.1 KiB conda python_abi
pytz 2024.1 pyhd8ed1ab_0 184.1 KiB conda pytz
pyzmq 26.2.0 py312hf8a1cbd_3 353.2 KiB conda pyzmq
qhull 2020.2 h420ef59_5 504.3 KiB conda qhull
readline 8.2 h92ec313_1 244.5 KiB conda readline
rich 13.9.4 pyhd8ed1ab_1 181.3 KiB conda rich
ruff 0.9.0 py312h5d18b81_0 6.7 MiB conda ruff
scanpy 1.10.4 pyhd8ed1ab_0 1.9 MiB conda scanpy
scikit-learn 1.6.1 py312h39203ce_0 9.3 MiB conda scikit-learn
scipy 1.15.0 py312hb7ffdcd_1 15.2 MiB conda scipy
seaborn 0.13.2 hd8ed1ab_3 6.7 KiB conda seaborn
seaborn-base 0.13.2 pyhd8ed1ab_3 222.5 KiB conda seaborn-base
session-info 1.0.0 pyhd8ed1ab_0 11.9 KiB conda session-info
setuptools 75.8.0 pyhff2d567_0 757.4 KiB conda setuptools
six 1.17.0 pyhd8ed1ab_0 16 KiB conda six
stack_data 0.6.3 pyhd8ed1ab_1 26.4 KiB conda stack_data
statsmodels 0.14.4 py312h755e627_0 11.3 MiB conda statsmodels
stdlib-list 0.11.0 pyhd8ed1ab_1 25.7 KiB conda stdlib-list
tabulate 0.9.0 pyhd8ed1ab_2 36.7 KiB conda tabulate
tbb 2022.0.0 h0cbf7ec_0 115.1 KiB conda tbb
tenacity 9.0.0 pyhd8ed1ab_1 24.2 KiB conda tenacity
threadpoolctl 3.5.0 pyhc1e730c_0 23 KiB conda threadpoolctl
tk 8.6.13 h5083fa2_1 3 MiB conda tk
tornado 6.4.2 py312hea69d52_0 822.8 KiB conda tornado
tqdm 4.67.1 pyhd8ed1ab_1 87.4 KiB conda tqdm
traitlets 5.14.3 pyhd8ed1ab_1 107.5 KiB conda traitlets
typing_extensions 4.12.2 pyha770c72_1 38.7 KiB conda typing_extensions
tzdata 2024b hc8b5060_0 119.5 KiB conda tzdata
umap-learn 0.5.7 py312h81bd7bf_0 184 KiB conda umap-learn
unicodedata2 15.1.0 py312h0bf5046_1 363.8 KiB conda unicodedata2
wcwidth 0.2.13 pyhd8ed1ab_1 31.8 KiB conda wcwidth
xorg-libxau 1.0.12 h5505292_0 13.3 KiB conda xorg-libxau
xorg-libxdmcp 1.1.5 hd74edd7_0 18.1 KiB conda xorg-libxdmcp
zeromq 4.3.5 hc1bb282_7 275 KiB conda zeromq
zipp 3.21.0 pyhd8ed1ab_1 21.3 KiB conda zipp
zstd 1.5.6 hb46c0d2_0 395.6 KiB conda zstd
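For what it's worth, the failing call can be reproduced outside numba: the error says this numba build treats `np.arange`'s arguments as positional-only, so decoupler's `np.arange(start=n, stop=0, step=-1, dtype=nb.f4)` fails type inference. A minimal sketch of the equivalent positional form, in plain NumPy (the helper name is illustrative, not a patch to decoupler):

```python
import numpy as np

def rev_idx_positional(n):
    # Equivalent of decoupler's failing line, but with positional arguments,
    # which is the form numba's np.arange overload accepts:
    #   np.arange(start=n, stop=0, step=-1, dtype=nb.f4)  -> TypingError
    #   np.arange(n, 0, -1).astype(np.float32)            -> OK
    # .astype is used because whether dtype can be passed to arange inside
    # @njit also varies across numba versions.
    return np.abs(np.arange(n, 0, -1).astype(np.float32) - n / 2)

print(rev_idx_positional(4))  # [2. 1. 0. 1.]
```

This only demonstrates the keyword-vs-positional difference; the practical fix for users is still aligning the numba/decoupler versions as suggested above.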
Dear Decoupler team,
I followed the PerMedCoE hands-on course on Tuesday and I'm trying to apply the analysis to my data (spatial transcriptomics), following the Google Colab file.
I managed to create the 'signalling' environment and I believe all packages installed correctly in Jupyter. However, I get a TypingError when importing decoupler in Python and can't figure out what's wrong (see below).
Any assistance would be very welcome.
Have a great day,
Johanna
My code:
Error message: