The stable doc here: https://docs.kedro.org/en/stable/integrations/pyspark_integration.html is out of date, I think. Specifically:

```python
parameters = context.config_loader.get("spark*", "spark*/**")
```

needs to be updated to the new method.
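For reference, a minimal sketch of what the updated snippet might look like, assuming `OmegaConfigLoader`'s dict-style access and a `spark` entry registered in `config_patterns` (the names follow the snippets in this issue, not a verified docs fix):

```python
# src/<package>/settings.py -- hedged sketch, not the official docs fix.
# Register a "spark" key so config_loader["spark"] knows which files to load.
CONFIG_LOADER_ARGS = {
    "config_patterns": {
        "spark": ["spark*", "spark*/**"],
    },
}

# Elsewhere (e.g. in a hook that receives the context), the old
#   parameters = context.config_loader.get("spark*", "spark*/**")
# would then become:
#   parameters = context.config_loader["spark"]
```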
I am mentioning this because I tried

```python
config_loader["spark"]
```

with:

```python
CONFIG_LOADER_ARGS = {
    "config_patterns": {
        "spark": ["spark*", "spark*/**"],
    },
}
```

but it couldn't find `conf/base/spark.yml` for some reason, so I moved it to `conf/databricks/spark.yml` and now it finds it.
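The patterns in `config_patterns` are glob expressions matched against paths relative to each environment directory (e.g. `conf/base/`). A rough, self-contained way to reason about which files a pattern list selects, using stdlib `fnmatch` as a simplified stand-in for Kedro's actual matcher (note that `fnmatch`'s `*` also crosses path separators, unlike a true recursive glob):

```python
# Rough illustration only -- NOT Kedro's real matching logic.
# Shows which paths, taken relative to an environment directory
# such as conf/base/, a pattern list like ["spark*", "spark*/**"]
# would plausibly select.
from fnmatch import fnmatch

PATTERNS = ["spark*", "spark*/**"]

def is_selected(relative_path: str) -> bool:
    """True if the path matches any of the registered patterns."""
    return any(fnmatch(relative_path, pattern) for pattern in PATTERNS)

# conf/base/spark.yml is "spark.yml" relative to conf/base/
assert is_selected("spark.yml")
# a file nested under a spark* subdirectory also matches
assert is_selected("spark_settings/extra.yml")
# a file whose name does not start with "spark" does not
assert not is_selected("parameters.yml")
```

Under this simplified model, `spark.yml` placed directly in an environment directory matches `"spark*"`, so the environment (`base` vs `databricks`) rather than the pattern itself is the more likely variable in the behaviour reported above.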
Documentation page (if applicable): https://docs.kedro.org/en/stable/integrations/pyspark_integration.html
Thank you, @alexisdrakopoulos, for reporting this issue!

I tried to reproduce it: I created `conf/base/spark.yml` and set

```python
CONFIG_LOADER_ARGS = {
    "base_env": "base",
    "default_run_env": "local",
    "config_patterns": {
        "spark": ["spark*/"],
    },
}
```

and it seems to be working well; at least it can find `conf/base/spark.yml`. So, to me, it looks like this line in the docs might no longer be relevant:

```python
parameters = context.config_loader.get("spark*", "spark*/**")
```
We will double-check and come back.
I'm bumping the priority on this, because it's been reported again in #4166