
Update databricks-sdk requirement from <0.42,>=0.29 to >=0.29,<0.48 #1497


Closed

Conversation


@dependabot dependabot bot commented on behalf of github Mar 21, 2025

Updates the requirements on databricks-sdk to permit the latest version.
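
For reference, a minimal sketch (not part of this PR; the candidate version strings are illustrative) of what the widened constraint permits, using the packaging library that pip relies on for version specifiers:

    # Compare the old and new databricks-sdk version ranges.
    # Requires the `packaging` package; the candidate versions below are examples.
    from packaging.specifiers import SpecifierSet
    from packaging.version import Version

    old_spec = SpecifierSet(">=0.29,<0.42")
    new_spec = SpecifierSet(">=0.29,<0.48")

    for candidate in ["0.29.0", "0.41.0", "0.42.0", "0.47.0"]:
        v = Version(candidate)
        print(f"{candidate}: old={v in old_spec} new={v in new_spec}")

    # 0.42.0 and 0.47.0 are rejected by the old range but accepted by the new one.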

Release notes

Sourced from databricks-sdk's releases.

v0.47.0

Release v0.47.0

Bug Fixes

  • Ensure that refresh tokens are returned when using the external-browser credentials strategy.
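
For context on the fix above, a minimal sketch (an assumed usage pattern, not code from this release; the host is a placeholder) of authenticating with the external-browser strategy in the Python SDK:

    # Use the external-browser OAuth flow, whose refresh-token handling the
    # v0.47.0 fix addresses. The host below is a placeholder.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient(
        host="https://example-workspace.cloud.databricks.com",  # placeholder
        auth_type="external-browser",  # completes login in a browser window
    )

    # Any API call triggers authentication; current_user.me() is a simple smoke test.
    print(w.current_user.me().user_name)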

API Changes

  • Added abfss, dbfs, error_message, execution_duration_seconds, file, gcs, s3, status, volumes and workspace fields for databricks.sdk.service.compute.InitScriptInfoAndExecutionDetails.
  • [Breaking] Added forecast_granularity field for databricks.sdk.service.ml.CreateForecastingExperimentRequest.
  • Added jwks_uri field for databricks.sdk.service.oauth2.OidcFederationPolicy.
  • Added fallback_config field for databricks.sdk.service.serving.AiGatewayConfig.
  • Added custom_provider_config field for databricks.sdk.service.serving.ExternalModel.
  • Added fallback_config field for databricks.sdk.service.serving.PutAiGatewayRequest.
  • Added fallback_config field for databricks.sdk.service.serving.PutAiGatewayResponse.
  • Added aliases, comment, data_type, dependency_list, full_data_type, id, input_params, name, properties, routine_definition, schema, securable_kind, share, share_id, storage_location and tags fields for databricks.sdk.service.sharing.DeltaSharingFunction.
  • Added access_token_failure, allocation_timeout, allocation_timeout_node_daemon_not_ready, allocation_timeout_no_healthy_clusters, allocation_timeout_no_matched_clusters, allocation_timeout_no_ready_clusters, allocation_timeout_no_unallocated_clusters, allocation_timeout_no_warmed_up_clusters, aws_inaccessible_kms_key_failure, aws_instance_profile_update_failure, aws_invalid_key_pair, aws_invalid_kms_key_state, aws_resource_quota_exceeded, azure_packed_deployment_partial_failure, bootstrap_timeout_due_to_misconfig, budget_policy_limit_enforcement_activated, budget_policy_resolution_failure, cloud_account_setup_failure, cloud_operation_cancelled, cloud_provider_instance_not_launched, cloud_provider_launch_failure_due_to_misconfig, cloud_provider_resource_stockout_due_to_misconfig, cluster_operation_throttled, cluster_operation_timeout, control_plane_request_failure_due_to_misconfig, data_access_config_changed, disaster_recovery_replication, driver_eviction, driver_launch_timeout, driver_node_unreachable, driver_out_of_disk, driver_out_of_memory, driver_pod_creation_failure, driver_unexpected_failure, dynamic_spark_conf_size_exceeded, eos_spark_image, executor_pod_unscheduled, gcp_api_rate_quota_exceeded, gcp_forbidden, gcp_iam_timeout, gcp_inaccessible_kms_key_failure, gcp_insufficient_capacity, gcp_ip_space_exhausted, gcp_kms_key_permission_denied, gcp_not_found, gcp_resource_quota_exceeded, gcp_service_account_access_denied, gcp_service_account_not_found, gcp_subnet_not_ready, gcp_trusted_image_projects_violated, gke_based_cluster_termination, init_container_not_finished, instance_pool_max_capacity_reached, instance_pool_not_found, instance_unreachable_due_to_misconfig, internal_capacity_failure, invalid_aws_parameter, invalid_instance_placement_protocol, invalid_worker_image_failure, in_penalty_box, lazy_allocation_timeout, maintenance_mode, netvisor_setup_timeout, no_matched_k8s, no_matched_k8s_testing_tag, pod_assignment_failure, pod_scheduling_failure, resource_usage_blocked, secret_creation_failure, serverless_long_running_terminated, spark_image_download_throttled, spark_image_not_found, ssh_bootstrap_failure, storage_download_failure_due_to_misconfig, storage_download_failure_slow, storage_download_failure_throttled, unexpected_pod_recreation, user_initiated_vm_termination and workspace_update enum values for databricks.sdk.service.compute.TerminationReasonCode.
  • Added generated_sql_query_too_long_exception and missing_sql_query_exception enum values for databricks.sdk.service.dashboards.MessageErrorType.
  • Added balanced enum value for databricks.sdk.service.jobs.PerformanceTarget.
  • Added listing_resource enum value for databricks.sdk.service.marketplace.FileParentType.
  • Added app enum value for databricks.sdk.service.marketplace.MarketplaceFileType.
  • Added custom enum value for databricks.sdk.service.serving.ExternalModelProvider.
  • [Breaking] Changed create_experiment() method for w.forecasting workspace-level service with new required argument order.
  • Changed instance_type_id field for databricks.sdk.service.compute.NodeInstanceType to be required.
  • Changed category field for databricks.sdk.service.compute.NodeType to be required.
  • [Breaking] Changed functions field for databricks.sdk.service.sharing.ListProviderShareAssetsResponse to type databricks.sdk.service.sharing.DeltaSharingFunctionList dataclass.
  • [Breaking] Changed waiter for ClustersAPI.create method.
  • [Breaking] Changed waiter for ClustersAPI.delete method.
  • [Breaking] Changed waiter for ClustersAPI.edit method.
  • [Breaking] Changed waiter for ClustersAPI.get method.
  • [Breaking] Changed waiter for ClustersAPI.resize method.
  • [Breaking] Changed waiter for ClustersAPI.restart method.
  • [Breaking] Changed waiter for ClustersAPI.start method.
  • [Breaking] Changed waiter for ClustersAPI.update method.
  • [Breaking] Removed execution_details and script fields for databricks.sdk.service.compute.InitScriptInfoAndExecutionDetails.
  • [Breaking] Removed supports_elastic_disk field for databricks.sdk.service.compute.NodeType.
  • [Breaking] Removed data_granularity_quantity and data_granularity_unit fields for databricks.sdk.service.ml.CreateForecastingExperimentRequest.
  • [Breaking] Removed aliases, comment, data_type, dependency_list, full_data_type, id, input_params, name, properties, routine_definition, schema, securable_kind, share, share_id, storage_location and tags fields for databricks.sdk.service.sharing.Function.
Changelog

Sourced from databricks-sdk's changelog.

Release v0.47.0

(The v0.47.0 changelog entries are identical to the release notes quoted above.)

Release v0.46.0

New Features and Improvements

  • [Experimental] Add support for async token refresh (#916). This can be enabled by setting the following environment variable:
    export DATABRICKS_ENABLE_EXPERIMENTAL_ASYNC_TOKEN_REFRESH=1.
    
    This feature and its setting are experimental and may be removed in future releases.
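
As an illustration, a minimal sketch (an assumption based on the entry above, not code from this PR) of opting into that experimental flag from Python before the client is constructed:

    # Enable the experimental async token refresh by setting the flag before
    # the SDK reads its configuration, as the changelog entry above suggests.
    import os

    os.environ["DATABRICKS_ENABLE_EXPERIMENTAL_ASYNC_TOKEN_REFRESH"] = "1"

    from databricks.sdk import WorkspaceClient  # imported after setting the flag

    w = WorkspaceClient()  # credentials resolved from the usual environment/config sources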

API Changes

... (truncated)

Commits
  • f2472bf [Release] Release v0.47.0
  • 732ee54 Update Python SDK to latest API spec (#932)
  • da8eb26 Ensure that refresh tokens are returned when using the external-browser cre...
  • cc7c236 Update codegen to match new style (#927)
  • 5607b09 [Internal] Update ipython requirement from <9,>=8 to >=8,<10 (#908)
  • ea3d2b5 Update version constant to 0.46.0 (#926)
  • 699949b [Release] Release v0.46.0
  • f2977aa [Feature] Add support for async token refresh (#916)
  • 254956e Update OpenAPI spec (#925)
  • 42df0de Remove Redundant YAPF configuration (#924)
  • Additional commits viewable in compare view

Most Recent Ignore Conditions Applied to This Pull Request

  Dependency Name   Ignore Conditions
  databricks-sdk    [>= 0.38.dev0, < 0.39]
  databricks-sdk    [>= 0.39.dev0, < 0.40]

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

@dependabot dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels Mar 21, 2025
@dependabot dependabot bot requested a review from a team as a code owner March 21, 2025 09:25
@dependabot dependabot bot force-pushed the dependabot/pip/databricks-sdk-gte-0.29-and-lt-0.48 branch from 97d816d to 6991c65 on March 27, 2025 15:44
@dependabot dependabot bot force-pushed the dependabot/pip/databricks-sdk-gte-0.29-and-lt-0.48 branch from 6991c65 to 866cb40 on March 27, 2025 15:45
Updates the requirements on [databricks-sdk](https://github.com/databricks/databricks-sdk-py) to permit the latest version.
- [Release notes](https://github.com/databricks/databricks-sdk-py/releases)
- [Changelog](https://github.com/databricks/databricks-sdk-py/blob/main/CHANGELOG.md)
- [Commits](databricks/databricks-sdk-py@v0.29.0...v0.47.0)

---
updated-dependencies:
- dependency-name: databricks-sdk
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot force-pushed the dependabot/pip/databricks-sdk-gte-0.29-and-lt-0.48 branch from 866cb40 to fcdfca0 on March 27, 2025 15:46

dependabot bot commented on behalf of github Mar 28, 2025

Superseded by #1504.

@dependabot dependabot bot closed this Mar 28, 2025
@dependabot dependabot bot deleted the dependabot/pip/databricks-sdk-gte-0.29-and-lt-0.48 branch March 28, 2025 09:22