
Conversation


@alexott alexott commented Nov 14, 2025

Changes

Move the defaults for the `databricks_spark_version` data source from the Go SDK to the Terraform provider. This is the first part of the #5218 work; after the new Go SDK is merged, we'll need to change the Scala default to `2.1`.

Related to databricks/databricks-sdk-go#1331
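
The gist of the change: the data source's defaults (for example the Scala version and the "latest" flag) are now filled in on the Terraform provider side instead of inside the Go SDK. Below is a minimal, self-contained Go sketch of that idea; the struct, field names, and default values are assumptions for illustration and are not taken from the provider's actual code.

```go
package main

import "fmt"

// sparkVersionRequest is a hypothetical stand-in for the filter options that
// the databricks_spark_version data source accepts; the field names and
// defaults below are illustrative assumptions, not the provider's schema.
type sparkVersionRequest struct {
	Latest          bool
	LongTermSupport bool
	Scala           string
	SparkVersion    string
}

// applyDefaults fills in defaults on the provider side instead of relying on
// the Go SDK, so an SDK-side default (e.g. the Scala version) can change
// later without silently altering the provider's behaviour.
func applyDefaults(req *sparkVersionRequest) {
	if req.Scala == "" {
		req.Scala = "2.12" // assumed current default; per the PR, a `2.1` prefix may follow later
	}
	if req.SparkVersion == "" {
		req.Latest = true // assumed default: pick the latest matching version
	}
}

func main() {
	req := sparkVersionRequest{}
	applyDefaults(&req)
	fmt.Printf("%+v\n", req) // {Latest:true LongTermSupport:false Scala:2.12 SparkVersion:}
}
```

Owning the defaults in the provider presumably keeps the data source's behaviour stable for existing configurations when the SDK-side defaults change (databricks/databricks-sdk-go#1331).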

Tests

  • `make test` run locally
  • relevant change in docs/ folder
  • covered with integration tests in internal/acceptance
  • using Go SDK
  • using TF Plugin Framework
  • has entry in NEXT_CHANGELOG.md file

@alexott alexott requested review from a team as code owners November 14, 2025 17:52
@alexott alexott requested review from rauchy and removed request for a team November 14, 2025 17:52
@github-actions

If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/terraform

Inputs:

  • PR number: 5219
  • Commit SHA: 33d69c17e46f3a158720a1f0e1458371080acabd

Checks will be approved automatically on success.

@alexott alexott added this pull request to the merge queue Nov 26, 2025
Merged via the queue into main with commit b0a2a1c Nov 26, 2025
12 checks passed
@alexott alexott deleted the issue-5218-part1 branch November 26, 2025 15:31
deco-sdk-tagging bot added a commit that referenced this pull request Dec 3, 2025
## Release v1.98.0

### New Features and Improvements

* Relaxed `force_new` constraint on `catalog` attribute in `databricks_pipeline` resource to allow changing the default catalog for existing pipelines ([#5180](#5180)).
* Add `databricks_users` data source ([#4028](#4028))
* Improve `databricks_service_principals` data source ([#5164](#5164))
* Add `feature_engineering_kafka_config` resource and data source ([#5240](#5240))

### Bug Fixes

* Fix spurious plan diffs in `databricks_model_serving` and `databricks_model_serving_provisioned_throughput` resources due to tag reordering ([#5120](#5120))
* Move Spark Version selector defaults to Terraform ([#5219](#5219)).

### Documentation

* Document tag policies in `databricks_access_control_rule_set` ([#5209](#5209)).
* Document missing `aws_attributes.ebs_*` properties in `databricks_cluster` ([#5196](#5196)).
* Document support for serverless workspaces on GCP ([#5124](#5124))
* Document data object types for share resource ([#5244](#5244))

### Exporter

* Added support for `databricks_data_quality_monitor` resource ([#5193](#5193)).
* Added support for `databricks_account_federation_policy`, `databricks_custom_app_integration`, `databricks_quality_monitor_v2`, `databricks_service_principal_federation_policy` resources ([#5237](#5237)).
* Added support for `databricks_budget_policy` resource ([#5217](#5217)).
* Fix typo in the name of an environment variable ([#5158](#5158)).
* Export permission assignments on workspace level ([#5169](#5169)).
* Added support for UC Tag policies ([#5213](#5213)).
* Added support for Databricks Apps resources ([#5208](#5208)).
* Added support for Database Instance resource (aka Lakebase) ([#5212](#5212)).
* Added support for workspace and account settings v2 ([#5230](#5230)).

### Internal Changes

* Update Go SDK to v0.92.0 ([#5240](#5240))
* Bump github.com/hashicorp/terraform-plugin-framework from 1.16.1 to 1.17.0 ([#5247](#5247))