
# Tags: jdavidheiser/cli

## snapshot

build(deps): bump golang.org/x/term from 0.32.0 to 0.33.0 (databricks#3249)

Bumps [golang.org/x/term](https://github.com/golang/term) from 0.32.0 to
0.33.0.
Commits:
- [`30da5dd`](https://github.com/golang/term/commit/30da5dd58fc835bf6704fa7464ac3d23202d8685) go.mod: update golang.org/x dependencies
- See full diff in [compare view](https://github.com/golang/term/compare/v0.32.0...v0.33.0)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

## Release v0.261.0

### Notable Changes
The following CLI commands now have additional required positional arguments:
* `alerts-v2 update-alert ID UPDATE_MASK` - Update an alert (v2)
* `database update-database-instance NAME UPDATE_MASK` - Update a database instance
* `external-lineage create-external-lineage-relationship SOURCE TARGET` - Create an external lineage relationship
* `external-lineage update-external-lineage-relationship UPDATE_MASK SOURCE TARGET` - Update an external lineage relationship
* `external-metadata update-external-metadata NAME UPDATE_MASK SYSTEM_TYPE ENTITY_TYPE` - Update external metadata
* `feature-store update-online-store NAME UPDATE_MASK CAPACITY` - Update an online store
* `lakeview create-schedule DASHBOARD_ID CRON_SCHEDULE` - Create a schedule
* `lakeview create-subscription DASHBOARD_ID SCHEDULE_ID SUBSCRIBER` - Create a subscription
* `lakeview update-schedule DASHBOARD_ID SCHEDULE_ID CRON_SCHEDULE` - Update a schedule
* `network-connectivity update-private-endpoint-rule NETWORK_CONNECTIVITY_CONFIG_ID PRIVATE_ENDPOINT_RULE_ID UPDATE_MASK` - Update a private endpoint rule

### CLI
* Add required query parameters as positional arguments in CLI commands ([databricks#3289](databricks#3289))

### Bundles
* Fixed an issue where the `allow_duplicate_names` field on the pipeline definition was ignored by the bundle ([databricks#3274](databricks#3274))
* Add a warning when required bundle fields are not set ([databricks#3044](databricks#3044))

## Release v0.260.0

### Notable Changes
* Added support for creating SQL warehouses in DABs ([databricks#3129](databricks#3129))
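
A minimal sketch of what a SQL warehouse resource might look like in `databricks.yml`; the `sql_warehouses` resource key and the field names below are assumptions based on the SQL Warehouses API rather than anything stated in this note, so check the bundle schema for the exact shape:

```yaml
# Hypothetical databricks.yml fragment; the resource key and field names are
# assumptions based on the SQL Warehouses API, not taken from this release note.
resources:
  sql_warehouses:
    my_warehouse:
      name: my-dev-warehouse
      cluster_size: 2X-Small
      max_num_clusters: 1
      auto_stop_mins: 10
```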

### Dependency updates
* Upgrade Go SDK to 0.75.0 ([databricks#3256](databricks#3256))

### CLI
* Add `databricks psql` command to connect to Lakebase with a single command ([databricks#3128](databricks#3128))

### Bundles

* Jobs that use cluster policy default values for their cluster configuration now correctly update those defaults on every deployment ([databricks#3255](databricks#3255)).
* Add `upper` and `lower` helper methods for bundle templates ([databricks#3242](databricks#3242))
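
A hedged sketch of how the new helpers might be used inside a bundle template file (for example a `databricks.yml.tmpl` in a custom template, rendered before the YAML is parsed); the `upper`/`lower` names come from the note above, while `project_name` is just a placeholder input:

```yaml
# Hypothetical template snippet; assumes `project_name` is an input defined in
# the template schema and that the helpers are exposed as template functions.
bundle:
  name: {{ lower .project_name }}

targets:
  prod:
    workspace:
      root_path: /Workspace/Shared/{{ upper .project_name }}
```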

## Release v0.259.0

### Notable Changes
* Add support for arbitrary scripts in DABs. Users can now define scripts in their bundle configuration; these scripts automatically inherit the bundle's credentials for authentication and can be invoked with the `bundle run` command (a configuration sketch follows this list). ([databricks#2813](databricks#2813))
* Error when the absolute path to `databricks.yml` contains a glob character (`*`, `?`, `[`, `]`, or `^`). If the path to the `databricks.yml` file on your local filesystem contains one of these characters, glob patterns for the `includes` block could be computed incorrectly and cause resources to be deleted. After this patch, users are no longer at risk of unexpected deletions due to this issue. ([databricks#3096](databricks#3096))
* Diagnostic messages are no longer buffered and printed at the end of a command; they are now flushed after every mutator ([databricks#3175](databricks#3175))
* Diagnostics are now always rendered with forward slashes in file paths, even on Windows ([databricks#3175](databricks#3175))
* "bundle summary" now prints diagnostics to stderr instead of stdout in text output mode ([databricks#3175](databricks#3175))
* "bundle summary" no longer prints recommendations, it will only prints warnings and errors ([databricks#3175](databricks#3175))

### Bundles
* Fix default search location for whl artifacts ([databricks#3184](databricks#3184)). This was a regression introduced in 0.255.0.
* The job tasks are now sorted by task key in "bundle validate/summary" output ([databricks#3212](databricks#3212))

## Release v0.258.0

### Notable Changes
* Switch the default-python template to use `pyproject.toml` + `dynamic_version` in the dev target. uv is now required. ([databricks#3042](databricks#3042))

### Dependency updates
* Upgraded TF provider to 1.84.0 ([databricks#3151](databricks#3151))

### CLI
* Fixed an issue where running `databricks auth login` would remove the `cluster_id` field from profiles in `.databrickscfg`. The login process now preserves the `cluster_id` field. ([databricks#2988](databricks#2988))

### Bundles
* Added support for the pipeline `environment` field (see the sketch after this list) ([databricks#3153](databricks#3153))
* "bundle summary" now prints diagnostic warnings to stderr ([databricks#3123](databricks#3123))
* "bundle open" will print the URL before opening the browser ([databricks#3168](databricks#3168))

## Release v0.257.0

### Bundles
* Improve the error message for a host mismatch between the bundle and the profile used ([databricks#3100](databricks#3100))
* Remove support for the deprecated `experimental/pydabs` config; use `experimental/python` instead (a migration sketch follows this list). See [Configuration in Python](https://docs.databricks.com/dev-tools/bundles/python). ([databricks#3102](databricks#3102))
* Pass through OIDC token env variable to Terraform ([databricks#3113](databricks#3113))
* The `default-python` template now prompts whether you want to use serverless compute (defaults to `yes`) ([databricks#3051](databricks#3051)).
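
For the `experimental/pydabs` removal above, a hedged migration sketch of the replacement block; the `venv_path`, `resources`, and `mutators` keys and the `module:function` strings are assumptions based on the linked docs, so confirm them there:

```yaml
# Hypothetical sketch of the replacement `experimental.python` block.
# Key names and the "module:function" references below are assumptions.
experimental:
  python:
    venv_path: .venv
    resources:
      - "resources:load_resources"
    mutators:
      - "mutators:add_default_policy"
```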

### API Changes
* Removed `databricks custom-llms` command group.
* Added `databricks ai-builder` command group.
* Added `databricks feature-store` command group.

## Release v0.256.0

### Bundles
* When building Python artifacts as part of "bundle deploy" we no longer delete `dist`, `build`, `*egg-info` and `__pycache__` directories ([databricks#2982](databricks#2982))
* When a glob for wheels is used, such as `*.whl`, it filters out older versions of the same package and keeps only the most recent version ([databricks#2982](databricks#2982))
* Add preset `presets.artifacts_dynamic_version` that automatically enables `dynamic_version: true` on all "whl" artifacts (see the sketch after this list) ([databricks#3074](databricks#3074))
* Update client version to "2" for the serverless variation of the default-python template ([databricks#3083](databricks#3083))
* Fix reading dashboard contents when the sync root is different from the bundle root ([databricks#3006](databricks#3006))
* Fix variable resolution for lookup variables with other references ([databricks#3054](databricks#3054))
* Allow users to override the Terraform version to use by setting the `DATABRICKS_TF_VERSION` environment variable ([databricks#3069](databricks#3069))
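
To illustrate `presets.artifacts_dynamic_version` above, a sketch assuming it is a boolean preset that has the same effect as setting `dynamic_version: true` on each wheel artifact; the artifact name and path are placeholders:

```yaml
# Hypothetical databricks.yml fragment: enable dynamic versions for all whl artifacts.
presets:
  artifacts_dynamic_version: true

# Roughly equivalent to setting the flag on each wheel artifact individually:
artifacts:
  my_wheel:
    type: whl
    path: .
    dynamic_version: true
```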

## Release v0.255.0

### Notable Changes

* Fix `databricks auth login` to tolerate URLs copied from the browser ([databricks#3001](databricks#3001)).

### CLI
* Use an OS-aware runner instead of bash for the `run-local` command ([databricks#2996](databricks#2996))

### Bundles
* Fix "bundle summary -o json" to render null values properly ([databricks#2990](databricks#2990))
* Fix dashboard generation for an already imported dashboard ([databricks#3016](databricks#3016))
* Fixed a null pointer dereference when artifacts are missing fields ([databricks#3022](databricks#3022))
* Update bundle templates to also include `resources/*/*.yml` (see the sketch after this list) ([databricks#3024](databricks#3024))
* Apply YAML formatter on default-python and dbt-sql templates ([databricks#3026](databricks#3026))
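
A short sketch of the `include` block the updated templates are described as generating; the glob patterns are relative to the bundle root (where `databricks.yml` lives):

```yaml
# databricks.yml: pick up resource definitions nested one and two levels deep.
include:
  - resources/*.yml
  - resources/*/*.yml
```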

## Release v0.254.0

### Bundles
* Added the `experimental.skip_artifact_cleanup` flag (see the sketch after this list) ([databricks#2980](databricks#2980))
* Add an experimental project template for Lakeflow Declarative Pipelines ([databricks#2959](databricks#2959))
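
A minimal sketch of the `experimental.skip_artifact_cleanup` flag named above, assuming it is a boolean under the `experimental` block as its dotted path suggests:

```yaml
# Hypothetical databricks.yml fragment; assumes a boolean flag under `experimental`.
experimental:
  skip_artifact_cleanup: true
```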

## Release v0.253.0

### Dependency updates
* Upgrade SDK to v0.70.0 ([databricks#2920](databricks#2920))
* Upgrade TF provider to v1.81.0 ([databricks#2936](databricks#2936))