Fix highlighting issue due to multi-segment imports in WorkspaceClient/AccountClient #979

Merged: 2 commits, May 27, 2025
6 changes: 3 additions & 3 deletions .codegen.json
@@ -5,9 +5,9 @@
"databricks/sdk/version.py": "__version__ = \"$VERSION\""
},
"toolchain": {
"required": ["python3"],
"required": ["python3.12"],
"pre_setup": [
"python3 -m venv .databricks"
"python3.12 -m venv .databricks"
],
"prepend_path": ".databricks/bin",
"setup": [
@@ -17,7 +17,7 @@
"make fmt",
"pytest -m 'not integration' --cov=databricks --cov-report html tests",
"pip install .",
"python docs/gen-client-docs.py"
"python3.12 docs/gen-client-docs.py"
]
}
}
3 changes: 3 additions & 0 deletions NEXT_CHANGELOG.md
@@ -11,6 +11,9 @@

### Bug Fixes

- Fix a reported highlighting problem with the way API clients are imported in WorkspaceClient/AccountClient
([#979](https://github.com/databricks/databricks-sdk-py/pull/979)).

### Documentation

### Internal Changes
535 changes: 275 additions & 260 deletions databricks/sdk/__init__.py

Large diffs are not rendered by default.

15 changes: 15 additions & 0 deletions docs/account/iam/access_control.rst
@@ -18,6 +18,11 @@
:param resource: str
The resource name for which assignable roles will be listed.

Examples | Summary
:--- | :---
`resource=accounts/<ACCOUNT_ID>` | A resource name for the account.
`resource=accounts/<ACCOUNT_ID>/groups/<GROUP_ID>` | A resource name for the group.
`resource=accounts/<ACCOUNT_ID>/servicePrincipals/<SP_ID>` | A resource name for the service principal.

:returns: :class:`GetAssignableRolesForResourceResponse`


@@ -30,6 +35,12 @@

:param name: str
The ruleset name associated with the request.

Examples | Summary
:--- | :---
`name=accounts/<ACCOUNT_ID>/ruleSets/default` | A name for a rule set on the account.
`name=accounts/<ACCOUNT_ID>/groups/<GROUP_ID>/ruleSets/default` | A name for a rule set on the group.
`name=accounts/<ACCOUNT_ID>/servicePrincipals/<SERVICE_PRINCIPAL_APPLICATION_ID>/ruleSets/default` | A name for a rule set on the service principal.
:param etag: str
Etag used for versioning. The response is at least as fresh as the eTag provided. Etag is used for
optimistic concurrency control as a way to help prevent simultaneous updates of a rule set from
@@ -38,6 +49,10 @@
etag from a GET rule set request, and pass it with the PUT update request to identify the rule set
version you are updating.

Examples | Summary
:--- | :---
`etag=` | An empty etag can only be used in GET to indicate no freshness requirements.
`etag=RENUAAABhSweA4NvVmmUYdiU717H3Tgy0UJdor3gE4a+mq/oj9NjAf8ZsQ==` | An etag encoding a specific version of the rule set to get or to be updated.

:returns: :class:`RuleSetResponse`


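For readers unfamiliar with the pattern, here is a minimal sketch of the read-then-write etag flow described above. The rule set name format follows the table above; the `DATABRICKS_ACCOUNT_ID` environment variable is an assumption used only for illustration.

.. code-block::

    import os

    from databricks.sdk import AccountClient

    a = AccountClient()
    account_id = os.environ["DATABRICKS_ACCOUNT_ID"]

    # An empty etag in a GET means "no freshness requirement".
    rule_set = a.access_control.get_rule_set(
        name=f"accounts/{account_id}/ruleSets/default",
        etag="",
    )

    # The returned etag identifies the version just read; pass it back with the
    # subsequent PUT/update request so concurrent writers are detected instead
    # of silently overwriting each other.
    print(rule_set.etag)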
8 changes: 4 additions & 4 deletions docs/account/iam/workspace_assignment.rst
@@ -47,9 +47,9 @@

a = AccountClient()

workspace_id = os.environ["TEST_WORKSPACE_ID"]
workspace_id = os.environ["DUMMY_WORKSPACE_ID"]

all = a.workspace_assignment.list(list=workspace_id)
all = a.workspace_assignment.list(workspace_id=workspace_id)

Get permission assignments.

@@ -80,9 +80,9 @@

spn_id = spn.id

workspace_id = os.environ["TEST_WORKSPACE_ID"]
workspace_id = os.environ["DUMMY_WORKSPACE_ID"]

a.workspace_assignment.update(
_ = a.workspace_assignment.update(
workspace_id=workspace_id,
principal_id=spn_id,
permissions=[iam.WorkspacePermission.USER],
6 changes: 5 additions & 1 deletion docs/account/provisioning/storage.rst
@@ -16,6 +16,7 @@

.. code-block::

import os
import time

from databricks.sdk import AccountClient
Expand All @@ -25,8 +26,11 @@

storage = a.storage.create(
storage_configuration_name=f"sdk-{time.time_ns()}",
root_bucket_info=provisioning.RootBucketInfo(bucket_name=f"sdk-{time.time_ns()}"),
root_bucket_info=provisioning.RootBucketInfo(bucket_name=os.environ["TEST_ROOT_BUCKET"]),
)

# cleanup
a.storage.delete(storage_configuration_id=storage.storage_configuration_id)

Create new storage configuration.

6 changes: 5 additions & 1 deletion docs/account/settings/index.rst
@@ -9,9 +9,13 @@ Manage security settings for Accounts and Workspaces

ip_access_lists
network_connectivity
network_policies
settings
csp_enablement_account
disable_legacy_features
enable_ip_access_lists
esm_enablement_account
personal_compute
llm_proxy_partner_powered_account
llm_proxy_partner_powered_enforce
personal_compute
workspace_network_configuration
46 changes: 46 additions & 0 deletions docs/account/settings/llm_proxy_partner_powered_account.rst
@@ -0,0 +1,46 @@
``a.settings.llm_proxy_partner_powered_account``: Enable Partner Powered AI Features for Account
================================================================================================
.. currentmodule:: databricks.sdk.service.settings

.. py:class:: LlmProxyPartnerPoweredAccountAPI

Determines if partner powered models are enabled or not for a specific account

.. py:method:: get( [, etag: Optional[str]]) -> LlmProxyPartnerPoweredAccount

Get the enable partner powered AI features account setting.

Gets the enable partner powered AI features account setting.

:param etag: str (optional)
etag used for versioning. The response is at least as fresh as the eTag provided. This is used for
optimistic concurrency control as a way to help prevent simultaneous writes of a setting overwriting
each other. It is strongly suggested that systems make use of the etag in the read -> delete pattern
to perform setting deletions in order to avoid race conditions. That is, get an etag from a GET
request, and pass it with the DELETE request to identify the rule set version you are deleting.

:returns: :class:`LlmProxyPartnerPoweredAccount`


.. py:method:: update(allow_missing: bool, setting: LlmProxyPartnerPoweredAccount, field_mask: str) -> LlmProxyPartnerPoweredAccount

Update the enable partner powered AI features account setting.

Updates the enable partner powered AI features account setting.

:param allow_missing: bool
This should always be set to true for Settings API. Added for AIP compliance.
:param setting: :class:`LlmProxyPartnerPoweredAccount`
:param field_mask: str
The field mask must be a single string, with multiple fields separated by commas (no spaces). The
field path is relative to the resource object, using a dot (`.`) to navigate sub-fields (e.g.,
`author.given_name`). Specification of elements in sequence or map fields is not allowed, as only
the entire collection field can be specified. Field names must exactly match the resource field
names.

A field mask of `*` indicates full replacement. It’s recommended to always explicitly list the
fields being updated and avoid using `*` wildcards, as it can lead to unintended results if the API
changes in the future.

:returns: :class:`LlmProxyPartnerPoweredAccount`

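A minimal usage sketch for this new setting, following the get/update signatures above. The payload field names on `LlmProxyPartnerPoweredAccount` (here `boolean_val` wrapping a `BooleanMessage`) are assumptions, since the dataclass itself is not shown in this diff.

.. code-block::

    from databricks.sdk import AccountClient
    from databricks.sdk.service import settings

    a = AccountClient()

    # Read the current account-level setting.
    current = a.settings.llm_proxy_partner_powered_account.get()

    # Update it. allow_missing=True and an explicit field mask are required by
    # the Settings API; the field name used here is an assumption.
    updated = a.settings.llm_proxy_partner_powered_account.update(
        allow_missing=True,
        setting=settings.LlmProxyPartnerPoweredAccount(
            boolean_val=settings.BooleanMessage(value=True),  # assumed field name
        ),
        field_mask="boolean_val",
    )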
47 changes: 47 additions & 0 deletions docs/account/settings/llm_proxy_partner_powered_enforce.rst
@@ -0,0 +1,47 @@
``a.settings.llm_proxy_partner_powered_enforce``: Enable Enforcement of Partner Powered AI Features
===================================================================================================
.. currentmodule:: databricks.sdk.service.settings

.. py:class:: LlmProxyPartnerPoweredEnforceAPI

Determines if the account-level partner-powered setting value is enforced upon the workspace-level
partner-powered setting

.. py:method:: get( [, etag: Optional[str]]) -> LlmProxyPartnerPoweredEnforce

Get the enforcement status of partner powered AI features account setting.

Gets the enforcement status of partner powered AI features account setting.

:param etag: str (optional)
etag used for versioning. The response is at least as fresh as the eTag provided. This is used for
optimistic concurrency control as a way to help prevent simultaneous writes of a setting overwriting
each other. It is strongly suggested that systems make use of the etag in the read -> delete pattern
to perform setting deletions in order to avoid race conditions. That is, get an etag from a GET
request, and pass it with the DELETE request to identify the rule set version you are deleting.

:returns: :class:`LlmProxyPartnerPoweredEnforce`


.. py:method:: update(allow_missing: bool, setting: LlmProxyPartnerPoweredEnforce, field_mask: str) -> LlmProxyPartnerPoweredEnforce

Update the enforcement status of partner powered AI features account setting.

Updates the enforcement status of the partner powered AI features account setting.

:param allow_missing: bool
This should always be set to true for Settings API. Added for AIP compliance.
:param setting: :class:`LlmProxyPartnerPoweredEnforce`
:param field_mask: str
The field mask must be a single string, with multiple fields separated by commas (no spaces). The
field path is relative to the resource object, using a dot (`.`) to navigate sub-fields (e.g.,
`author.given_name`). Specification of elements in sequence or map fields is not allowed, as only
the entire collection field can be specified. Field names must exactly match the resource field
names.

A field mask of `*` indicates full replacement. It’s recommended to always explicitly list the
fields being updated and avoid using `*` wildcards, as it can lead to unintended results if the API
changes in the future.

:returns: :class:`LlmProxyPartnerPoweredEnforce`

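The enforcement setting follows the same pattern; a read-only sketch, assuming the same client setup as above:

.. code-block::

    from databricks.sdk import AccountClient

    a = AccountClient()

    # Check whether the account-level partner-powered value is enforced on the
    # workspace-level setting.
    enforce = a.settings.llm_proxy_partner_powered_enforce.get()
    print(enforce)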
73 changes: 73 additions & 0 deletions docs/account/settings/network_policies.rst
@@ -0,0 +1,73 @@
``a.network_policies``: Network Policies
========================================
.. currentmodule:: databricks.sdk.service.settings

.. py:class:: NetworkPoliciesAPI

These APIs manage network policies for this account. Network policies control which network destinations
can be accessed from the Databricks environment. Each Databricks account includes a default policy named
'default-policy'. 'default-policy' is associated with any workspace lacking an explicit network policy
assignment, and is automatically associated with each newly created workspace. 'default-policy' is
reserved and cannot be deleted, but it can be updated to customize the default network access rules for
your account.

.. py:method:: create_network_policy_rpc(network_policy: AccountNetworkPolicy) -> AccountNetworkPolicy

Create a network policy.

Creates a new network policy to manage which network destinations can be accessed from the Databricks
environment.

:param network_policy: :class:`AccountNetworkPolicy`

:returns: :class:`AccountNetworkPolicy`


.. py:method:: delete_network_policy_rpc(network_policy_id: str)

Delete a network policy.

Deletes a network policy. Cannot be called on 'default-policy'.

:param network_policy_id: str
The unique identifier of the network policy to delete.




.. py:method:: get_network_policy_rpc(network_policy_id: str) -> AccountNetworkPolicy

Get a network policy.

Gets a network policy.

:param network_policy_id: str
The unique identifier of the network policy to retrieve.

:returns: :class:`AccountNetworkPolicy`


.. py:method:: list_network_policies_rpc( [, page_token: Optional[str]]) -> Iterator[AccountNetworkPolicy]

List network policies.

Gets an array of network policies.

:param page_token: str (optional)
Pagination token to go to next page based on previous query.

:returns: Iterator over :class:`AccountNetworkPolicy`


.. py:method:: update_network_policy_rpc(network_policy_id: str, network_policy: AccountNetworkPolicy) -> AccountNetworkPolicy

Update a network policy.

Updates a network policy. This allows you to modify the configuration of a network policy.

:param network_policy_id: str
The unique identifier for the network policy.
:param network_policy: :class:`AccountNetworkPolicy`

:returns: :class:`AccountNetworkPolicy`

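A sketch of the account-level network policy workflow using the methods above. The constructor arguments of `AccountNetworkPolicy` are not part of this diff, so an empty policy object is used purely for illustration.

.. code-block::

    from databricks.sdk import AccountClient
    from databricks.sdk.service import settings

    a = AccountClient()

    # Every account has at least the reserved 'default-policy'.
    for policy in a.network_policies.list_network_policies_rpc():
        print(policy.network_policy_id)

    # Create a policy; the payload fields on AccountNetworkPolicy are assumed
    # here -- see the dataclass reference for the actual schema.
    created = a.network_policies.create_network_policy_rpc(
        network_policy=settings.AccountNetworkPolicy(),
    )

    # Clean up the example policy ('default-policy' itself cannot be deleted).
    a.network_policies.delete_network_policy_rpc(
        network_policy_id=created.network_policy_id,
    )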
11 changes: 11 additions & 0 deletions docs/account/settings/settings.rst
@@ -38,6 +38,17 @@
new workspaces. By default, this account-level setting is disabled for new workspaces. After workspace
creation, account admins can enable enhanced security monitoring individually for each workspace.

.. py:property:: llm_proxy_partner_powered_account
:type: LlmProxyPartnerPoweredAccountAPI

Determines if partner powered models are enabled or not for a specific account

.. py:property:: llm_proxy_partner_powered_enforce
:type: LlmProxyPartnerPoweredEnforceAPI

Determines if the account-level partner-powered setting value is enforced upon the workspace-level
partner-powered setting

.. py:property:: personal_compute
:type: PersonalComputeAPI

39 changes: 39 additions & 0 deletions docs/account/settings/workspace_network_configuration.rst
@@ -0,0 +1,39 @@
``a.workspace_network_configuration``: Workspace Network Configuration
======================================================================
.. currentmodule:: databricks.sdk.service.settings

.. py:class:: WorkspaceNetworkConfigurationAPI

These APIs allow configuration of network settings for Databricks workspaces. Each workspace is always
associated with exactly one network policy that controls which network destinations can be accessed from
the Databricks environment. By default, workspaces are associated with the 'default-policy' network
policy. You cannot create or delete a workspace's network configuration, only update it to associate the
workspace with a different policy.

.. py:method:: get_workspace_network_option_rpc(workspace_id: int) -> WorkspaceNetworkOption

Get workspace network configuration.

Gets the network configuration for a workspace. Every workspace has exactly one network policy
binding, with 'default-policy' used if no explicit assignment exists.

:param workspace_id: int
The workspace ID.

:returns: :class:`WorkspaceNetworkOption`


.. py:method:: update_workspace_network_option_rpc(workspace_id: int, workspace_network_option: WorkspaceNetworkOption) -> WorkspaceNetworkOption

Update workspace network configuration.

Updates the network configuration for a workspace. This operation associates the workspace with the
specified network policy. To revert to the default policy, specify 'default-policy' as the
network_policy_id.

:param workspace_id: int
The workspace ID.
:param workspace_network_option: :class:`WorkspaceNetworkOption`

:returns: :class:`WorkspaceNetworkOption`

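A sketch that ties the new workspace-level API to the account-level policies above, assuming `WorkspaceNetworkOption` exposes `workspace_id` and `network_policy_id` fields (not shown in this diff) and reusing the `DUMMY_WORKSPACE_ID` environment variable from the earlier example.

.. code-block::

    import os

    from databricks.sdk import AccountClient
    from databricks.sdk.service import settings

    a = AccountClient()
    workspace_id = int(os.environ["DUMMY_WORKSPACE_ID"])

    # Every workspace has exactly one binding; 'default-policy' applies when no
    # explicit assignment exists.
    option = a.workspace_network_configuration.get_workspace_network_option_rpc(
        workspace_id=workspace_id,
    )
    print(option)

    # Re-associate the workspace with the default policy. Field names on
    # WorkspaceNetworkOption are assumptions here.
    a.workspace_network_configuration.update_workspace_network_option_rpc(
        workspace_id=workspace_id,
        workspace_network_option=settings.WorkspaceNetworkOption(
            workspace_id=workspace_id,
            network_policy_id="default-policy",
        ),
    )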
17 changes: 17 additions & 0 deletions docs/dbdataclasses/apps.rst
@@ -145,6 +145,23 @@ These dataclasses are used in the SDK to represent API requests and responses for
.. py:attribute:: IS_OWNER
:value: "IS_OWNER"

.. autoclass:: AppResourceUcSecurable
:members:
:undoc-members:

.. py:class:: AppResourceUcSecurableUcSecurablePermission

.. py:attribute:: READ_VOLUME
:value: "READ_VOLUME"

.. py:attribute:: WRITE_VOLUME
:value: "WRITE_VOLUME"

.. py:class:: AppResourceUcSecurableUcSecurableType

.. py:attribute:: VOLUME
:value: "VOLUME"

.. py:class:: ApplicationState

.. py:attribute:: CRASHED
Expand Down