From 1f7e308553666baa582b269e6d2fe62a4717d874 Mon Sep 17 00:00:00 2001
From: Hector Castejon Diaz
Date: Tue, 25 Jun 2024 12:43:10 +0200
Subject: [PATCH] Release v0.27.0

### Improvements

* Support partners in headers for SDK ([#291](https://github.com/databricks/databricks-sdk-java/pull/291)).
* Add `serverless_compute_id` field to the config ([#299](https://github.com/databricks/databricks-sdk-java/pull/299)).

### Internal Changes

* Ignore DataPlane Services during generation ([#296](https://github.com/databricks/databricks-sdk-java/pull/296)).
* Update OpenAPI spec ([#297](https://github.com/databricks/databricks-sdk-java/pull/297)).
* Retry failed integration tests ([#298](https://github.com/databricks/databricks-sdk-java/pull/298)).

### API Changes:

* Changed `list()` method for `accountClient.storageCredentials()` service to return `com.databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` class.
* Changed `isolationMode` field for `com.databricks.sdk.service.catalog.CatalogInfo` to `com.databricks.sdk.service.catalog.CatalogIsolationMode` class.
* Added `isolationMode` field for `com.databricks.sdk.service.catalog.ExternalLocationInfo`.
* Added `maxResults` and `pageToken` fields for `com.databricks.sdk.service.catalog.ListCatalogsRequest`.
* Added `nextPageToken` field for `com.databricks.sdk.service.catalog.ListCatalogsResponse`.
* Added `tableServingUrl` field for `com.databricks.sdk.service.catalog.OnlineTable`.
* Added `isolationMode` field for `com.databricks.sdk.service.catalog.StorageCredentialInfo`.
* Changed `isolationMode` field for `com.databricks.sdk.service.catalog.UpdateCatalog` to `com.databricks.sdk.service.catalog.CatalogIsolationMode` class.
* Added `isolationMode` field for `com.databricks.sdk.service.catalog.UpdateExternalLocation`.
* Added `isolationMode` field for `com.databricks.sdk.service.catalog.UpdateStorageCredential`.
* Added `com.databricks.sdk.service.catalog.CatalogIsolationMode` and `com.databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` classes.
* Added `createSchedule()`, `createSubscription()`, `deleteSchedule()`, `deleteSubscription()`, `getSchedule()`, `getSubscription()`, `list()`, `listSchedules()`, `listSubscriptions()` and `updateSchedule()` methods for `workspaceClient.lakeview()` service.
* Added `com.databricks.sdk.service.dashboards.CreateScheduleRequest`, `com.databricks.sdk.service.dashboards.CreateSubscriptionRequest`, `com.databricks.sdk.service.dashboards.CronSchedule`, `com.databricks.sdk.service.dashboards.DashboardView`, `com.databricks.sdk.service.dashboards.DeleteScheduleRequest`, `com.databricks.sdk.service.dashboards.DeleteSubscriptionRequest`, `com.databricks.sdk.service.dashboards.GetScheduleRequest`, `com.databricks.sdk.service.dashboards.GetSubscriptionRequest`, `com.databricks.sdk.service.dashboards.ListDashboardsRequest`, `com.databricks.sdk.service.dashboards.ListDashboardsResponse`, `com.databricks.sdk.service.dashboards.ListSchedulesRequest`, `com.databricks.sdk.service.dashboards.ListSchedulesResponse`, `com.databricks.sdk.service.dashboards.ListSubscriptionsRequest`, `com.databricks.sdk.service.dashboards.ListSubscriptionsResponse`, `com.databricks.sdk.service.dashboards.Schedule`, `com.databricks.sdk.service.dashboards.SchedulePauseStatus`, `com.databricks.sdk.service.dashboards.Subscriber`, `com.databricks.sdk.service.dashboards.Subscription`, `com.databricks.sdk.service.dashboards.SubscriptionSubscriberDestination`, `com.databricks.sdk.service.dashboards.SubscriptionSubscriberUser` and `com.databricks.sdk.service.dashboards.UpdateScheduleRequest` classes.
* Added `terminationCategory` field for `com.databricks.sdk.service.jobs.ForEachTaskErrorMessageStats`.
* Added `onStreamingBacklogExceeded` field for `com.databricks.sdk.service.jobs.JobEmailNotifications`.
* Added `environmentKey` field for `com.databricks.sdk.service.jobs.RunTask`.
* Removed `conditionTask`, `dbtTask`, `notebookTask`, `pipelineTask`, `pythonWheelTask`, `runJobTask`, `sparkJarTask`, `sparkPythonTask`, `sparkSubmitTask` and `sqlTask` fields for `com.databricks.sdk.service.jobs.SubmitRun`.
* Added `environments` field for `com.databricks.sdk.service.jobs.SubmitRun`.
* Added `dbtTask` field for `com.databricks.sdk.service.jobs.SubmitTask`.
* Added `environmentKey` field for `com.databricks.sdk.service.jobs.SubmitTask`.
* Added `onStreamingBacklogExceeded` field for `com.databricks.sdk.service.jobs.TaskEmailNotifications`.
* Added `periodic` field for `com.databricks.sdk.service.jobs.TriggerSettings`.
* Added `onStreamingBacklogExceeded` field for `com.databricks.sdk.service.jobs.WebhookNotifications`.
* Added `com.databricks.sdk.service.jobs.PeriodicTriggerConfiguration` and `com.databricks.sdk.service.jobs.PeriodicTriggerConfigurationTimeUnit` classes.
* Added `batchGet()` method for `workspaceClient.consumerListings()` service.
* Added `batchGet()` method for `workspaceClient.consumerProviders()` service.
* Added `providerSummary` field for `com.databricks.sdk.service.marketplace.Listing`.
* Added `com.databricks.sdk.service.marketplace.BatchGetListingsRequest`, `com.databricks.sdk.service.marketplace.BatchGetListingsResponse`, `com.databricks.sdk.service.marketplace.BatchGetProvidersRequest`, `com.databricks.sdk.service.marketplace.BatchGetProvidersResponse`, `com.databricks.sdk.service.marketplace.ProviderIconFile`, `com.databricks.sdk.service.marketplace.ProviderIconType`, `com.databricks.sdk.service.marketplace.ProviderListingSummaryInfo` and `com.databricks.sdk.service.oauth2.DataPlaneInfo` classes.
* Removed `createDeployment()` method for `workspaceClient.apps()` service.
* Added `deploy()` and `start()` methods for `workspaceClient.apps()` service.
* Added `workspaceClient.servingEndpointsDataPlane()` service.
* Added `servicePrincipalId` field for `com.databricks.sdk.service.serving.App`.
* Added `servicePrincipalName` field for `com.databricks.sdk.service.serving.App`.
* Added `mode` field for `com.databricks.sdk.service.serving.AppDeployment`.
* Added `mode` field for `com.databricks.sdk.service.serving.CreateAppDeploymentRequest`.
* Added `dataPlaneInfo` field for `com.databricks.sdk.service.serving.ServingEndpointDetailed`.
* Added `com.databricks.sdk.service.serving.AppDeploymentMode` class.
* Added `com.databricks.sdk.service.serving.ModelDataPlaneInfo` class.
* Added `com.databricks.sdk.service.serving.StartAppRequest` class.
* Added `queryNextPage()` method for `workspaceClient.vectorSearchIndexes()` service.
* Added `queryType` field for `com.databricks.sdk.service.vectorsearch.QueryVectorIndexRequest`.
* Added `nextPageToken` field for `com.databricks.sdk.service.vectorsearch.QueryVectorIndexResponse`.
* Added `com.databricks.sdk.service.vectorsearch.QueryVectorIndexNextPageRequest` class.
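The `maxResults`/`pageToken` request fields and the `nextPageToken` response field added above follow the usual token-based pagination contract: request a page, then resubmit with the returned token until it is empty. A minimal caller-side sketch of that loop, using stub types rather than the generated `ListCatalogsRequest`/`ListCatalogsResponse` classes (the catalog names and the offset-as-token encoding here are purely illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class PaginationSketch {
  // Stub "response": one page of items plus an opaque token for the next page,
  // mirroring the nextPageToken field on ListCatalogsResponse.
  static class Page {
    final List<String> items;
    final String nextPageToken; // null when there are no further pages
    Page(List<String> items, String nextPageToken) {
      this.items = items;
      this.nextPageToken = nextPageToken;
    }
  }

  // Stub "service": serves a fixed list in pages of at most maxResults items,
  // encoding the numeric offset as the page token (an implementation detail a
  // real caller must treat as opaque).
  static Page listCatalogs(long maxResults, String pageToken) {
    List<String> all = Arrays.asList("main", "samples", "sandbox", "hive_metastore");
    int start = pageToken == null ? 0 : Integer.parseInt(pageToken);
    int end = (int) Math.min(start + maxResults, all.size());
    String next = end < all.size() ? Integer.toString(end) : null;
    return new Page(new ArrayList<>(all.subList(start, end)), next);
  }

  // Caller-side loop: the same shape a consumer of the new fields would use.
  public static List<String> listAll(long maxResults) {
    List<String> out = new ArrayList<>();
    String token = null;
    do {
      Page page = listCatalogs(maxResults, token);
      out.addAll(page.items);
      token = page.nextPageToken;
    } while (token != null);
    return out;
  }

  public static void main(String[] args) {
    System.out.println(listAll(3)); // prints [main, samples, sandbox, hive_metastore]
  }
}
```

A real caller would presumably populate the request via the generated setters (e.g. `setMaxResults`/`setPageToken`) and read the token back from each response; only the stub types change, the loop shape stays the same.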
OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date: 2024-06-24 --- .codegen/_openapi_sha | 2 +- .gitattributes | 42 ++++ CHANGELOG.md | 63 ++++++ databricks-sdk-java/pom.xml | 2 +- .../com/databricks/sdk/WorkspaceClient.java | 20 ++ .../com/databricks/sdk/core/UserAgent.java | 2 +- .../sdk/service/catalog/CatalogInfo.java | 6 +- .../catalog/CatalogInfoSecurableKind.java | 2 - .../service/catalog/CatalogIsolationMode.java | 14 ++ .../sdk/service/catalog/CatalogsAPI.java | 11 +- .../sdk/service/catalog/CreateMetastore.java | 3 +- .../catalog/CreateStorageCredential.java | 2 +- .../service/catalog/ExternalLocationInfo.java | 19 ++ .../sdk/service/catalog/FunctionsAPI.java | 2 + .../sdk/service/catalog/FunctionsService.java | 2 + .../sdk/service/catalog/IsolationMode.java | 4 +- .../service/catalog/ListCatalogsRequest.java | 46 +++- .../service/catalog/ListCatalogsResponse.java | 26 ++- .../sdk/service/catalog/OnlineTable.java | 19 +- .../sdk/service/catalog/Privilege.java | 1 - .../catalog/StorageCredentialInfo.java | 21 +- .../sdk/service/catalog/UpdateCatalog.java | 6 +- .../catalog/UpdateExternalLocation.java | 19 ++ .../catalog/UpdateStorageCredential.java | 21 +- .../sdk/service/compute/ClusterDetails.java | 2 +- .../sdk/service/compute/Environment.java | 3 +- .../sdk/service/compute/Policy.java | 4 +- .../dashboards/CreateScheduleRequest.java | 88 ++++++++ .../dashboards/CreateSubscriptionRequest.java | 72 +++++++ .../sdk/service/dashboards/CronSchedule.java | 70 ++++++ .../sdk/service/dashboards/DashboardView.java | 11 + .../dashboards/DeleteScheduleRequest.java | 76 +++++++ .../dashboards/DeleteScheduleResponse.java | 28 +++ .../dashboards/DeleteSubscriptionRequest.java | 90 ++++++++ .../DeleteSubscriptionResponse.java | 28 +++ .../dashboards/GetScheduleRequest.java | 57 +++++ .../dashboards/GetSubscriptionRequest.java | 71 ++++++ .../sdk/service/dashboards/LakeviewAPI.java | 139 ++++++++++++ .../sdk/service/dashboards/LakeviewImpl.java | 106 
+++++++++ .../service/dashboards/LakeviewService.java | 30 +++ .../dashboards/ListDashboardsRequest.java | 100 +++++++++ .../dashboards/ListDashboardsResponse.java | 63 ++++++ .../dashboards/ListSchedulesRequest.java | 77 +++++++ .../dashboards/ListSchedulesResponse.java | 63 ++++++ .../dashboards/ListSubscriptionsRequest.java | 91 ++++++++ .../dashboards/ListSubscriptionsResponse.java | 63 ++++++ .../sdk/service/dashboards/Schedule.java | 161 ++++++++++++++ .../dashboards/SchedulePauseStatus.java | 11 + .../sdk/service/dashboards/Subscriber.java | 66 ++++++ .../sdk/service/dashboards/Subscription.java | 163 ++++++++++++++ .../SubscriptionSubscriberDestination.java | 44 ++++ .../SubscriptionSubscriberUser.java | 42 ++++ .../dashboards/UpdateScheduleRequest.java | 121 +++++++++++ .../service/jobs/JobEmailNotifications.java | 29 ++- .../sdk/service/jobs/JobEnvironment.java | 5 +- .../sdk/service/jobs/JobsHealthMetric.java | 27 ++- .../sdk/service/jobs/JobsHealthRule.java | 12 +- .../jobs/PeriodicTriggerConfiguration.java | 58 +++++ .../PeriodicTriggerConfigurationTimeUnit.java | 13 ++ .../databricks/sdk/service/jobs/RunTask.java | 19 ++ .../sdk/service/jobs/SubmitRun.java | 203 ++---------------- .../sdk/service/jobs/SubmitTask.java | 38 ++++ .../service/jobs/TaskEmailNotifications.java | 29 ++- .../sdk/service/jobs/TriggerSettings.java | 17 +- .../service/jobs/WebhookNotifications.java | 30 ++- .../sdk/service/marketplace/Listing.java | 20 +- .../service/marketplace/ProviderIconFile.java | 74 +++++++ .../service/marketplace/ProviderIconType.java | 12 ++ .../ProviderListingSummaryInfo.java | 79 +++++++ .../service/pipelines/PipelineLibrary.java | 2 +- .../databricks/sdk/service/serving/App.java | 32 +++ .../sdk/service/serving/AppsAPI.java | 13 ++ .../sdk/service/serving/AppsImpl.java | 9 + .../sdk/service/serving/AppsService.java | 7 + .../serving/AutoCaptureConfigInput.java | 11 +- .../serving/AutoCaptureConfigOutput.java | 2 +- 
.../serving/ServingEndpointsDataPlaneAPI.java | 41 ++++ .../ServingEndpointsDataPlaneImpl.java | 26 +++ .../ServingEndpointsDataPlaneService.java | 18 ++ .../sdk/service/serving/StartAppRequest.java | 40 ++++ .../service/settings/ComplianceStandard.java | 1 + .../sdk/service/sharing/Privilege.java | 1 - .../sdk/service/sql/AlertQuery.java | 2 +- .../databricks/sdk/service/sql/AlertsAPI.java | 32 ++- .../sdk/service/sql/AlertsService.java | 32 ++- .../sdk/service/sql/DashboardsAPI.java | 4 +- .../sdk/service/sql/DashboardsService.java | 4 +- .../sdk/service/sql/DataSource.java | 2 +- .../sdk/service/sql/DataSourcesAPI.java | 10 + .../sdk/service/sql/DataSourcesService.java | 10 + .../sdk/service/sql/DbsqlPermissionsAPI.java | 20 ++ .../service/sql/DbsqlPermissionsService.java | 20 ++ .../service/sql/ExecuteStatementRequest.java | 5 +- .../sdk/service/sql/QueriesAPI.java | 39 +++- .../sdk/service/sql/QueriesService.java | 39 +++- .../com/databricks/sdk/service/sql/Query.java | 2 +- .../sdk/service/sql/QueryEditContent.java | 2 +- .../sdk/service/sql/QueryPostContent.java | 2 +- .../sql/StatementParameterListItem.java | 6 +- .../QueryVectorIndexNextPageRequest.java | 74 +++++++ .../vectorsearch/QueryVectorIndexRequest.java | 24 ++- .../QueryVectorIndexResponse.java | 24 ++- .../vectorsearch/VectorSearchIndexesAPI.java | 14 ++ .../vectorsearch/VectorSearchIndexesImpl.java | 10 + .../VectorSearchIndexesService.java | 9 + examples/docs/pom.xml | 2 +- examples/spring-boot-oauth-u2m-demo/pom.xml | 2 +- pom.xml | 2 +- shaded/pom.xml | 2 +- 109 files changed, 3293 insertions(+), 264 deletions(-) create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogIsolationMode.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateScheduleRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateSubscriptionRequest.java create mode 100755 
databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CronSchedule.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DashboardView.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleResponse.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionResponse.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetScheduleRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetSubscriptionRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsResponse.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesResponse.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsResponse.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Schedule.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SchedulePauseStatus.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscriber.java create mode 100755 
databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscription.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberDestination.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberUser.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/UpdateScheduleRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfiguration.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfigurationTimeUnit.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconFile.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconType.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderListingSummaryInfo.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneAPI.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneImpl.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneService.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/StartAppRequest.java create mode 100755 databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexNextPageRequest.java diff --git a/.codegen/_openapi_sha b/.codegen/_openapi_sha index de0f45ab9..c4b47ca14 100644 --- a/.codegen/_openapi_sha +++ b/.codegen/_openapi_sha @@ -1 +1 @@ -37b925eba37dfb3d7e05b6ba2d458454ce62d3a0 \ No newline at end of file +7437dabb9dadee402c1fc060df4c1ce8cc5369f0 \ No newline at end of file diff --git a/.gitattributes 
b/.gitattributes index 81826cb80..185925286 100755 --- a/.gitattributes +++ b/.gitattributes @@ -92,6 +92,7 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CancelRefre databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CancelRefreshResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfo.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfoSecurableKind.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogIsolationMode.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogType.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogsAPI.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogsImpl.java linguist-generated=true @@ -216,6 +217,7 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/IsolationMo databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListAccountMetastoreAssignmentsRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListAccountMetastoreAssignmentsResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListAccountStorageCredentialsRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListAccountStorageCredentialsResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListConnectionsRequest.java 
linguist-generated=true @@ -591,21 +593,44 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/VolumesStor databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/WorkloadType.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/WorkspaceStorageInfo.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateDashboardRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateScheduleRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateSubscriptionRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CronSchedule.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Dashboard.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DashboardView.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleResponse.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetDashboardRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetPublishedDashboardRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetScheduleRequest.java linguist-generated=true 
+databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetSubscriptionRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewAPI.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewImpl.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewService.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LifecycleState.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsResponse.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesResponse.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/MigrateDashboardRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/PublishRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/PublishedDashboard.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Schedule.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SchedulePauseStatus.java linguist-generated=true 
+databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscriber.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscription.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberDestination.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberUser.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/TrashDashboardRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/TrashDashboardResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/UnpublishDashboardRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/UnpublishDashboardResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/UpdateDashboardRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/UpdateScheduleRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/files/AddBlock.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/files/AddBlockResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/files/Close.java linguist-generated=true @@ -836,6 +861,8 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/ListRunsRespon databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/NotebookOutput.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/NotebookTask.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PauseStatus.java linguist-generated=true 
+databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfiguration.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfigurationTimeUnit.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PipelineParams.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PipelineTask.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PythonWheelTask.java linguist-generated=true @@ -909,6 +936,10 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/WebhookNotific databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/AddExchangeForListingRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/AddExchangeForListingResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/AssetType.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/BatchGetListingsRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/BatchGetListingsResponse.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/BatchGetProvidersRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/BatchGetProvidersResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/Category.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ConsumerFulfillmentsAPI.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ConsumerFulfillmentsImpl.java linguist-generated=true @@ -1030,7 +1061,10 @@ 
databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/Provide databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderFilesAPI.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderFilesImpl.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderFilesService.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconFile.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconType.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderInfo.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderListingSummaryInfo.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderListingsAPI.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderListingsImpl.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderListingsService.java linguist-generated=true @@ -1263,6 +1297,7 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/oauth2/CreateServic databricks-sdk-java/src/main/java/com/databricks/sdk/service/oauth2/CustomAppIntegrationAPI.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/oauth2/CustomAppIntegrationImpl.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/oauth2/CustomAppIntegrationService.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/oauth2/DataPlaneInfo.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/oauth2/DeleteCustomAppIntegrationOutput.java 
linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/oauth2/DeleteCustomAppIntegrationRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/oauth2/DeletePublishedAppIntegrationOutput.java linguist-generated=true @@ -1455,6 +1490,7 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AnthropicCo databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/App.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppDeployment.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppDeploymentArtifacts.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppDeploymentMode.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppDeploymentState.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppDeploymentStatus.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppEnvironment.java linguist-generated=true @@ -1511,6 +1547,7 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ListAppsReq databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ListAppsResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ListEndpointsResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/LogsRequest.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ModelDataPlaneInfo.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/OpenAiConfig.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/PaLmConfig.java linguist-generated=true 
databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/PatchServingEndpointTags.java linguist-generated=true @@ -1546,8 +1583,12 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndp databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointPermissionsDescription.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointPermissionsRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsAPI.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneAPI.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneImpl.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneService.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsImpl.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsService.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/StartAppRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/StopAppRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/StopAppResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/TrafficConfig.java linguist-generated=true @@ -2002,6 +2043,7 @@ databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/ListVe databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/MapStringValueEntry.java linguist-generated=true 
databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/MiniVectorIndex.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/PipelineType.java linguist-generated=true +databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexNextPageRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexRequest.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexResponse.java linguist-generated=true databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/ResultData.java linguist-generated=true diff --git a/CHANGELOG.md b/CHANGELOG.md index 4df715849..6c6f4a44d 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,68 @@ # Version changelog +## 0.27.0 + +### Improvements + + * Support partners in headers for SDK ([#291](https://github.com/databricks/databricks-sdk-java/pull/291)). + * Add `serverless_compute_id` field to the config ([#299](https://github.com/databricks/databricks-sdk-java/pull/299)). + + +### Internal Changes + + * Ignore DataPlane Services during generation ([#296](https://github.com/databricks/databricks-sdk-java/pull/296)). + * Update OpenAPI spec ([#297](https://github.com/databricks/databricks-sdk-java/pull/297)). + * Retry failed integration tests ([#298](https://github.com/databricks/databricks-sdk-java/pull/298)). + + +### API Changes: + + * Changed `list()` method for `accountClient.storageCredentials()` service to return `com.databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` class. + * Changed `isolationMode` field for `com.databricks.sdk.service.catalog.CatalogInfo` to `com.databricks.sdk.service.catalog.CatalogIsolationMode` class. + * Added `isolationMode` field for `com.databricks.sdk.service.catalog.ExternalLocationInfo`. 
+ * Added `maxResults` and `pageToken` fields for `com.databricks.sdk.service.catalog.ListCatalogsRequest`.
+ * Added `nextPageToken` field for `com.databricks.sdk.service.catalog.ListCatalogsResponse`.
+ * Added `tableServingUrl` field for `com.databricks.sdk.service.catalog.OnlineTable`.
+ * Added `isolationMode` field for `com.databricks.sdk.service.catalog.StorageCredentialInfo`.
+ * Changed `isolationMode` field for `com.databricks.sdk.service.catalog.UpdateCatalog` to `com.databricks.sdk.service.catalog.CatalogIsolationMode` class.
+ * Added `isolationMode` field for `com.databricks.sdk.service.catalog.UpdateExternalLocation`.
+ * Added `isolationMode` field for `com.databricks.sdk.service.catalog.UpdateStorageCredential`.
+ * Added `com.databricks.sdk.service.catalog.CatalogIsolationMode` and `com.databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` classes.
+ * Added `createSchedule()`, `createSubscription()`, `deleteSchedule()`, `deleteSubscription()`, `getSchedule()`, `getSubscription()`, `list()`, `listSchedules()`, `listSubscriptions()` and `updateSchedule()` methods for `workspaceClient.lakeview()` service.
+ * Added `com.databricks.sdk.service.dashboards.CreateScheduleRequest`, `com.databricks.sdk.service.dashboards.CreateSubscriptionRequest`, `com.databricks.sdk.service.dashboards.CronSchedule`, `com.databricks.sdk.service.dashboards.DashboardView`, `com.databricks.sdk.service.dashboards.DeleteScheduleRequest`, `com.databricks.sdk.service.dashboards.DeleteSubscriptionRequest`, `com.databricks.sdk.service.dashboards.GetScheduleRequest`, `com.databricks.sdk.service.dashboards.GetSubscriptionRequest`, `com.databricks.sdk.service.dashboards.ListDashboardsRequest`, `com.databricks.sdk.service.dashboards.ListDashboardsResponse`, `com.databricks.sdk.service.dashboards.ListSchedulesRequest`, `com.databricks.sdk.service.dashboards.ListSchedulesResponse`, `com.databricks.sdk.service.dashboards.ListSubscriptionsRequest`, `com.databricks.sdk.service.dashboards.ListSubscriptionsResponse`, `com.databricks.sdk.service.dashboards.Schedule`, `com.databricks.sdk.service.dashboards.SchedulePauseStatus`, `com.databricks.sdk.service.dashboards.Subscriber`, `com.databricks.sdk.service.dashboards.Subscription`, `com.databricks.sdk.service.dashboards.SubscriptionSubscriberDestination`, `com.databricks.sdk.service.dashboards.SubscriptionSubscriberUser` and `com.databricks.sdk.service.dashboards.UpdateScheduleRequest` classes.
+ * Added `terminationCategory` field for `com.databricks.sdk.service.jobs.ForEachTaskErrorMessageStats`.
+ * Added `onStreamingBacklogExceeded` field for `com.databricks.sdk.service.jobs.JobEmailNotifications`.
+ * Added `environmentKey` field for `com.databricks.sdk.service.jobs.RunTask`.
+ * Removed `conditionTask`, `dbtTask`, `notebookTask`, `pipelineTask`, `pythonWheelTask`, `runJobTask`, `sparkJarTask`, `sparkPythonTask`, `sparkSubmitTask` and `sqlTask` fields for `com.databricks.sdk.service.jobs.SubmitRun`.
+ * Added `environments` field for `com.databricks.sdk.service.jobs.SubmitRun`.
+ * Added `dbtTask` field for `com.databricks.sdk.service.jobs.SubmitTask`.
+ * Added `environmentKey` field for `com.databricks.sdk.service.jobs.SubmitTask`.
+ * Added `onStreamingBacklogExceeded` field for `com.databricks.sdk.service.jobs.TaskEmailNotifications`.
+ * Added `periodic` field for `com.databricks.sdk.service.jobs.TriggerSettings`.
+ * Added `onStreamingBacklogExceeded` field for `com.databricks.sdk.service.jobs.WebhookNotifications`.
+ * Added `com.databricks.sdk.service.jobs.PeriodicTriggerConfiguration` and `com.databricks.sdk.service.jobs.PeriodicTriggerConfigurationTimeUnit` classes.
+ * Added `batchGet()` method for `workspaceClient.consumerListings()` service.
+ * Added `batchGet()` method for `workspaceClient.consumerProviders()` service.
+ * Added `providerSummary` field for `com.databricks.sdk.service.marketplace.Listing`.
+ * Added `com.databricks.sdk.service.marketplace.BatchGetListingsRequest`, `com.databricks.sdk.service.marketplace.BatchGetListingsResponse`, `com.databricks.sdk.service.marketplace.BatchGetProvidersRequest`, `com.databricks.sdk.service.marketplace.BatchGetProvidersResponse`, `com.databricks.sdk.service.marketplace.ProviderIconFile`, `com.databricks.sdk.service.marketplace.ProviderIconType`, `com.databricks.sdk.service.marketplace.ProviderListingSummaryInfo` and `com.databricks.sdk.service.oauth2.DataPlaneInfo` classes.
+ * Removed `createDeployment()` method for `workspaceClient.apps()` service.
+ * Added `deploy()` and `start()` methods for `workspaceClient.apps()` service.
+ * Added `workspaceClient.servingEndpointsDataPlane()` service.
+ * Added `servicePrincipalId` field for `com.databricks.sdk.service.serving.App`.
+ * Added `servicePrincipalName` field for `com.databricks.sdk.service.serving.App`.
+ * Added `mode` field for `com.databricks.sdk.service.serving.AppDeployment`.
+ * Added `mode` field for `com.databricks.sdk.service.serving.CreateAppDeploymentRequest`.
+ * Added `dataPlaneInfo` field for `com.databricks.sdk.service.serving.ServingEndpointDetailed`.
+ * Added `com.databricks.sdk.service.serving.AppDeploymentMode` class. + * Added `com.databricks.sdk.service.serving.ModelDataPlaneInfo` class. + * Added `com.databricks.sdk.service.serving.StartAppRequest` class. + * Added `queryNextPage()` method for `workspaceClient.vectorSearchIndexes()` service. + * Added `queryType` field for `com.databricks.sdk.service.vectorsearch.QueryVectorIndexRequest`. + * Added `nextPageToken` field for `com.databricks.sdk.service.vectorsearch.QueryVectorIndexResponse`. + * Added `com.databricks.sdk.service.vectorsearch.QueryVectorIndexNextPageRequest` class. + +OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date: 2024-06-24 + ## 0.26.0 ### Improvements diff --git a/databricks-sdk-java/pom.xml b/databricks-sdk-java/pom.xml index e2994d733..95ec12ce9 100644 --- a/databricks-sdk-java/pom.xml +++ b/databricks-sdk-java/pom.xml @@ -5,7 +5,7 @@ com.databricks databricks-sdk-parent - 0.26.0 + 0.27.0 databricks-sdk-java diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/WorkspaceClient.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/WorkspaceClient.java index 6131464af..8b0ef1aae 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/WorkspaceClient.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/WorkspaceClient.java @@ -374,6 +374,11 @@ public AccountAccessControlProxyAPI accountAccessControlProxy() { * object that periodically runs a query, evaluates a condition of its result, and notifies one or * more users and/or notification destinations if the condition was met. Alerts can be scheduled * using the `sql_task` type of the Jobs API, e.g. :method:jobs/create. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public AlertsAPI alerts() { return alertsAPI; @@ -572,6 +577,11 @@ public DashboardsAPI dashboards() { *

This API does not support searches. It returns the full list of SQL warehouses in your * workspace. We advise you to use any text editor, REST client, or `grep` to search the response * from this API for the name of your SQL warehouse as it appears in Databricks SQL. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public DataSourcesAPI dataSources() { return dataSourcesAPI; @@ -598,6 +608,11 @@ public DbfsExt dbfs() { * *

- `CAN_MANAGE`: Allows all actions: read, run, edit, delete, modify permissions (superset of * `CAN_RUN`) + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public DbsqlPermissionsAPI dbsqlPermissions() { return dbsqlPermissionsAPI; @@ -1045,6 +1060,11 @@ public QualityMonitorsAPI qualityMonitors() { * These endpoints are used for CRUD operations on query definitions. Query definitions include * the target SQL warehouse, query text, name, description, tags, parameters, and visualizations. * Queries can be scheduled using the `sql_task` type of the Jobs API, e.g. :method:jobs/create. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public QueriesAPI queries() { return queriesAPI; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/UserAgent.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/UserAgent.java index b51c8176c..63dcedd22 100644 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/UserAgent.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/UserAgent.java @@ -32,7 +32,7 @@ public String getValue() { // TODO: check if reading from // /META-INF/maven/com.databricks/databrics-sdk-java/pom.properties // or getClass().getPackage().getImplementationVersion() is enough. - private static final String version = "0.26.0"; + private static final String version = "0.27.0"; public static void withProduct(String product, String productVersion) { UserAgent.product = product; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfo.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfo.java index c0cec6fcf..1c6d9472d 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfo.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfo.java @@ -54,7 +54,7 @@ public class CatalogInfo { * workspaces. */ @JsonProperty("isolation_mode") - private IsolationMode isolationMode; + private CatalogIsolationMode isolationMode; /** Unique identifier of parent metastore. 
*/ @JsonProperty("metastore_id") @@ -200,12 +200,12 @@ public String getFullName() { return fullName; } - public CatalogInfo setIsolationMode(IsolationMode isolationMode) { + public CatalogInfo setIsolationMode(CatalogIsolationMode isolationMode) { this.isolationMode = isolationMode; return this; } - public IsolationMode getIsolationMode() { + public CatalogIsolationMode getIsolationMode() { return isolationMode; } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfoSecurableKind.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfoSecurableKind.java index a4d2d7074..8e3357434 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfoSecurableKind.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogInfoSecurableKind.java @@ -17,8 +17,6 @@ public enum CatalogInfoSecurableKind { CATALOG_FOREIGN_SQLDW, CATALOG_FOREIGN_SQLSERVER, CATALOG_INTERNAL, - CATALOG_ONLINE, - CATALOG_ONLINE_INDEX, CATALOG_STANDARD, CATALOG_SYSTEM, CATALOG_SYSTEM_DELTASHARING, diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogIsolationMode.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogIsolationMode.java new file mode 100755 index 000000000..04a2037e4 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogIsolationMode.java @@ -0,0 +1,14 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.catalog; + +import com.databricks.sdk.support.Generated; + +/** + * Whether the current securable is accessible from all workspaces or a specific set of workspaces. 
+ */ +@Generated +public enum CatalogIsolationMode { + ISOLATED, + OPEN, +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogsAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogsAPI.java index 6b3ea1712..68096a0fd 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogsAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CatalogsAPI.java @@ -84,7 +84,16 @@ public CatalogInfo get(GetCatalogRequest request) { */ public Iterable list(ListCatalogsRequest request) { return new Paginator<>( - request, impl::list, ListCatalogsResponse::getCatalogs, response -> null); + request, + impl::list, + ListCatalogsResponse::getCatalogs, + response -> { + String token = response.getNextPageToken(); + if (token == null) { + return null; + } + return request.setPageToken(token); + }); } public CatalogInfo update(String name) { diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CreateMetastore.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CreateMetastore.java index d945302f4..717ad49a0 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CreateMetastore.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CreateMetastore.java @@ -14,7 +14,8 @@ public class CreateMetastore { private String name; /** - * Cloud region which the metastore serves (e.g., `us-west-2`, `westus`). If this field is + * Cloud region which the metastore serves (e.g., `us-west-2`, `westus`). The field can be omitted + * in the __workspace-level__ __API__ but not in the __account-level__ __API__. If this field is * omitted, the region of the workspace receiving the request will be used. 
*/ @JsonProperty("region") diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CreateStorageCredential.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CreateStorageCredential.java index 23717f93d..b462a9075 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CreateStorageCredential.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/CreateStorageCredential.java @@ -29,7 +29,7 @@ public class CreateStorageCredential { @JsonProperty("comment") private String comment; - /** The managed GCP service account configuration. */ + /** The Databricks managed GCP service account configuration. */ @JsonProperty("databricks_gcp_service_account") private DatabricksGcpServiceAccountRequest databricksGcpServiceAccount; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ExternalLocationInfo.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ExternalLocationInfo.java index ef3d12232..6560c80dd 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ExternalLocationInfo.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ExternalLocationInfo.java @@ -44,6 +44,13 @@ public class ExternalLocationInfo { @JsonProperty("encryption_details") private EncryptionDetails encryptionDetails; + /** + * Whether the current securable is accessible from all workspaces or a specific set of + * workspaces. + */ + @JsonProperty("isolation_mode") + private IsolationMode isolationMode; + /** Unique identifier of metastore hosting the external location. 
*/ @JsonProperty("metastore_id") private String metastoreId; @@ -144,6 +151,15 @@ public EncryptionDetails getEncryptionDetails() { return encryptionDetails; } + public ExternalLocationInfo setIsolationMode(IsolationMode isolationMode) { + this.isolationMode = isolationMode; + return this; + } + + public IsolationMode getIsolationMode() { + return isolationMode; + } + public ExternalLocationInfo setMetastoreId(String metastoreId) { this.metastoreId = metastoreId; return this; @@ -220,6 +236,7 @@ public boolean equals(Object o) { && Objects.equals(credentialId, that.credentialId) && Objects.equals(credentialName, that.credentialName) && Objects.equals(encryptionDetails, that.encryptionDetails) + && Objects.equals(isolationMode, that.isolationMode) && Objects.equals(metastoreId, that.metastoreId) && Objects.equals(name, that.name) && Objects.equals(owner, that.owner) @@ -240,6 +257,7 @@ public int hashCode() { credentialId, credentialName, encryptionDetails, + isolationMode, metastoreId, name, owner, @@ -260,6 +278,7 @@ public String toString() { .add("credentialId", credentialId) .add("credentialName", credentialName) .add("encryptionDetails", encryptionDetails) + .add("isolationMode", isolationMode) .add("metastoreId", metastoreId) .add("name", name) .add("owner", owner) diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/FunctionsAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/FunctionsAPI.java index d4e3e587b..136c1d905 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/FunctionsAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/FunctionsAPI.java @@ -38,6 +38,8 @@ public FunctionInfo create(CreateFunction functionInfo) { /** * Create a function. * + *

**WARNING: This API is experimental and will change in future versions** + * *

Creates a new function * *

The user must have the following permissions in order for the function to be created: - diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/FunctionsService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/FunctionsService.java index c47075d69..813657540 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/FunctionsService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/FunctionsService.java @@ -20,6 +20,8 @@ public interface FunctionsService { /** * Create a function. * + *

**WARNING: This API is experimental and will change in future versions** + * *

Creates a new function * *

The user must have the following permissions in order for the function to be created: - diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/IsolationMode.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/IsolationMode.java index 78978b3b5..1c6e3168f 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/IsolationMode.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/IsolationMode.java @@ -9,6 +9,6 @@ */ @Generated public enum IsolationMode { - ISOLATED, - OPEN, + ISOLATION_MODE_ISOLATED, + ISOLATION_MODE_OPEN, } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsRequest.java index f61ff29f3..2c71027b0 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsRequest.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsRequest.java @@ -17,6 +17,22 @@ public class ListCatalogsRequest { @QueryParam("include_browse") private Boolean includeBrowse; + /** + * Maximum number of catalogs to return. - when set to 0, the page length is set to a server + * configured value (recommended); - when set to a value greater than 0, the page length is the + * minimum of this value and a server configured value; - when set to a value less than 0, an + * invalid parameter error is returned; - If not set, all valid catalogs are returned (not + * recommended). - Note: The number of returned catalogs might be less than the specified + * max_results size, even zero. The only definitive indication that no further catalogs can be + * fetched is when the next_page_token is unset from the response. + */ + @QueryParam("max_results") + private Long maxResults; + + /** Opaque pagination token to go to next page based on previous query. 
*/ + @QueryParam("page_token") + private String pageToken; + public ListCatalogsRequest setIncludeBrowse(Boolean includeBrowse) { this.includeBrowse = includeBrowse; return this; @@ -26,21 +42,45 @@ public Boolean getIncludeBrowse() { return includeBrowse; } + public ListCatalogsRequest setMaxResults(Long maxResults) { + this.maxResults = maxResults; + return this; + } + + public Long getMaxResults() { + return maxResults; + } + + public ListCatalogsRequest setPageToken(String pageToken) { + this.pageToken = pageToken; + return this; + } + + public String getPageToken() { + return pageToken; + } + @Override public boolean equals(Object o) { if (this == o) return true; if (o == null || getClass() != o.getClass()) return false; ListCatalogsRequest that = (ListCatalogsRequest) o; - return Objects.equals(includeBrowse, that.includeBrowse); + return Objects.equals(includeBrowse, that.includeBrowse) + && Objects.equals(maxResults, that.maxResults) + && Objects.equals(pageToken, that.pageToken); } @Override public int hashCode() { - return Objects.hash(includeBrowse); + return Objects.hash(includeBrowse, maxResults, pageToken); } @Override public String toString() { - return new ToStringer(ListCatalogsRequest.class).add("includeBrowse", includeBrowse).toString(); + return new ToStringer(ListCatalogsRequest.class) + .add("includeBrowse", includeBrowse) + .add("maxResults", maxResults) + .add("pageToken", pageToken) + .toString(); } } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsResponse.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsResponse.java index 37702d590..314b99cbd 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsResponse.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/ListCatalogsResponse.java @@ -14,6 +14,13 @@ public class ListCatalogsResponse { @JsonProperty("catalogs") private Collection 
catalogs; + /** + * Opaque token to retrieve the next page of results. Absent if there are no more pages. + * __page_token__ should be set to this value for the next request (for the next page of results). + */ + @JsonProperty("next_page_token") + private String nextPageToken; + public ListCatalogsResponse setCatalogs(Collection catalogs) { this.catalogs = catalogs; return this; @@ -23,21 +30,34 @@ public Collection getCatalogs() { return catalogs; } + public ListCatalogsResponse setNextPageToken(String nextPageToken) { + this.nextPageToken = nextPageToken; + return this; + } + + public String getNextPageToken() { + return nextPageToken; + } + @Override public boolean equals(Object o) { if (this == o) return true; if (o == null || getClass() != o.getClass()) return false; ListCatalogsResponse that = (ListCatalogsResponse) o; - return Objects.equals(catalogs, that.catalogs); + return Objects.equals(catalogs, that.catalogs) + && Objects.equals(nextPageToken, that.nextPageToken); } @Override public int hashCode() { - return Objects.hash(catalogs); + return Objects.hash(catalogs, nextPageToken); } @Override public String toString() { - return new ToStringer(ListCatalogsResponse.class).add("catalogs", catalogs).toString(); + return new ToStringer(ListCatalogsResponse.class) + .add("catalogs", catalogs) + .add("nextPageToken", nextPageToken) + .toString(); } } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/OnlineTable.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/OnlineTable.java index 522deb660..1e65a14ee 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/OnlineTable.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/OnlineTable.java @@ -22,6 +22,10 @@ public class OnlineTable { @JsonProperty("status") private OnlineTableStatus status; + /** Data serving REST API URL for this table */ + @JsonProperty("table_serving_url") + private String 
tableServingUrl; + public OnlineTable setName(String name) { this.name = name; return this; @@ -49,6 +53,15 @@ public OnlineTableStatus getStatus() { return status; } + public OnlineTable setTableServingUrl(String tableServingUrl) { + this.tableServingUrl = tableServingUrl; + return this; + } + + public String getTableServingUrl() { + return tableServingUrl; + } + @Override public boolean equals(Object o) { if (this == o) return true; @@ -56,12 +69,13 @@ public boolean equals(Object o) { OnlineTable that = (OnlineTable) o; return Objects.equals(name, that.name) && Objects.equals(spec, that.spec) - && Objects.equals(status, that.status); + && Objects.equals(status, that.status) + && Objects.equals(tableServingUrl, that.tableServingUrl); } @Override public int hashCode() { - return Objects.hash(name, spec, status); + return Objects.hash(name, spec, status, tableServingUrl); } @Override @@ -70,6 +84,7 @@ public String toString() { .add("name", name) .add("spec", spec) .add("status", status) + .add("tableServingUrl", tableServingUrl) .toString(); } } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/Privilege.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/Privilege.java index a4c5a69c1..df485f25e 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/Privilege.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/Privilege.java @@ -38,7 +38,6 @@ public enum Privilege { REFRESH, SELECT, SET_SHARE_PERMISSION, - SINGLE_USER_ACCESS, USAGE, USE_CATALOG, USE_CONNECTION, diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/StorageCredentialInfo.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/StorageCredentialInfo.java index 7a580ad73..12b687e66 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/StorageCredentialInfo.java +++ 
b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/StorageCredentialInfo.java @@ -37,7 +37,7 @@ public class StorageCredentialInfo { @JsonProperty("created_by") private String createdBy; - /** The managed GCP service account configuration. */ + /** The Databricks managed GCP service account configuration. */ @JsonProperty("databricks_gcp_service_account") private DatabricksGcpServiceAccountResponse databricksGcpServiceAccount; @@ -45,6 +45,13 @@ public class StorageCredentialInfo { @JsonProperty("id") private String id; + /** + * Whether the current securable is accessible from all workspaces or a specific set of + * workspaces. + */ + @JsonProperty("isolation_mode") + private IsolationMode isolationMode; + /** Unique identifier of parent metastore. */ @JsonProperty("metastore_id") private String metastoreId; @@ -157,6 +164,15 @@ public String getId() { return id; } + public StorageCredentialInfo setIsolationMode(IsolationMode isolationMode) { + this.isolationMode = isolationMode; + return this; + } + + public IsolationMode getIsolationMode() { + return isolationMode; + } + public StorageCredentialInfo setMetastoreId(String metastoreId) { this.metastoreId = metastoreId; return this; @@ -234,6 +250,7 @@ public boolean equals(Object o) { && Objects.equals(createdBy, that.createdBy) && Objects.equals(databricksGcpServiceAccount, that.databricksGcpServiceAccount) && Objects.equals(id, that.id) + && Objects.equals(isolationMode, that.isolationMode) && Objects.equals(metastoreId, that.metastoreId) && Objects.equals(name, that.name) && Objects.equals(owner, that.owner) @@ -255,6 +272,7 @@ public int hashCode() { createdBy, databricksGcpServiceAccount, id, + isolationMode, metastoreId, name, owner, @@ -276,6 +294,7 @@ public String toString() { .add("createdBy", createdBy) .add("databricksGcpServiceAccount", databricksGcpServiceAccount) .add("id", id) + .add("isolationMode", isolationMode) .add("metastoreId", metastoreId) .add("name", name) 
.add("owner", owner) diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateCatalog.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateCatalog.java index 8c502694c..7eb8e40a9 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateCatalog.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateCatalog.java @@ -23,7 +23,7 @@ public class UpdateCatalog { * workspaces. */ @JsonProperty("isolation_mode") - private IsolationMode isolationMode; + private CatalogIsolationMode isolationMode; /** The name of the catalog. */ private String name; @@ -59,12 +59,12 @@ public EnablePredictiveOptimization getEnablePredictiveOptimization() { return enablePredictiveOptimization; } - public UpdateCatalog setIsolationMode(IsolationMode isolationMode) { + public UpdateCatalog setIsolationMode(CatalogIsolationMode isolationMode) { this.isolationMode = isolationMode; return this; } - public IsolationMode getIsolationMode() { + public CatalogIsolationMode getIsolationMode() { return isolationMode; } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateExternalLocation.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateExternalLocation.java index 6489b09b3..3dac6497c 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateExternalLocation.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateExternalLocation.java @@ -29,6 +29,13 @@ public class UpdateExternalLocation { @JsonProperty("force") private Boolean force; + /** + * Whether the current securable is accessible from all workspaces or a specific set of + * workspaces. + */ + @JsonProperty("isolation_mode") + private IsolationMode isolationMode; + /** Name of the external location. 
*/ private String name; @@ -97,6 +104,15 @@ public Boolean getForce() { return force; } + public UpdateExternalLocation setIsolationMode(IsolationMode isolationMode) { + this.isolationMode = isolationMode; + return this; + } + + public IsolationMode getIsolationMode() { + return isolationMode; + } + public UpdateExternalLocation setName(String name) { this.name = name; return this; @@ -161,6 +177,7 @@ public boolean equals(Object o) { && Objects.equals(credentialName, that.credentialName) && Objects.equals(encryptionDetails, that.encryptionDetails) && Objects.equals(force, that.force) + && Objects.equals(isolationMode, that.isolationMode) && Objects.equals(name, that.name) && Objects.equals(newName, that.newName) && Objects.equals(owner, that.owner) @@ -177,6 +194,7 @@ public int hashCode() { credentialName, encryptionDetails, force, + isolationMode, name, newName, owner, @@ -193,6 +211,7 @@ public String toString() { .add("credentialName", credentialName) .add("encryptionDetails", encryptionDetails) .add("force", force) + .add("isolationMode", isolationMode) .add("name", name) .add("newName", newName) .add("owner", owner) diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateStorageCredential.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateStorageCredential.java index 29c98b451..35d5e248a 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateStorageCredential.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/catalog/UpdateStorageCredential.java @@ -29,7 +29,7 @@ public class UpdateStorageCredential { @JsonProperty("comment") private String comment; - /** The managed GCP service account configuration. */ + /** The Databricks managed GCP service account configuration. 
*/ @JsonProperty("databricks_gcp_service_account") private DatabricksGcpServiceAccountRequest databricksGcpServiceAccount; @@ -37,6 +37,13 @@ public class UpdateStorageCredential { @JsonProperty("force") private Boolean force; + /** + * Whether the current securable is accessible from all workspaces or a specific set of + * workspaces. + */ + @JsonProperty("isolation_mode") + private IsolationMode isolationMode; + /** Name of the storage credential. */ private String name; @@ -122,6 +129,15 @@ public Boolean getForce() { return force; } + public UpdateStorageCredential setIsolationMode(IsolationMode isolationMode) { + this.isolationMode = isolationMode; + return this; + } + + public IsolationMode getIsolationMode() { + return isolationMode; + } + public UpdateStorageCredential setName(String name) { this.name = name; return this; @@ -179,6 +195,7 @@ public boolean equals(Object o) { && Objects.equals(comment, that.comment) && Objects.equals(databricksGcpServiceAccount, that.databricksGcpServiceAccount) && Objects.equals(force, that.force) + && Objects.equals(isolationMode, that.isolationMode) && Objects.equals(name, that.name) && Objects.equals(newName, that.newName) && Objects.equals(owner, that.owner) @@ -196,6 +213,7 @@ public int hashCode() { comment, databricksGcpServiceAccount, force, + isolationMode, name, newName, owner, @@ -213,6 +231,7 @@ public String toString() { .add("comment", comment) .add("databricksGcpServiceAccount", databricksGcpServiceAccount) .add("force", force) + .add("isolationMode", isolationMode) .add("name", name) .add("newName", newName) .add("owner", owner) diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/ClusterDetails.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/ClusterDetails.java index 3a7a9151d..e1c192eae 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/ClusterDetails.java +++ 
b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/ClusterDetails.java @@ -153,7 +153,7 @@ public class ClusterDetails { /** * Node on which the Spark driver resides. The driver node contains the Spark master and the - * application that manages the per-notebook Spark REPLs. + * Databricks application that manages the per-notebook Spark REPLs. */ @JsonProperty("driver") private SparkNode driver; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/Environment.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/Environment.java index e46010d44..74e6b36fb 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/Environment.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/Environment.java @@ -9,9 +9,8 @@ import java.util.Objects; /** - * The a environment entity used to preserve serverless environment side panel and jobs' environment + * The environment entity used to preserve serverless environment side panel and jobs' environment * for non-notebook task. In this minimal environment spec, only pip dependencies are supported. - * Next ID: 5 */ @Generated public class Environment { diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/Policy.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/Policy.java index de6167066..a820968b5 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/Policy.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/compute/Policy.java @@ -35,8 +35,8 @@ public class Policy { private String description; /** - * If true, policy is a default policy created and managed by . Default policies - * cannot be deleted, and their policy families cannot be changed. + * If true, policy is a default policy created and managed by Databricks. Default policies cannot + * be deleted, and their policy families cannot be changed. 
*/ @JsonProperty("is_default") private Boolean isDefault; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateScheduleRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateScheduleRequest.java new file mode 100755 index 000000000..304694ca3 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateScheduleRequest.java @@ -0,0 +1,88 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class CreateScheduleRequest { + /** The cron expression describing the frequency of the periodic refresh for this schedule. */ + @JsonProperty("cron_schedule") + private CronSchedule cronSchedule; + + /** UUID identifying the dashboard to which the schedule belongs. */ + private String dashboardId; + + /** The display name for schedule. */ + @JsonProperty("display_name") + private String displayName; + + /** The status indicates whether this schedule is paused or not. 
*/ + @JsonProperty("pause_status") + private SchedulePauseStatus pauseStatus; + + public CreateScheduleRequest setCronSchedule(CronSchedule cronSchedule) { + this.cronSchedule = cronSchedule; + return this; + } + + public CronSchedule getCronSchedule() { + return cronSchedule; + } + + public CreateScheduleRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public CreateScheduleRequest setDisplayName(String displayName) { + this.displayName = displayName; + return this; + } + + public String getDisplayName() { + return displayName; + } + + public CreateScheduleRequest setPauseStatus(SchedulePauseStatus pauseStatus) { + this.pauseStatus = pauseStatus; + return this; + } + + public SchedulePauseStatus getPauseStatus() { + return pauseStatus; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + CreateScheduleRequest that = (CreateScheduleRequest) o; + return Objects.equals(cronSchedule, that.cronSchedule) + && Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(displayName, that.displayName) + && Objects.equals(pauseStatus, that.pauseStatus); + } + + @Override + public int hashCode() { + return Objects.hash(cronSchedule, dashboardId, displayName, pauseStatus); + } + + @Override + public String toString() { + return new ToStringer(CreateScheduleRequest.class) + .add("cronSchedule", cronSchedule) + .add("dashboardId", dashboardId) + .add("displayName", displayName) + .add("pauseStatus", pauseStatus) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateSubscriptionRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateSubscriptionRequest.java new file mode 100755 index 000000000..bce376403 --- /dev/null +++ 
b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CreateSubscriptionRequest.java @@ -0,0 +1,72 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class CreateSubscriptionRequest { + /** UUID identifying the dashboard to which the subscription belongs. */ + private String dashboardId; + + /** UUID identifying the schedule to which the subscription belongs. */ + private String scheduleId; + + /** Subscriber details for users and destinations to be added as subscribers to the schedule. */ + @JsonProperty("subscriber") + private Subscriber subscriber; + + public CreateSubscriptionRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public CreateSubscriptionRequest setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + public CreateSubscriptionRequest setSubscriber(Subscriber subscriber) { + this.subscriber = subscriber; + return this; + } + + public Subscriber getSubscriber() { + return subscriber; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + CreateSubscriptionRequest that = (CreateSubscriptionRequest) o; + return Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(scheduleId, that.scheduleId) + && Objects.equals(subscriber, that.subscriber); + } + + @Override + public int hashCode() { + return Objects.hash(dashboardId, scheduleId, subscriber); + } + + @Override + public String toString() { + return new ToStringer(CreateSubscriptionRequest.class) + 
.add("dashboardId", dashboardId) + .add("scheduleId", scheduleId) + .add("subscriber", subscriber) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CronSchedule.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CronSchedule.java new file mode 100755 index 000000000..db2332001 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/CronSchedule.java @@ -0,0 +1,70 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class CronSchedule { + /** + * A cron expression using quartz syntax. EX: `0 0 8 * * ?` represents everyday at 8am. See [Cron + * Trigger] for details. + * + *
<p>
[Cron Trigger]: + * http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html + */ + @JsonProperty("quartz_cron_expression") + private String quartzCronExpression; + + /** + * A Java timezone id. The schedule will be resolved with respect to this timezone. See [Java + * TimeZone] for details. + * + *
<p>
[Java TimeZone]: https://docs.oracle.com/javase/7/docs/api/java/util/TimeZone.html + */ + @JsonProperty("timezone_id") + private String timezoneId; + + public CronSchedule setQuartzCronExpression(String quartzCronExpression) { + this.quartzCronExpression = quartzCronExpression; + return this; + } + + public String getQuartzCronExpression() { + return quartzCronExpression; + } + + public CronSchedule setTimezoneId(String timezoneId) { + this.timezoneId = timezoneId; + return this; + } + + public String getTimezoneId() { + return timezoneId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + CronSchedule that = (CronSchedule) o; + return Objects.equals(quartzCronExpression, that.quartzCronExpression) + && Objects.equals(timezoneId, that.timezoneId); + } + + @Override + public int hashCode() { + return Objects.hash(quartzCronExpression, timezoneId); + } + + @Override + public String toString() { + return new ToStringer(CronSchedule.class) + .add("quartzCronExpression", quartzCronExpression) + .add("timezoneId", timezoneId) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DashboardView.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DashboardView.java new file mode 100755 index 000000000..b934e2f89 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DashboardView.java @@ -0,0 +1,11 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. 
+ +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; + +@Generated +public enum DashboardView { + DASHBOARD_VIEW_BASIC, + DASHBOARD_VIEW_FULL, +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleRequest.java new file mode 100755 index 000000000..814f80b43 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleRequest.java @@ -0,0 +1,76 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.QueryParam; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +/** Delete dashboard schedule */ +@Generated +public class DeleteScheduleRequest { + /** UUID identifying the dashboard to which the schedule belongs. */ + private String dashboardId; + + /** + * The etag for the schedule. Optionally, it can be provided to verify that the schedule has not + * been modified from its last retrieval. + */ + @QueryParam("etag") + private String etag; + + /** UUID identifying the schedule. 
*/ + private String scheduleId; + + public DeleteScheduleRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public DeleteScheduleRequest setEtag(String etag) { + this.etag = etag; + return this; + } + + public String getEtag() { + return etag; + } + + public DeleteScheduleRequest setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + DeleteScheduleRequest that = (DeleteScheduleRequest) o; + return Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(etag, that.etag) + && Objects.equals(scheduleId, that.scheduleId); + } + + @Override + public int hashCode() { + return Objects.hash(dashboardId, etag, scheduleId); + } + + @Override + public String toString() { + return new ToStringer(DeleteScheduleRequest.class) + .add("dashboardId", dashboardId) + .add("etag", etag) + .add("scheduleId", scheduleId) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleResponse.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleResponse.java new file mode 100755 index 000000000..f21eeb237 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteScheduleResponse.java @@ -0,0 +1,28 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. 
+ +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +@Generated +public class DeleteScheduleResponse { + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + return true; + } + + @Override + public int hashCode() { + return Objects.hash(); + } + + @Override + public String toString() { + return new ToStringer(DeleteScheduleResponse.class).toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionRequest.java new file mode 100755 index 000000000..119ab6b99 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionRequest.java @@ -0,0 +1,90 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.QueryParam; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +/** Delete schedule subscription */ +@Generated +public class DeleteSubscriptionRequest { + /** UUID identifying the dashboard which the subscription belongs. */ + private String dashboardId; + + /** + * The etag for the subscription. Can be optionally provided to ensure that the subscription has + * not been modified since the last read. + */ + @QueryParam("etag") + private String etag; + + /** UUID identifying the schedule which the subscription belongs. */ + private String scheduleId; + + /** UUID identifying the subscription. 
*/ + private String subscriptionId; + + public DeleteSubscriptionRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public DeleteSubscriptionRequest setEtag(String etag) { + this.etag = etag; + return this; + } + + public String getEtag() { + return etag; + } + + public DeleteSubscriptionRequest setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + public DeleteSubscriptionRequest setSubscriptionId(String subscriptionId) { + this.subscriptionId = subscriptionId; + return this; + } + + public String getSubscriptionId() { + return subscriptionId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + DeleteSubscriptionRequest that = (DeleteSubscriptionRequest) o; + return Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(etag, that.etag) + && Objects.equals(scheduleId, that.scheduleId) + && Objects.equals(subscriptionId, that.subscriptionId); + } + + @Override + public int hashCode() { + return Objects.hash(dashboardId, etag, scheduleId, subscriptionId); + } + + @Override + public String toString() { + return new ToStringer(DeleteSubscriptionRequest.class) + .add("dashboardId", dashboardId) + .add("etag", etag) + .add("scheduleId", scheduleId) + .add("subscriptionId", subscriptionId) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionResponse.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionResponse.java new file mode 100755 index 000000000..6325c783a --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/DeleteSubscriptionResponse.java @@ -0,0 +1,28 @@ +// Code generated from OpenAPI specs by 
Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +@Generated +public class DeleteSubscriptionResponse { + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + return true; + } + + @Override + public int hashCode() { + return Objects.hash(); + } + + @Override + public String toString() { + return new ToStringer(DeleteSubscriptionResponse.class).toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetScheduleRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetScheduleRequest.java new file mode 100755 index 000000000..867b17197 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetScheduleRequest.java @@ -0,0 +1,57 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +/** Get dashboard schedule */ +@Generated +public class GetScheduleRequest { + /** UUID identifying the dashboard to which the schedule belongs. */ + private String dashboardId; + + /** UUID identifying the schedule. 
*/ + private String scheduleId; + + public GetScheduleRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public GetScheduleRequest setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + GetScheduleRequest that = (GetScheduleRequest) o; + return Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(scheduleId, that.scheduleId); + } + + @Override + public int hashCode() { + return Objects.hash(dashboardId, scheduleId); + } + + @Override + public String toString() { + return new ToStringer(GetScheduleRequest.class) + .add("dashboardId", dashboardId) + .add("scheduleId", scheduleId) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetSubscriptionRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetSubscriptionRequest.java new file mode 100755 index 000000000..2dcca409e --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/GetSubscriptionRequest.java @@ -0,0 +1,71 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +/** Get schedule subscription */ +@Generated +public class GetSubscriptionRequest { + /** UUID identifying the dashboard which the subscription belongs. */ + private String dashboardId; + + /** UUID identifying the schedule which the subscription belongs. */ + private String scheduleId; + + /** UUID identifying the subscription. 
*/ + private String subscriptionId; + + public GetSubscriptionRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public GetSubscriptionRequest setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + public GetSubscriptionRequest setSubscriptionId(String subscriptionId) { + this.subscriptionId = subscriptionId; + return this; + } + + public String getSubscriptionId() { + return subscriptionId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + GetSubscriptionRequest that = (GetSubscriptionRequest) o; + return Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(scheduleId, that.scheduleId) + && Objects.equals(subscriptionId, that.subscriptionId); + } + + @Override + public int hashCode() { + return Objects.hash(dashboardId, scheduleId, subscriptionId); + } + + @Override + public String toString() { + return new ToStringer(GetSubscriptionRequest.class) + .add("dashboardId", dashboardId) + .add("scheduleId", scheduleId) + .add("subscriptionId", subscriptionId) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewAPI.java index ff62385cf..da3753144 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewAPI.java @@ -3,6 +3,7 @@ import com.databricks.sdk.core.ApiClient; import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.Paginator; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -39,6 +40,53 @@ public Dashboard 
create(CreateDashboardRequest request) { return impl.create(request); } + public Schedule createSchedule(String dashboardId, CronSchedule cronSchedule) { + return createSchedule( + new CreateScheduleRequest().setDashboardId(dashboardId).setCronSchedule(cronSchedule)); + } + + /** Create dashboard schedule. */ + public Schedule createSchedule(CreateScheduleRequest request) { + return impl.createSchedule(request); + } + + public Subscription createSubscription( + String dashboardId, String scheduleId, Subscriber subscriber) { + return createSubscription( + new CreateSubscriptionRequest() + .setDashboardId(dashboardId) + .setScheduleId(scheduleId) + .setSubscriber(subscriber)); + } + + /** Create schedule subscription. */ + public Subscription createSubscription(CreateSubscriptionRequest request) { + return impl.createSubscription(request); + } + + public void deleteSchedule(String dashboardId, String scheduleId) { + deleteSchedule( + new DeleteScheduleRequest().setDashboardId(dashboardId).setScheduleId(scheduleId)); + } + + /** Delete dashboard schedule. */ + public void deleteSchedule(DeleteScheduleRequest request) { + impl.deleteSchedule(request); + } + + public void deleteSubscription(String dashboardId, String scheduleId, String subscriptionId) { + deleteSubscription( + new DeleteSubscriptionRequest() + .setDashboardId(dashboardId) + .setScheduleId(scheduleId) + .setSubscriptionId(subscriptionId)); + } + + /** Delete schedule subscription. 
*/ + public void deleteSubscription(DeleteSubscriptionRequest request) { + impl.deleteSubscription(request); + } + public Dashboard get(String dashboardId) { return get(new GetDashboardRequest().setDashboardId(dashboardId)); } @@ -65,6 +113,84 @@ public PublishedDashboard getPublished(GetPublishedDashboardRequest request) { return impl.getPublished(request); } + public Schedule getSchedule(String dashboardId, String scheduleId) { + return getSchedule( + new GetScheduleRequest().setDashboardId(dashboardId).setScheduleId(scheduleId)); + } + + /** Get dashboard schedule. */ + public Schedule getSchedule(GetScheduleRequest request) { + return impl.getSchedule(request); + } + + public Subscription getSubscription( + String dashboardId, String scheduleId, String subscriptionId) { + return getSubscription( + new GetSubscriptionRequest() + .setDashboardId(dashboardId) + .setScheduleId(scheduleId) + .setSubscriptionId(subscriptionId)); + } + + /** Get schedule subscription. */ + public Subscription getSubscription(GetSubscriptionRequest request) { + return impl.getSubscription(request); + } + + /** List dashboards. */ + public Iterable<Dashboard> list(ListDashboardsRequest request) { + return new Paginator<>( + request, + impl::list, + ListDashboardsResponse::getDashboards, + response -> { + String token = response.getNextPageToken(); + if (token == null) { + return null; + } + return request.setPageToken(token); + }); + } + + public Iterable<Schedule> listSchedules(String dashboardId) { + return listSchedules(new ListSchedulesRequest().setDashboardId(dashboardId)); + } + + /** List dashboard schedules.
*/ + public Iterable<Schedule> listSchedules(ListSchedulesRequest request) { + return new Paginator<>( + request, + impl::listSchedules, + ListSchedulesResponse::getSchedules, + response -> { + String token = response.getNextPageToken(); + if (token == null) { + return null; + } + return request.setPageToken(token); + }); + } + + public Iterable<Subscription> listSubscriptions(String dashboardId, String scheduleId) { + return listSubscriptions( + new ListSubscriptionsRequest().setDashboardId(dashboardId).setScheduleId(scheduleId)); + } + + /** List schedule subscriptions. */ + public Iterable<Subscription> listSubscriptions(ListSubscriptionsRequest request) { + return new Paginator<>( + request, + impl::listSubscriptions, + ListSubscriptionsResponse::getSubscriptions, + response -> { + String token = response.getNextPageToken(); + if (token == null) { + return null; + } + return request.setPageToken(token); + }); + } + public Dashboard migrate(String sourceDashboardId) { return migrate(new MigrateDashboardRequest().setSourceDashboardId(sourceDashboardId)); } @@ -130,6 +256,19 @@ public Dashboard update(UpdateDashboardRequest request) { return impl.update(request); } + public Schedule updateSchedule(String dashboardId, String scheduleId, CronSchedule cronSchedule) { + return updateSchedule( + new UpdateScheduleRequest() + .setDashboardId(dashboardId) + .setScheduleId(scheduleId) + .setCronSchedule(cronSchedule)); + } + + /** Update dashboard schedule.
*/ + public Schedule updateSchedule(UpdateScheduleRequest request) { + return impl.updateSchedule(request); + } + public LakeviewService impl() { return impl; } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewImpl.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewImpl.java index 10a926490..f6b468526 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewImpl.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewImpl.java @@ -24,6 +24,50 @@ public Dashboard create(CreateDashboardRequest request) { return apiClient.POST(path, request, Dashboard.class, headers); } + @Override + public Schedule createSchedule(CreateScheduleRequest request) { + String path = + String.format("/api/2.0/lakeview/dashboards/%s/schedules", request.getDashboardId()); + Map<String, String> headers = new HashMap<>(); + headers.put("Accept", "application/json"); + headers.put("Content-Type", "application/json"); + return apiClient.POST(path, request, Schedule.class, headers); + } + + @Override + public Subscription createSubscription(CreateSubscriptionRequest request) { + String path = + String.format( + "/api/2.0/lakeview/dashboards/%s/schedules/%s/subscriptions", + request.getDashboardId(), request.getScheduleId()); + Map<String, String> headers = new HashMap<>(); + headers.put("Accept", "application/json"); + headers.put("Content-Type", "application/json"); + return apiClient.POST(path, request, Subscription.class, headers); + } + + @Override + public void deleteSchedule(DeleteScheduleRequest request) { + String path = + String.format( + "/api/2.0/lakeview/dashboards/%s/schedules/%s", + request.getDashboardId(), request.getScheduleId()); + Map<String, String> headers = new HashMap<>(); + headers.put("Accept", "application/json"); + apiClient.DELETE(path, request, DeleteScheduleResponse.class, headers); + } + + @Override + public void deleteSubscription(DeleteSubscriptionRequest
request) { + String path = + String.format( + "/api/2.0/lakeview/dashboards/%s/schedules/%s/subscriptions/%s", + request.getDashboardId(), request.getScheduleId(), request.getSubscriptionId()); + Map<String, String> headers = new HashMap<>(); + headers.put("Accept", "application/json"); + apiClient.DELETE(path, request, DeleteSubscriptionResponse.class, headers); + } + @Override public Dashboard get(GetDashboardRequest request) { String path = String.format("/api/2.0/lakeview/dashboards/%s", request.getDashboardId()); @@ -41,6 +85,56 @@ public PublishedDashboard getPublished(GetPublishedDashboardRequest request) { return apiClient.GET(path, request, PublishedDashboard.class, headers); } + @Override + public Schedule getSchedule(GetScheduleRequest request) { + String path = + String.format( + "/api/2.0/lakeview/dashboards/%s/schedules/%s", + request.getDashboardId(), request.getScheduleId()); + Map<String, String> headers = new HashMap<>(); + headers.put("Accept", "application/json"); + return apiClient.GET(path, request, Schedule.class, headers); + } + + @Override + public Subscription getSubscription(GetSubscriptionRequest request) { + String path = + String.format( + "/api/2.0/lakeview/dashboards/%s/schedules/%s/subscriptions/%s", + request.getDashboardId(), request.getScheduleId(), request.getSubscriptionId()); + Map<String, String> headers = new HashMap<>(); + headers.put("Accept", "application/json"); + return apiClient.GET(path, request, Subscription.class, headers); + } + + @Override + public ListDashboardsResponse list(ListDashboardsRequest request) { + String path = "/api/2.0/lakeview/dashboards"; + Map<String, String> headers = new HashMap<>(); + headers.put("Accept", "application/json"); + return apiClient.GET(path, request, ListDashboardsResponse.class, headers); + } + + @Override + public ListSchedulesResponse listSchedules(ListSchedulesRequest request) { + String path = + String.format("/api/2.0/lakeview/dashboards/%s/schedules", request.getDashboardId()); + Map<String, String> headers = new HashMap<>(); + headers.put("Accept",
"application/json"); + return apiClient.GET(path, request, ListSchedulesResponse.class, headers); + } + + @Override + public ListSubscriptionsResponse listSubscriptions(ListSubscriptionsRequest request) { + String path = + String.format( + "/api/2.0/lakeview/dashboards/%s/schedules/%s/subscriptions", + request.getDashboardId(), request.getScheduleId()); + Map headers = new HashMap<>(); + headers.put("Accept", "application/json"); + return apiClient.GET(path, request, ListSubscriptionsResponse.class, headers); + } + @Override public Dashboard migrate(MigrateDashboardRequest request) { String path = "/api/2.0/lakeview/dashboards/migrate"; @@ -85,4 +179,16 @@ public Dashboard update(UpdateDashboardRequest request) { headers.put("Content-Type", "application/json"); return apiClient.PATCH(path, request, Dashboard.class, headers); } + + @Override + public Schedule updateSchedule(UpdateScheduleRequest request) { + String path = + String.format( + "/api/2.0/lakeview/dashboards/%s/schedules/%s", + request.getDashboardId(), request.getScheduleId()); + Map headers = new HashMap<>(); + headers.put("Accept", "application/json"); + headers.put("Content-Type", "application/json"); + return apiClient.PUT(path, request, Schedule.class, headers); + } } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewService.java index d5d713404..66187e358 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/LakeviewService.java @@ -20,6 +20,18 @@ public interface LakeviewService { */ Dashboard create(CreateDashboardRequest createDashboardRequest); + /** Create dashboard schedule. */ + Schedule createSchedule(CreateScheduleRequest createScheduleRequest); + + /** Create schedule subscription. 
*/ + Subscription createSubscription(CreateSubscriptionRequest createSubscriptionRequest); + + /** Delete dashboard schedule. */ + void deleteSchedule(DeleteScheduleRequest deleteScheduleRequest); + + /** Delete schedule subscription. */ + void deleteSubscription(DeleteSubscriptionRequest deleteSubscriptionRequest); + /** * Get dashboard. * @@ -34,6 +46,21 @@ public interface LakeviewService { */ PublishedDashboard getPublished(GetPublishedDashboardRequest getPublishedDashboardRequest); + /** Get dashboard schedule. */ + Schedule getSchedule(GetScheduleRequest getScheduleRequest); + + /** Get schedule subscription. */ + Subscription getSubscription(GetSubscriptionRequest getSubscriptionRequest); + + /** List dashboards. */ + ListDashboardsResponse list(ListDashboardsRequest listDashboardsRequest); + + /** List dashboard schedules. */ + ListSchedulesResponse listSchedules(ListSchedulesRequest listSchedulesRequest); + + /** List schedule subscriptions. */ + ListSubscriptionsResponse listSubscriptions(ListSubscriptionsRequest listSubscriptionsRequest); + /** * Migrate dashboard. * @@ -68,4 +95,7 @@ public interface LakeviewService { *

Update a draft dashboard. */ Dashboard update(UpdateDashboardRequest updateDashboardRequest); + + /** Update dashboard schedule. */ + Schedule updateSchedule(UpdateScheduleRequest updateScheduleRequest); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsRequest.java new file mode 100755 index 000000000..473b265ab --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsRequest.java @@ -0,0 +1,100 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.QueryParam; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +/** List dashboards */ +@Generated +public class ListDashboardsRequest { + /** The number of dashboards to return per page. */ + @QueryParam("page_size") + private Long pageSize; + + /** + * A page token, received from a previous `ListDashboards` call. This token can be used to + * retrieve the subsequent page. + */ + @QueryParam("page_token") + private String pageToken; + + /** + * The flag to include dashboards located in the trash. If unspecified, only active dashboards + * will be returned. + */ + @QueryParam("show_trashed") + private Boolean showTrashed; + + /** + * Indicates whether to include all metadata from the dashboard in the response. If unset, the + * response defaults to `DASHBOARD_VIEW_BASIC` which only includes summary metadata from the + * dashboard. 
+ */ + @QueryParam("view") + private DashboardView view; + + public ListDashboardsRequest setPageSize(Long pageSize) { + this.pageSize = pageSize; + return this; + } + + public Long getPageSize() { + return pageSize; + } + + public ListDashboardsRequest setPageToken(String pageToken) { + this.pageToken = pageToken; + return this; + } + + public String getPageToken() { + return pageToken; + } + + public ListDashboardsRequest setShowTrashed(Boolean showTrashed) { + this.showTrashed = showTrashed; + return this; + } + + public Boolean getShowTrashed() { + return showTrashed; + } + + public ListDashboardsRequest setView(DashboardView view) { + this.view = view; + return this; + } + + public DashboardView getView() { + return view; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + ListDashboardsRequest that = (ListDashboardsRequest) o; + return Objects.equals(pageSize, that.pageSize) + && Objects.equals(pageToken, that.pageToken) + && Objects.equals(showTrashed, that.showTrashed) + && Objects.equals(view, that.view); + } + + @Override + public int hashCode() { + return Objects.hash(pageSize, pageToken, showTrashed, view); + } + + @Override + public String toString() { + return new ToStringer(ListDashboardsRequest.class) + .add("pageSize", pageSize) + .add("pageToken", pageToken) + .add("showTrashed", showTrashed) + .add("view", view) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsResponse.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsResponse.java new file mode 100755 index 000000000..de36ca888 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListDashboardsResponse.java @@ -0,0 +1,63 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. 
+
+package com.databricks.sdk.service.dashboards;
+
+import com.databricks.sdk.support.Generated;
+import com.databricks.sdk.support.ToStringer;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import java.util.Collection;
+import java.util.Objects;
+
+@Generated
+public class ListDashboardsResponse {
+  /** */
+  @JsonProperty("dashboards")
+  private Collection<Dashboard> dashboards;
+
+  /**
+   * A token, which can be sent as `page_token` to retrieve the next page. If this field is omitted,
+   * there are no subsequent dashboards.
+   */
+  @JsonProperty("next_page_token")
+  private String nextPageToken;
+
+  public ListDashboardsResponse setDashboards(Collection<Dashboard> dashboards) {
+    this.dashboards = dashboards;
+    return this;
+  }
+
+  public Collection<Dashboard> getDashboards() {
+    return dashboards;
+  }
+
+  public ListDashboardsResponse setNextPageToken(String nextPageToken) {
+    this.nextPageToken = nextPageToken;
+    return this;
+  }
+
+  public String getNextPageToken() {
+    return nextPageToken;
+  }
+
+  @Override
+  public boolean equals(Object o) {
+    if (this == o) return true;
+    if (o == null || getClass() != o.getClass()) return false;
+    ListDashboardsResponse that = (ListDashboardsResponse) o;
+    return Objects.equals(dashboards, that.dashboards)
+        && Objects.equals(nextPageToken, that.nextPageToken);
+  }
+
+  @Override
+  public int hashCode() {
+    return Objects.hash(dashboards, nextPageToken);
+  }
+
+  @Override
+  public String toString() {
+    return new ToStringer(ListDashboardsResponse.class)
+        .add("dashboards", dashboards)
+        .add("nextPageToken", nextPageToken)
+        .toString();
+  }
+}
diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesRequest.java
new file mode 100755
index 000000000..6c636b0b6
--- /dev/null
+++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesRequest.java
@@ -0,0 +1,77 @@
+// Code
generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.QueryParam; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +/** List dashboard schedules */ +@Generated +public class ListSchedulesRequest { + /** UUID identifying the dashboard to which the schedule belongs. */ + private String dashboardId; + + /** The number of schedules to return per page. */ + @QueryParam("page_size") + private Long pageSize; + + /** + * A page token, received from a previous `ListSchedules` call. Use this to retrieve the + * subsequent page. + */ + @QueryParam("page_token") + private String pageToken; + + public ListSchedulesRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public ListSchedulesRequest setPageSize(Long pageSize) { + this.pageSize = pageSize; + return this; + } + + public Long getPageSize() { + return pageSize; + } + + public ListSchedulesRequest setPageToken(String pageToken) { + this.pageToken = pageToken; + return this; + } + + public String getPageToken() { + return pageToken; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + ListSchedulesRequest that = (ListSchedulesRequest) o; + return Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(pageSize, that.pageSize) + && Objects.equals(pageToken, that.pageToken); + } + + @Override + public int hashCode() { + return Objects.hash(dashboardId, pageSize, pageToken); + } + + @Override + public String toString() { + return new ToStringer(ListSchedulesRequest.class) + .add("dashboardId", dashboardId) + .add("pageSize", pageSize) + .add("pageToken", pageToken) + .toString(); + } +} diff --git 
a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesResponse.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesResponse.java
new file mode 100755
index 000000000..3c3b29c79
--- /dev/null
+++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSchedulesResponse.java
@@ -0,0 +1,63 @@
+// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT.
+
+package com.databricks.sdk.service.dashboards;
+
+import com.databricks.sdk.support.Generated;
+import com.databricks.sdk.support.ToStringer;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import java.util.Collection;
+import java.util.Objects;
+
+@Generated
+public class ListSchedulesResponse {
+  /**
+   * A token that can be used as a `page_token` in subsequent requests to retrieve the next page of
+   * results. If this field is omitted, there are no subsequent schedules.
+   */
+  @JsonProperty("next_page_token")
+  private String nextPageToken;
+
+  /** */
+  @JsonProperty("schedules")
+  private Collection<Schedule> schedules;
+
+  public ListSchedulesResponse setNextPageToken(String nextPageToken) {
+    this.nextPageToken = nextPageToken;
+    return this;
+  }
+
+  public String getNextPageToken() {
+    return nextPageToken;
+  }
+
+  public ListSchedulesResponse setSchedules(Collection<Schedule> schedules) {
+    this.schedules = schedules;
+    return this;
+  }
+
+  public Collection<Schedule> getSchedules() {
+    return schedules;
+  }
+
+  @Override
+  public boolean equals(Object o) {
+    if (this == o) return true;
+    if (o == null || getClass() != o.getClass()) return false;
+    ListSchedulesResponse that = (ListSchedulesResponse) o;
+    return Objects.equals(nextPageToken, that.nextPageToken)
+        && Objects.equals(schedules, that.schedules);
+  }
+
+  @Override
+  public int hashCode() {
+    return Objects.hash(nextPageToken, schedules);
+  }
+
+  @Override
+  public String toString() {
+    return new ToStringer(ListSchedulesResponse.class)
+
.add("nextPageToken", nextPageToken) + .add("schedules", schedules) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsRequest.java new file mode 100755 index 000000000..aae733108 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsRequest.java @@ -0,0 +1,91 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.QueryParam; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +/** List schedule subscriptions */ +@Generated +public class ListSubscriptionsRequest { + /** UUID identifying the dashboard to which the subscription belongs. */ + private String dashboardId; + + /** The number of subscriptions to return per page. */ + @QueryParam("page_size") + private Long pageSize; + + /** + * A page token, received from a previous `ListSubscriptions` call. Use this to retrieve the + * subsequent page. + */ + @QueryParam("page_token") + private String pageToken; + + /** UUID identifying the schedule to which the subscription belongs. 
*/ + private String scheduleId; + + public ListSubscriptionsRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public ListSubscriptionsRequest setPageSize(Long pageSize) { + this.pageSize = pageSize; + return this; + } + + public Long getPageSize() { + return pageSize; + } + + public ListSubscriptionsRequest setPageToken(String pageToken) { + this.pageToken = pageToken; + return this; + } + + public String getPageToken() { + return pageToken; + } + + public ListSubscriptionsRequest setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + ListSubscriptionsRequest that = (ListSubscriptionsRequest) o; + return Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(pageSize, that.pageSize) + && Objects.equals(pageToken, that.pageToken) + && Objects.equals(scheduleId, that.scheduleId); + } + + @Override + public int hashCode() { + return Objects.hash(dashboardId, pageSize, pageToken, scheduleId); + } + + @Override + public String toString() { + return new ToStringer(ListSubscriptionsRequest.class) + .add("dashboardId", dashboardId) + .add("pageSize", pageSize) + .add("pageToken", pageToken) + .add("scheduleId", scheduleId) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsResponse.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsResponse.java new file mode 100755 index 000000000..2c8a3a199 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/ListSubscriptionsResponse.java @@ -0,0 +1,63 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. 
DO NOT EDIT.
+
+package com.databricks.sdk.service.dashboards;
+
+import com.databricks.sdk.support.Generated;
+import com.databricks.sdk.support.ToStringer;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import java.util.Collection;
+import java.util.Objects;
+
+@Generated
+public class ListSubscriptionsResponse {
+  /**
+   * A token that can be used as a `page_token` in subsequent requests to retrieve the next page of
+   * results. If this field is omitted, there are no subsequent subscriptions.
+   */
+  @JsonProperty("next_page_token")
+  private String nextPageToken;
+
+  /** */
+  @JsonProperty("subscriptions")
+  private Collection<Subscription> subscriptions;
+
+  public ListSubscriptionsResponse setNextPageToken(String nextPageToken) {
+    this.nextPageToken = nextPageToken;
+    return this;
+  }
+
+  public String getNextPageToken() {
+    return nextPageToken;
+  }
+
+  public ListSubscriptionsResponse setSubscriptions(Collection<Subscription> subscriptions) {
+    this.subscriptions = subscriptions;
+    return this;
+  }
+
+  public Collection<Subscription> getSubscriptions() {
+    return subscriptions;
+  }
+
+  @Override
+  public boolean equals(Object o) {
+    if (this == o) return true;
+    if (o == null || getClass() != o.getClass()) return false;
+    ListSubscriptionsResponse that = (ListSubscriptionsResponse) o;
+    return Objects.equals(nextPageToken, that.nextPageToken)
+        && Objects.equals(subscriptions, that.subscriptions);
+  }
+
+  @Override
+  public int hashCode() {
+    return Objects.hash(nextPageToken, subscriptions);
+  }
+
+  @Override
+  public String toString() {
+    return new ToStringer(ListSubscriptionsResponse.class)
+        .add("nextPageToken", nextPageToken)
+        .add("subscriptions", subscriptions)
+        .toString();
+  }
+}
diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Schedule.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Schedule.java
new file mode 100755
index 000000000..8897fe162
--- /dev/null
+++
b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Schedule.java @@ -0,0 +1,161 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class Schedule { + /** A timestamp indicating when the schedule was created. */ + @JsonProperty("create_time") + private String createTime; + + /** The cron expression describing the frequency of the periodic refresh for this schedule. */ + @JsonProperty("cron_schedule") + private CronSchedule cronSchedule; + + /** UUID identifying the dashboard to which the schedule belongs. */ + @JsonProperty("dashboard_id") + private String dashboardId; + + /** The display name for schedule. */ + @JsonProperty("display_name") + private String displayName; + + /** + * The etag for the schedule. Must be left empty on create, must be provided on updates to ensure + * that the schedule has not been modified since the last read, and can be optionally provided on + * delete. + */ + @JsonProperty("etag") + private String etag; + + /** The status indicates whether this schedule is paused or not. */ + @JsonProperty("pause_status") + private SchedulePauseStatus pauseStatus; + + /** UUID identifying the schedule. */ + @JsonProperty("schedule_id") + private String scheduleId; + + /** A timestamp indicating when the schedule was last updated. 
*/ + @JsonProperty("update_time") + private String updateTime; + + public Schedule setCreateTime(String createTime) { + this.createTime = createTime; + return this; + } + + public String getCreateTime() { + return createTime; + } + + public Schedule setCronSchedule(CronSchedule cronSchedule) { + this.cronSchedule = cronSchedule; + return this; + } + + public CronSchedule getCronSchedule() { + return cronSchedule; + } + + public Schedule setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public Schedule setDisplayName(String displayName) { + this.displayName = displayName; + return this; + } + + public String getDisplayName() { + return displayName; + } + + public Schedule setEtag(String etag) { + this.etag = etag; + return this; + } + + public String getEtag() { + return etag; + } + + public Schedule setPauseStatus(SchedulePauseStatus pauseStatus) { + this.pauseStatus = pauseStatus; + return this; + } + + public SchedulePauseStatus getPauseStatus() { + return pauseStatus; + } + + public Schedule setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + public Schedule setUpdateTime(String updateTime) { + this.updateTime = updateTime; + return this; + } + + public String getUpdateTime() { + return updateTime; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + Schedule that = (Schedule) o; + return Objects.equals(createTime, that.createTime) + && Objects.equals(cronSchedule, that.cronSchedule) + && Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(displayName, that.displayName) + && Objects.equals(etag, that.etag) + && Objects.equals(pauseStatus, that.pauseStatus) + && Objects.equals(scheduleId, that.scheduleId) + && Objects.equals(updateTime, that.updateTime); 
+ } + + @Override + public int hashCode() { + return Objects.hash( + createTime, + cronSchedule, + dashboardId, + displayName, + etag, + pauseStatus, + scheduleId, + updateTime); + } + + @Override + public String toString() { + return new ToStringer(Schedule.class) + .add("createTime", createTime) + .add("cronSchedule", cronSchedule) + .add("dashboardId", dashboardId) + .add("displayName", displayName) + .add("etag", etag) + .add("pauseStatus", pauseStatus) + .add("scheduleId", scheduleId) + .add("updateTime", updateTime) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SchedulePauseStatus.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SchedulePauseStatus.java new file mode 100755 index 000000000..6872e88d7 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SchedulePauseStatus.java @@ -0,0 +1,11 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; + +@Generated +public enum SchedulePauseStatus { + PAUSED, + UNPAUSED, +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscriber.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscriber.java new file mode 100755 index 000000000..b1677565e --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscriber.java @@ -0,0 +1,66 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class Subscriber { + /** + * The destination to receive the subscription email. 
This parameter is mutually exclusive with + * `user_subscriber`. + */ + @JsonProperty("destination_subscriber") + private SubscriptionSubscriberDestination destinationSubscriber; + + /** + * The user to receive the subscription email. This parameter is mutually exclusive with + * `destination_subscriber`. + */ + @JsonProperty("user_subscriber") + private SubscriptionSubscriberUser userSubscriber; + + public Subscriber setDestinationSubscriber( + SubscriptionSubscriberDestination destinationSubscriber) { + this.destinationSubscriber = destinationSubscriber; + return this; + } + + public SubscriptionSubscriberDestination getDestinationSubscriber() { + return destinationSubscriber; + } + + public Subscriber setUserSubscriber(SubscriptionSubscriberUser userSubscriber) { + this.userSubscriber = userSubscriber; + return this; + } + + public SubscriptionSubscriberUser getUserSubscriber() { + return userSubscriber; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + Subscriber that = (Subscriber) o; + return Objects.equals(destinationSubscriber, that.destinationSubscriber) + && Objects.equals(userSubscriber, that.userSubscriber); + } + + @Override + public int hashCode() { + return Objects.hash(destinationSubscriber, userSubscriber); + } + + @Override + public String toString() { + return new ToStringer(Subscriber.class) + .add("destinationSubscriber", destinationSubscriber) + .add("userSubscriber", userSubscriber) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscription.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscription.java new file mode 100755 index 000000000..c271bee40 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/Subscription.java @@ -0,0 +1,163 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. 
+ +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class Subscription { + /** A timestamp indicating when the subscription was created. */ + @JsonProperty("create_time") + private String createTime; + + /** + * UserId of the user who adds subscribers (users or notification destinations) to the dashboard's + * schedule. + */ + @JsonProperty("created_by_user_id") + private Long createdByUserId; + + /** UUID identifying the dashboard to which the subscription belongs. */ + @JsonProperty("dashboard_id") + private String dashboardId; + + /** + * The etag for the subscription. Must be left empty on create, can be optionally provided on + * delete to ensure that the subscription has not been deleted since the last read. + */ + @JsonProperty("etag") + private String etag; + + /** UUID identifying the schedule to which the subscription belongs. */ + @JsonProperty("schedule_id") + private String scheduleId; + + /** Subscriber details for users and destinations to be added as subscribers to the schedule. */ + @JsonProperty("subscriber") + private Subscriber subscriber; + + /** UUID identifying the subscription. */ + @JsonProperty("subscription_id") + private String subscriptionId; + + /** A timestamp indicating when the subscription was last updated. 
*/ + @JsonProperty("update_time") + private String updateTime; + + public Subscription setCreateTime(String createTime) { + this.createTime = createTime; + return this; + } + + public String getCreateTime() { + return createTime; + } + + public Subscription setCreatedByUserId(Long createdByUserId) { + this.createdByUserId = createdByUserId; + return this; + } + + public Long getCreatedByUserId() { + return createdByUserId; + } + + public Subscription setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public Subscription setEtag(String etag) { + this.etag = etag; + return this; + } + + public String getEtag() { + return etag; + } + + public Subscription setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + public Subscription setSubscriber(Subscriber subscriber) { + this.subscriber = subscriber; + return this; + } + + public Subscriber getSubscriber() { + return subscriber; + } + + public Subscription setSubscriptionId(String subscriptionId) { + this.subscriptionId = subscriptionId; + return this; + } + + public String getSubscriptionId() { + return subscriptionId; + } + + public Subscription setUpdateTime(String updateTime) { + this.updateTime = updateTime; + return this; + } + + public String getUpdateTime() { + return updateTime; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + Subscription that = (Subscription) o; + return Objects.equals(createTime, that.createTime) + && Objects.equals(createdByUserId, that.createdByUserId) + && Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(etag, that.etag) + && Objects.equals(scheduleId, that.scheduleId) + && Objects.equals(subscriber, that.subscriber) + && Objects.equals(subscriptionId, that.subscriptionId) + && 
Objects.equals(updateTime, that.updateTime); + } + + @Override + public int hashCode() { + return Objects.hash( + createTime, + createdByUserId, + dashboardId, + etag, + scheduleId, + subscriber, + subscriptionId, + updateTime); + } + + @Override + public String toString() { + return new ToStringer(Subscription.class) + .add("createTime", createTime) + .add("createdByUserId", createdByUserId) + .add("dashboardId", dashboardId) + .add("etag", etag) + .add("scheduleId", scheduleId) + .add("subscriber", subscriber) + .add("subscriptionId", subscriptionId) + .add("updateTime", updateTime) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberDestination.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberDestination.java new file mode 100755 index 000000000..cfdbdda70 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberDestination.java @@ -0,0 +1,44 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class SubscriptionSubscriberDestination { + /** The canonical identifier of the destination to receive email notification. 
*/ + @JsonProperty("destination_id") + private String destinationId; + + public SubscriptionSubscriberDestination setDestinationId(String destinationId) { + this.destinationId = destinationId; + return this; + } + + public String getDestinationId() { + return destinationId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + SubscriptionSubscriberDestination that = (SubscriptionSubscriberDestination) o; + return Objects.equals(destinationId, that.destinationId); + } + + @Override + public int hashCode() { + return Objects.hash(destinationId); + } + + @Override + public String toString() { + return new ToStringer(SubscriptionSubscriberDestination.class) + .add("destinationId", destinationId) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberUser.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberUser.java new file mode 100755 index 000000000..0338eac01 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/SubscriptionSubscriberUser.java @@ -0,0 +1,42 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class SubscriptionSubscriberUser { + /** UserId of the subscriber. 
*/ + @JsonProperty("user_id") + private Long userId; + + public SubscriptionSubscriberUser setUserId(Long userId) { + this.userId = userId; + return this; + } + + public Long getUserId() { + return userId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + SubscriptionSubscriberUser that = (SubscriptionSubscriberUser) o; + return Objects.equals(userId, that.userId); + } + + @Override + public int hashCode() { + return Objects.hash(userId); + } + + @Override + public String toString() { + return new ToStringer(SubscriptionSubscriberUser.class).add("userId", userId).toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/UpdateScheduleRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/UpdateScheduleRequest.java new file mode 100755 index 000000000..81a403800 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/dashboards/UpdateScheduleRequest.java @@ -0,0 +1,121 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.dashboards; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class UpdateScheduleRequest { + /** The cron expression describing the frequency of the periodic refresh for this schedule. */ + @JsonProperty("cron_schedule") + private CronSchedule cronSchedule; + + /** UUID identifying the dashboard to which the schedule belongs. */ + private String dashboardId; + + /** The display name for schedule. */ + @JsonProperty("display_name") + private String displayName; + + /** + * The etag for the schedule. 
Must be left empty on create, must be provided on updates to ensure + * that the schedule has not been modified since the last read, and can be optionally provided on + * delete. + */ + @JsonProperty("etag") + private String etag; + + /** The status indicates whether this schedule is paused or not. */ + @JsonProperty("pause_status") + private SchedulePauseStatus pauseStatus; + + /** UUID identifying the schedule. */ + private String scheduleId; + + public UpdateScheduleRequest setCronSchedule(CronSchedule cronSchedule) { + this.cronSchedule = cronSchedule; + return this; + } + + public CronSchedule getCronSchedule() { + return cronSchedule; + } + + public UpdateScheduleRequest setDashboardId(String dashboardId) { + this.dashboardId = dashboardId; + return this; + } + + public String getDashboardId() { + return dashboardId; + } + + public UpdateScheduleRequest setDisplayName(String displayName) { + this.displayName = displayName; + return this; + } + + public String getDisplayName() { + return displayName; + } + + public UpdateScheduleRequest setEtag(String etag) { + this.etag = etag; + return this; + } + + public String getEtag() { + return etag; + } + + public UpdateScheduleRequest setPauseStatus(SchedulePauseStatus pauseStatus) { + this.pauseStatus = pauseStatus; + return this; + } + + public SchedulePauseStatus getPauseStatus() { + return pauseStatus; + } + + public UpdateScheduleRequest setScheduleId(String scheduleId) { + this.scheduleId = scheduleId; + return this; + } + + public String getScheduleId() { + return scheduleId; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + UpdateScheduleRequest that = (UpdateScheduleRequest) o; + return Objects.equals(cronSchedule, that.cronSchedule) + && Objects.equals(dashboardId, that.dashboardId) + && Objects.equals(displayName, that.displayName) + && Objects.equals(etag, that.etag) + && Objects.equals(pauseStatus, 
that.pauseStatus) + && Objects.equals(scheduleId, that.scheduleId); + } + + @Override + public int hashCode() { + return Objects.hash(cronSchedule, dashboardId, displayName, etag, pauseStatus, scheduleId); + } + + @Override + public String toString() { + return new ToStringer(UpdateScheduleRequest.class) + .add("cronSchedule", cronSchedule) + .add("dashboardId", dashboardId) + .add("displayName", displayName) + .add("etag", etag) + .add("pauseStatus", pauseStatus) + .add("scheduleId", scheduleId) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobEmailNotifications.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobEmailNotifications.java index f723580f6..41345592f 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobEmailNotifications.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobEmailNotifications.java @@ -39,6 +39,16 @@ public class JobEmailNotifications { @JsonProperty("on_start") private Collection onStart; + /** + * A list of email addresses to notify when any streaming backlog thresholds are exceeded for any + * stream. Streaming backlog thresholds can be set in the `health` field using the following + * metrics: `STREAMING_BACKLOG_BYTES`, `STREAMING_BACKLOG_RECORDS`, `STREAMING_BACKLOG_SECONDS`, + * or `STREAMING_BACKLOG_FILES`. Alerting is based on the 10-minute average of these metrics. If + * the issue persists, notifications are resent every 30 minutes. + */ + @JsonProperty("on_streaming_backlog_exceeded") + private Collection onStreamingBacklogExceeded; + /** * A list of email addresses to be notified when a run successfully completes. 
A run is considered * to have completed successfully if it ends with a `TERMINATED` `life_cycle_state` and a @@ -85,6 +95,16 @@ public Collection getOnStart() { return onStart; } + public JobEmailNotifications setOnStreamingBacklogExceeded( + Collection onStreamingBacklogExceeded) { + this.onStreamingBacklogExceeded = onStreamingBacklogExceeded; + return this; + } + + public Collection getOnStreamingBacklogExceeded() { + return onStreamingBacklogExceeded; + } + public JobEmailNotifications setOnSuccess(Collection onSuccess) { this.onSuccess = onSuccess; return this; @@ -104,13 +124,19 @@ public boolean equals(Object o) { onDurationWarningThresholdExceeded, that.onDurationWarningThresholdExceeded) && Objects.equals(onFailure, that.onFailure) && Objects.equals(onStart, that.onStart) + && Objects.equals(onStreamingBacklogExceeded, that.onStreamingBacklogExceeded) && Objects.equals(onSuccess, that.onSuccess); } @Override public int hashCode() { return Objects.hash( - noAlertForSkippedRuns, onDurationWarningThresholdExceeded, onFailure, onStart, onSuccess); + noAlertForSkippedRuns, + onDurationWarningThresholdExceeded, + onFailure, + onStart, + onStreamingBacklogExceeded, + onSuccess); } @Override @@ -120,6 +146,7 @@ public String toString() { .add("onDurationWarningThresholdExceeded", onDurationWarningThresholdExceeded) .add("onFailure", onFailure) .add("onStart", onStart) + .add("onStreamingBacklogExceeded", onStreamingBacklogExceeded) .add("onSuccess", onSuccess) .toString(); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobEnvironment.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobEnvironment.java index 9fb6ddbdd..764152939 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobEnvironment.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobEnvironment.java @@ -14,9 +14,8 @@ public class JobEnvironment { private String environmentKey; /** - * The a 
environment entity used to preserve serverless environment side panel and jobs' - * environment for non-notebook task. In this minimal environment spec, only pip dependencies are - * supported. Next ID: 5 + * The environment entity used to preserve serverless environment side panel and jobs' environment + * for non-notebook task. In this minimal environment spec, only pip dependencies are supported. */ @JsonProperty("spec") private com.databricks.sdk.service.compute.Environment spec; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobsHealthMetric.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobsHealthMetric.java index 2989698b5..9ae30cea2 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobsHealthMetric.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobsHealthMetric.java @@ -4,8 +4,31 @@ import com.databricks.sdk.support.Generated; -/** Specifies the health metric that is being evaluated for a particular health rule. */ +/** + * Specifies the health metric that is being evaluated for a particular health rule. + * + *
<p>
* `RUN_DURATION_SECONDS`: Expected total time for a run in seconds. * + * `STREAMING_BACKLOG_BYTES`: An estimate of the maximum bytes of data waiting to be consumed across + * all streams. This metric is in Private Preview. * `STREAMING_BACKLOG_RECORDS`: An estimate of the + * maximum offset lag across all streams. This metric is in Private Preview. * + * `STREAMING_BACKLOG_SECONDS`: An estimate of the maximum consumer delay across all streams. This + * metric is in Private Preview. * `STREAMING_BACKLOG_FILES`: An estimate of the maximum number of + * outstanding files across all streams. This metric is in Private Preview. + */ @Generated public enum JobsHealthMetric { - RUN_DURATION_SECONDS, + RUN_DURATION_SECONDS, // Expected total time for a run in seconds. + STREAMING_BACKLOG_BYTES, // An estimate of the maximum bytes of data waiting to be consumed across + // all + // streams. This metric is in Private Preview. + STREAMING_BACKLOG_FILES, // An estimate of the maximum number of outstanding files across all + // streams. + // This metric is in Private Preview. + STREAMING_BACKLOG_RECORDS, // An estimate of the maximum offset lag across all streams. This + // metric is in + // Private Preview. + STREAMING_BACKLOG_SECONDS, // An estimate of the maximum consumer delay across all streams. This + // metric is + // in Private Preview. + } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobsHealthRule.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobsHealthRule.java index a7c9589f9..406782fdd 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobsHealthRule.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/JobsHealthRule.java @@ -9,7 +9,17 @@ @Generated public class JobsHealthRule { - /** Specifies the health metric that is being evaluated for a particular health rule. 
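The new streaming backlog metrics are meant to be used inside job health rules. A minimal sketch of wiring one up — `JobsHealthRule` and `JobsHealthMetric` come from this patch, while `JobsHealthRules` and `JobsHealthOperator` are assumed from the existing `com.databricks.sdk.service.jobs` package:

```java
import com.databricks.sdk.service.jobs.JobsHealthMetric;
import com.databricks.sdk.service.jobs.JobsHealthOperator;
import com.databricks.sdk.service.jobs.JobsHealthRule;
import com.databricks.sdk.service.jobs.JobsHealthRules;
import java.util.Arrays;

public class HealthRuleExample {
  public static void main(String[] args) {
    // Alert when any stream's estimated consumer delay exceeds 10 minutes.
    // STREAMING_BACKLOG_SECONDS is in Private Preview per the javadoc above.
    JobsHealthRules health =
        new JobsHealthRules()
            .setRules(
                Arrays.asList(
                    new JobsHealthRule()
                        .setMetric(JobsHealthMetric.STREAMING_BACKLOG_SECONDS)
                        .setOp(JobsHealthOperator.GREATER_THAN)
                        .setValue(600L)));
    System.out.println(health);
  }
}
```

A rule like this is what the new `on_streaming_backlog_exceeded` email lists key off: when the 10-minute average of the metric crosses the threshold, the listed addresses are notified.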
*/ + /** + * Specifies the health metric that is being evaluated for a particular health rule. + * + *
<p>
* `RUN_DURATION_SECONDS`: Expected total time for a run in seconds. * + * `STREAMING_BACKLOG_BYTES`: An estimate of the maximum bytes of data waiting to be consumed + * across all streams. This metric is in Private Preview. * `STREAMING_BACKLOG_RECORDS`: An + * estimate of the maximum offset lag across all streams. This metric is in Private Preview. * + * `STREAMING_BACKLOG_SECONDS`: An estimate of the maximum consumer delay across all streams. This + * metric is in Private Preview. * `STREAMING_BACKLOG_FILES`: An estimate of the maximum number of + * outstanding files across all streams. This metric is in Private Preview. + */ @JsonProperty("metric") private JobsHealthMetric metric; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfiguration.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfiguration.java new file mode 100755 index 000000000..46e62b48e --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfiguration.java @@ -0,0 +1,58 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.jobs; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class PeriodicTriggerConfiguration { + /** The interval at which the trigger should run. */ + @JsonProperty("interval") + private Long interval; + + /** The unit of time for the interval. 
*/ + @JsonProperty("unit") + private PeriodicTriggerConfigurationTimeUnit unit; + + public PeriodicTriggerConfiguration setInterval(Long interval) { + this.interval = interval; + return this; + } + + public Long getInterval() { + return interval; + } + + public PeriodicTriggerConfiguration setUnit(PeriodicTriggerConfigurationTimeUnit unit) { + this.unit = unit; + return this; + } + + public PeriodicTriggerConfigurationTimeUnit getUnit() { + return unit; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + PeriodicTriggerConfiguration that = (PeriodicTriggerConfiguration) o; + return Objects.equals(interval, that.interval) && Objects.equals(unit, that.unit); + } + + @Override + public int hashCode() { + return Objects.hash(interval, unit); + } + + @Override + public String toString() { + return new ToStringer(PeriodicTriggerConfiguration.class) + .add("interval", interval) + .add("unit", unit) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfigurationTimeUnit.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfigurationTimeUnit.java new file mode 100755 index 000000000..260063b46 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/PeriodicTriggerConfigurationTimeUnit.java @@ -0,0 +1,13 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. 
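The new `PeriodicTriggerConfiguration` and its time-unit enum plug into `TriggerSettings` (whose `periodic` field is added later in this patch). A minimal sketch using the setters from this diff — `PauseStatus.UNPAUSED` is assumed from the existing SDK:

```java
import com.databricks.sdk.service.jobs.PauseStatus;
import com.databricks.sdk.service.jobs.PeriodicTriggerConfiguration;
import com.databricks.sdk.service.jobs.PeriodicTriggerConfigurationTimeUnit;
import com.databricks.sdk.service.jobs.TriggerSettings;

public class PeriodicTriggerExample {
  public static void main(String[] args) {
    // Trigger the job every 4 hours; the trigger starts unpaused.
    TriggerSettings trigger =
        new TriggerSettings()
            .setPeriodic(
                new PeriodicTriggerConfiguration()
                    .setInterval(4L)
                    .setUnit(PeriodicTriggerConfigurationTimeUnit.HOURS))
            .setPauseStatus(PauseStatus.UNPAUSED);
    System.out.println(trigger);
  }
}
```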
+ +package com.databricks.sdk.service.jobs; + +import com.databricks.sdk.support.Generated; + +@Generated +public enum PeriodicTriggerConfigurationTimeUnit { + DAYS, + HOURS, + TIME_UNIT_UNSPECIFIED, + WEEKS, +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/RunTask.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/RunTask.java index 4a46148a5..7fa8e90f3 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/RunTask.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/RunTask.java @@ -78,6 +78,13 @@ public class RunTask { @JsonProperty("end_time") private Long endTime; + /** + * The key that references an environment spec in a job. This field is required for Python script, + * Python wheel and dbt tasks when using serverless compute. + */ + @JsonProperty("environment_key") + private String environmentKey; + /** * The time in milliseconds it took to execute the commands in the JAR or notebook until they * completed, failed, timed out, were cancelled, or encountered an unexpected error. 
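The new `environment_key` and `environments` fields work together on one-time runs: a task references, by key, a `JobEnvironment` declared at the run level. A hypothetical sketch — `SubmitRun.setEnvironments` and `SubmitTask.setEnvironmentKey` come from this patch, while the `Environment` spec fields (`client`, `dependencies`) and the `SparkPythonTask` setters are assumed from the existing SDK:

```java
import com.databricks.sdk.service.compute.Environment;
import com.databricks.sdk.service.jobs.JobEnvironment;
import com.databricks.sdk.service.jobs.SparkPythonTask;
import com.databricks.sdk.service.jobs.SubmitRun;
import com.databricks.sdk.service.jobs.SubmitTask;
import java.util.Arrays;

public class ServerlessEnvironmentExample {
  public static void main(String[] args) {
    SubmitRun run =
        new SubmitRun()
            .setRunName("serverless-python-run")
            // Declare a minimal serverless environment; only pip dependencies
            // are supported in this spec (see JobEnvironment above).
            .setEnvironments(
                Arrays.asList(
                    new JobEnvironment()
                        .setEnvironmentKey("default")
                        .setSpec(
                            new Environment()
                                .setClient("1")
                                .setDependencies(Arrays.asList("requests")))))
            .setTasks(
                Arrays.asList(
                    new SubmitTask()
                        .setTaskKey("score")
                        // Required for Python script, Python wheel and dbt
                        // tasks on serverless compute.
                        .setEnvironmentKey("default")
                        .setSparkPythonTask(
                            new SparkPythonTask().setPythonFile("/Workspace/score.py"))));
    System.out.println(run);
  }
}
```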
The duration @@ -338,6 +345,15 @@ public Long getEndTime() { return endTime; } + public RunTask setEnvironmentKey(String environmentKey) { + this.environmentKey = environmentKey; + return this; + } + + public String getEnvironmentKey() { + return environmentKey; + } + public RunTask setExecutionDuration(Long executionDuration) { this.executionDuration = executionDuration; return this; @@ -604,6 +620,7 @@ public boolean equals(Object o) { && Objects.equals(description, that.description) && Objects.equals(emailNotifications, that.emailNotifications) && Objects.equals(endTime, that.endTime) + && Objects.equals(environmentKey, that.environmentKey) && Objects.equals(executionDuration, that.executionDuration) && Objects.equals(existingClusterId, that.existingClusterId) && Objects.equals(forEachTask, that.forEachTask) @@ -646,6 +663,7 @@ public int hashCode() { description, emailNotifications, endTime, + environmentKey, executionDuration, existingClusterId, forEachTask, @@ -688,6 +706,7 @@ public String toString() { .add("description", description) .add("emailNotifications", emailNotifications) .add("endTime", endTime) + .add("environmentKey", environmentKey) .add("executionDuration", executionDuration) .add("existingClusterId", existingClusterId) .add("forEachTask", forEachTask) diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/SubmitRun.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/SubmitRun.java index f9602d388..c07f088d5 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/SubmitRun.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/SubmitRun.java @@ -14,25 +14,17 @@ public class SubmitRun { @JsonProperty("access_control_list") private Collection accessControlList; - /** - * If condition_task, specifies a condition with an outcome that can be used to control the - * execution of other tasks. 
Does not require a cluster to execute and does not support retries or - * notifications. - */ - @JsonProperty("condition_task") - private ConditionTask conditionTask; - - /** - * If dbt_task, indicates that this must execute a dbt task. It requires both Databricks SQL and - * the ability to use a serverless or a pro SQL warehouse. - */ - @JsonProperty("dbt_task") - private DbtTask dbtTask; - /** An optional set of email addresses notified when the run begins or completes. */ @JsonProperty("email_notifications") private JobEmailNotifications emailNotifications; + /** + * A list of task execution environment specifications that can be referenced by tasks of this + * run. + */ + @JsonProperty("environments") + private Collection environments; + /** * An optional specification for a remote Git repository containing the source code used by tasks. * Version-controlled source code is supported by notebook, dbt, Python script, and SQL File @@ -69,13 +61,6 @@ public class SubmitRun { @JsonProperty("idempotency_token") private String idempotencyToken; - /** - * If notebook_task, indicates that this task must run a notebook. This field may not be specified - * in conjunction with spark_jar_task. - */ - @JsonProperty("notebook_task") - private NotebookTask notebookTask; - /** * Optional notification settings that are used when sending notifications to each of the * `email_notifications` and `webhook_notifications` for this run. @@ -83,14 +68,6 @@ public class SubmitRun { @JsonProperty("notification_settings") private JobNotificationSettings notificationSettings; - /** If pipeline_task, indicates that this task must execute a Pipeline. */ - @JsonProperty("pipeline_task") - private PipelineTask pipelineTask; - - /** If python_wheel_task, indicates that this job must execute a PythonWheel. */ - @JsonProperty("python_wheel_task") - private PythonWheelTask pythonWheelTask; - /** The queue settings of the one-time run. 
*/ @JsonProperty("queue") private QueueSettings queue; @@ -102,46 +79,10 @@ public class SubmitRun { @JsonProperty("run_as") private JobRunAs runAs; - /** If run_job_task, indicates that this task must execute another job. */ - @JsonProperty("run_job_task") - private RunJobTask runJobTask; - /** An optional name for the run. The default value is `Untitled`. */ @JsonProperty("run_name") private String runName; - /** If spark_jar_task, indicates that this task must run a JAR. */ - @JsonProperty("spark_jar_task") - private SparkJarTask sparkJarTask; - - /** If spark_python_task, indicates that this task must run a Python file. */ - @JsonProperty("spark_python_task") - private SparkPythonTask sparkPythonTask; - - /** - * If `spark_submit_task`, indicates that this task must be launched by the spark submit script. - * This task can run only on new clusters. - * - *
<p>
In the `new_cluster` specification, `libraries` and `spark_conf` are not supported. Instead, - * use `--jars` and `--py-files` to add Java and Python libraries and `--conf` to set the Spark - * configurations. - * - *
<p>
`master`, `deploy-mode`, and `executor-cores` are automatically configured by Databricks; - * you _cannot_ specify them in parameters. - * - *
<p>
By default, the Spark submit job uses all available memory (excluding reserved memory for - * Databricks services). You can set `--driver-memory`, and `--executor-memory` to a smaller value - * to leave some room for off-heap usage. - * - *
<p>
The `--jars`, `--py-files`, `--files` arguments support DBFS and S3 paths. - */ - @JsonProperty("spark_submit_task") - private SparkSubmitTask sparkSubmitTask; - - /** If sql_task, indicates that this job must execute a SQL task. */ - @JsonProperty("sql_task") - private SqlTask sqlTask; - /** */ @JsonProperty("tasks") private Collection tasks; @@ -164,31 +105,22 @@ public Collection getAccess return accessControlList; } - public SubmitRun setConditionTask(ConditionTask conditionTask) { - this.conditionTask = conditionTask; - return this; - } - - public ConditionTask getConditionTask() { - return conditionTask; - } - - public SubmitRun setDbtTask(DbtTask dbtTask) { - this.dbtTask = dbtTask; + public SubmitRun setEmailNotifications(JobEmailNotifications emailNotifications) { + this.emailNotifications = emailNotifications; return this; } - public DbtTask getDbtTask() { - return dbtTask; + public JobEmailNotifications getEmailNotifications() { + return emailNotifications; } - public SubmitRun setEmailNotifications(JobEmailNotifications emailNotifications) { - this.emailNotifications = emailNotifications; + public SubmitRun setEnvironments(Collection environments) { + this.environments = environments; return this; } - public JobEmailNotifications getEmailNotifications() { - return emailNotifications; + public Collection getEnvironments() { + return environments; } public SubmitRun setGitSource(GitSource gitSource) { @@ -218,15 +150,6 @@ public String getIdempotencyToken() { return idempotencyToken; } - public SubmitRun setNotebookTask(NotebookTask notebookTask) { - this.notebookTask = notebookTask; - return this; - } - - public NotebookTask getNotebookTask() { - return notebookTask; - } - public SubmitRun setNotificationSettings(JobNotificationSettings notificationSettings) { this.notificationSettings = notificationSettings; return this; @@ -236,24 +159,6 @@ public JobNotificationSettings getNotificationSettings() { return notificationSettings; } - public SubmitRun 
setPipelineTask(PipelineTask pipelineTask) { - this.pipelineTask = pipelineTask; - return this; - } - - public PipelineTask getPipelineTask() { - return pipelineTask; - } - - public SubmitRun setPythonWheelTask(PythonWheelTask pythonWheelTask) { - this.pythonWheelTask = pythonWheelTask; - return this; - } - - public PythonWheelTask getPythonWheelTask() { - return pythonWheelTask; - } - public SubmitRun setQueue(QueueSettings queue) { this.queue = queue; return this; @@ -272,15 +177,6 @@ public JobRunAs getRunAs() { return runAs; } - public SubmitRun setRunJobTask(RunJobTask runJobTask) { - this.runJobTask = runJobTask; - return this; - } - - public RunJobTask getRunJobTask() { - return runJobTask; - } - public SubmitRun setRunName(String runName) { this.runName = runName; return this; @@ -290,42 +186,6 @@ public String getRunName() { return runName; } - public SubmitRun setSparkJarTask(SparkJarTask sparkJarTask) { - this.sparkJarTask = sparkJarTask; - return this; - } - - public SparkJarTask getSparkJarTask() { - return sparkJarTask; - } - - public SubmitRun setSparkPythonTask(SparkPythonTask sparkPythonTask) { - this.sparkPythonTask = sparkPythonTask; - return this; - } - - public SparkPythonTask getSparkPythonTask() { - return sparkPythonTask; - } - - public SubmitRun setSparkSubmitTask(SparkSubmitTask sparkSubmitTask) { - this.sparkSubmitTask = sparkSubmitTask; - return this; - } - - public SparkSubmitTask getSparkSubmitTask() { - return sparkSubmitTask; - } - - public SubmitRun setSqlTask(SqlTask sqlTask) { - this.sqlTask = sqlTask; - return this; - } - - public SqlTask getSqlTask() { - return sqlTask; - } - public SubmitRun setTasks(Collection tasks) { this.tasks = tasks; return this; @@ -359,24 +219,15 @@ public boolean equals(Object o) { if (o == null || getClass() != o.getClass()) return false; SubmitRun that = (SubmitRun) o; return Objects.equals(accessControlList, that.accessControlList) - && Objects.equals(conditionTask, that.conditionTask) - && 
Objects.equals(dbtTask, that.dbtTask) && Objects.equals(emailNotifications, that.emailNotifications) + && Objects.equals(environments, that.environments) && Objects.equals(gitSource, that.gitSource) && Objects.equals(health, that.health) && Objects.equals(idempotencyToken, that.idempotencyToken) - && Objects.equals(notebookTask, that.notebookTask) && Objects.equals(notificationSettings, that.notificationSettings) - && Objects.equals(pipelineTask, that.pipelineTask) - && Objects.equals(pythonWheelTask, that.pythonWheelTask) && Objects.equals(queue, that.queue) && Objects.equals(runAs, that.runAs) - && Objects.equals(runJobTask, that.runJobTask) && Objects.equals(runName, that.runName) - && Objects.equals(sparkJarTask, that.sparkJarTask) - && Objects.equals(sparkPythonTask, that.sparkPythonTask) - && Objects.equals(sparkSubmitTask, that.sparkSubmitTask) - && Objects.equals(sqlTask, that.sqlTask) && Objects.equals(tasks, that.tasks) && Objects.equals(timeoutSeconds, that.timeoutSeconds) && Objects.equals(webhookNotifications, that.webhookNotifications); @@ -386,24 +237,15 @@ public boolean equals(Object o) { public int hashCode() { return Objects.hash( accessControlList, - conditionTask, - dbtTask, emailNotifications, + environments, gitSource, health, idempotencyToken, - notebookTask, notificationSettings, - pipelineTask, - pythonWheelTask, queue, runAs, - runJobTask, runName, - sparkJarTask, - sparkPythonTask, - sparkSubmitTask, - sqlTask, tasks, timeoutSeconds, webhookNotifications); @@ -413,24 +255,15 @@ public int hashCode() { public String toString() { return new ToStringer(SubmitRun.class) .add("accessControlList", accessControlList) - .add("conditionTask", conditionTask) - .add("dbtTask", dbtTask) .add("emailNotifications", emailNotifications) + .add("environments", environments) .add("gitSource", gitSource) .add("health", health) .add("idempotencyToken", idempotencyToken) - .add("notebookTask", notebookTask) .add("notificationSettings", notificationSettings) 
- .add("pipelineTask", pipelineTask) - .add("pythonWheelTask", pythonWheelTask) .add("queue", queue) .add("runAs", runAs) - .add("runJobTask", runJobTask) .add("runName", runName) - .add("sparkJarTask", sparkJarTask) - .add("sparkPythonTask", sparkPythonTask) - .add("sparkSubmitTask", sparkSubmitTask) - .add("sqlTask", sqlTask) .add("tasks", tasks) .add("timeoutSeconds", timeoutSeconds) .add("webhookNotifications", webhookNotifications) diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/SubmitTask.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/SubmitTask.java index 745b41fde..6f8a12c10 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/SubmitTask.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/SubmitTask.java @@ -18,6 +18,13 @@ public class SubmitTask { @JsonProperty("condition_task") private ConditionTask conditionTask; + /** + * If dbt_task, indicates that this must execute a dbt task. It requires both Databricks SQL and + * the ability to use a serverless or a pro SQL warehouse. + */ + @JsonProperty("dbt_task") + private DbtTask dbtTask; + /** * An optional array of objects specifying the dependency graph of the task. All tasks specified * in this field must complete successfully before executing this task. The key is `task_key`, and @@ -37,6 +44,13 @@ public class SubmitTask { @JsonProperty("email_notifications") private JobEmailNotifications emailNotifications; + /** + * The key that references an environment spec in a job. This field is required for Python script, + * Python wheel and dbt tasks when using serverless compute. + */ + @JsonProperty("environment_key") + private String environmentKey; + /** * If existing_cluster_id, the ID of an existing cluster that is used for all runs. 
When running * jobs or tasks on an existing cluster, you may need to manually restart the cluster if it stops @@ -159,6 +173,15 @@ public ConditionTask getConditionTask() { return conditionTask; } + public SubmitTask setDbtTask(DbtTask dbtTask) { + this.dbtTask = dbtTask; + return this; + } + + public DbtTask getDbtTask() { + return dbtTask; + } + public SubmitTask setDependsOn(Collection dependsOn) { this.dependsOn = dependsOn; return this; @@ -186,6 +209,15 @@ public JobEmailNotifications getEmailNotifications() { return emailNotifications; } + public SubmitTask setEnvironmentKey(String environmentKey) { + this.environmentKey = environmentKey; + return this; + } + + public String getEnvironmentKey() { + return environmentKey; + } + public SubmitTask setExistingClusterId(String existingClusterId) { this.existingClusterId = existingClusterId; return this; @@ -354,9 +386,11 @@ public boolean equals(Object o) { if (o == null || getClass() != o.getClass()) return false; SubmitTask that = (SubmitTask) o; return Objects.equals(conditionTask, that.conditionTask) + && Objects.equals(dbtTask, that.dbtTask) && Objects.equals(dependsOn, that.dependsOn) && Objects.equals(description, that.description) && Objects.equals(emailNotifications, that.emailNotifications) + && Objects.equals(environmentKey, that.environmentKey) && Objects.equals(existingClusterId, that.existingClusterId) && Objects.equals(forEachTask, that.forEachTask) && Objects.equals(health, that.health) @@ -381,9 +415,11 @@ public boolean equals(Object o) { public int hashCode() { return Objects.hash( conditionTask, + dbtTask, dependsOn, description, emailNotifications, + environmentKey, existingClusterId, forEachTask, health, @@ -408,9 +444,11 @@ public int hashCode() { public String toString() { return new ToStringer(SubmitTask.class) .add("conditionTask", conditionTask) + .add("dbtTask", dbtTask) .add("dependsOn", dependsOn) .add("description", description) .add("emailNotifications", emailNotifications) + 
.add("environmentKey", environmentKey) .add("existingClusterId", existingClusterId) .add("forEachTask", forEachTask) .add("health", health) diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/TaskEmailNotifications.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/TaskEmailNotifications.java index 03e7e8faf..88b847f07 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/TaskEmailNotifications.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/TaskEmailNotifications.java @@ -39,6 +39,16 @@ public class TaskEmailNotifications { @JsonProperty("on_start") private Collection onStart; + /** + * A list of email addresses to notify when any streaming backlog thresholds are exceeded for any + * stream. Streaming backlog thresholds can be set in the `health` field using the following + * metrics: `STREAMING_BACKLOG_BYTES`, `STREAMING_BACKLOG_RECORDS`, `STREAMING_BACKLOG_SECONDS`, + * or `STREAMING_BACKLOG_FILES`. Alerting is based on the 10-minute average of these metrics. If + * the issue persists, notifications are resent every 30 minutes. + */ + @JsonProperty("on_streaming_backlog_exceeded") + private Collection onStreamingBacklogExceeded; + /** * A list of email addresses to be notified when a run successfully completes. 
A run is considered * to have completed successfully if it ends with a `TERMINATED` `life_cycle_state` and a @@ -85,6 +95,16 @@ public Collection getOnStart() { return onStart; } + public TaskEmailNotifications setOnStreamingBacklogExceeded( + Collection onStreamingBacklogExceeded) { + this.onStreamingBacklogExceeded = onStreamingBacklogExceeded; + return this; + } + + public Collection getOnStreamingBacklogExceeded() { + return onStreamingBacklogExceeded; + } + public TaskEmailNotifications setOnSuccess(Collection onSuccess) { this.onSuccess = onSuccess; return this; @@ -104,13 +124,19 @@ public boolean equals(Object o) { onDurationWarningThresholdExceeded, that.onDurationWarningThresholdExceeded) && Objects.equals(onFailure, that.onFailure) && Objects.equals(onStart, that.onStart) + && Objects.equals(onStreamingBacklogExceeded, that.onStreamingBacklogExceeded) && Objects.equals(onSuccess, that.onSuccess); } @Override public int hashCode() { return Objects.hash( - noAlertForSkippedRuns, onDurationWarningThresholdExceeded, onFailure, onStart, onSuccess); + noAlertForSkippedRuns, + onDurationWarningThresholdExceeded, + onFailure, + onStart, + onStreamingBacklogExceeded, + onSuccess); } @Override @@ -120,6 +146,7 @@ public String toString() { .add("onDurationWarningThresholdExceeded", onDurationWarningThresholdExceeded) .add("onFailure", onFailure) .add("onStart", onStart) + .add("onStreamingBacklogExceeded", onStreamingBacklogExceeded) .add("onSuccess", onSuccess) .toString(); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/TriggerSettings.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/TriggerSettings.java index 069ac1c68..7ee1fe4b1 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/TriggerSettings.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/TriggerSettings.java @@ -17,6 +17,10 @@ public class TriggerSettings { @JsonProperty("pause_status") private 
PauseStatus pauseStatus; + /** Periodic trigger settings. */ + @JsonProperty("periodic") + private PeriodicTriggerConfiguration periodic; + /** Old table trigger settings name. Deprecated in favor of `table_update`. */ @JsonProperty("table") private TableUpdateTriggerConfiguration table; @@ -43,6 +47,15 @@ public PauseStatus getPauseStatus() { return pauseStatus; } + public TriggerSettings setPeriodic(PeriodicTriggerConfiguration periodic) { + this.periodic = periodic; + return this; + } + + public PeriodicTriggerConfiguration getPeriodic() { + return periodic; + } + public TriggerSettings setTable(TableUpdateTriggerConfiguration table) { this.table = table; return this; @@ -68,13 +81,14 @@ public boolean equals(Object o) { TriggerSettings that = (TriggerSettings) o; return Objects.equals(fileArrival, that.fileArrival) && Objects.equals(pauseStatus, that.pauseStatus) + && Objects.equals(periodic, that.periodic) && Objects.equals(table, that.table) && Objects.equals(tableUpdate, that.tableUpdate); } @Override public int hashCode() { - return Objects.hash(fileArrival, pauseStatus, table, tableUpdate); + return Objects.hash(fileArrival, pauseStatus, periodic, table, tableUpdate); } @Override @@ -82,6 +96,7 @@ public String toString() { return new ToStringer(TriggerSettings.class) .add("fileArrival", fileArrival) .add("pauseStatus", pauseStatus) + .add("periodic", periodic) .add("table", table) .add("tableUpdate", tableUpdate) .toString(); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/WebhookNotifications.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/WebhookNotifications.java index e47a3b422..72d92748d 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/WebhookNotifications.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/jobs/WebhookNotifications.java @@ -32,6 +32,17 @@ public class WebhookNotifications { @JsonProperty("on_start") private Collection onStart; + 
/** + * An optional list of system notification IDs to call when any streaming backlog thresholds are + * exceeded for any stream. Streaming backlog thresholds can be set in the `health` field using + * the following metrics: `STREAMING_BACKLOG_BYTES`, `STREAMING_BACKLOG_RECORDS`, + * `STREAMING_BACKLOG_SECONDS`, or `STREAMING_BACKLOG_FILES`. Alerting is based on the 10-minute + * average of these metrics. If the issue persists, notifications are resent every 30 minutes. A + * maximum of 3 destinations can be specified for the `on_streaming_backlog_exceeded` property. + */ + @JsonProperty("on_streaming_backlog_exceeded") + private Collection onStreamingBacklogExceeded; + /** * An optional list of system notification IDs to call when the run completes successfully. A * maximum of 3 destinations can be specified for the `on_success` property. @@ -67,6 +78,16 @@ public Collection getOnStart() { return onStart; } + public WebhookNotifications setOnStreamingBacklogExceeded( + Collection onStreamingBacklogExceeded) { + this.onStreamingBacklogExceeded = onStreamingBacklogExceeded; + return this; + } + + public Collection getOnStreamingBacklogExceeded() { + return onStreamingBacklogExceeded; + } + public WebhookNotifications setOnSuccess(Collection onSuccess) { this.onSuccess = onSuccess; return this; @@ -85,12 +106,18 @@ public boolean equals(Object o) { onDurationWarningThresholdExceeded, that.onDurationWarningThresholdExceeded) && Objects.equals(onFailure, that.onFailure) && Objects.equals(onStart, that.onStart) + && Objects.equals(onStreamingBacklogExceeded, that.onStreamingBacklogExceeded) && Objects.equals(onSuccess, that.onSuccess); } @Override public int hashCode() { - return Objects.hash(onDurationWarningThresholdExceeded, onFailure, onStart, onSuccess); + return Objects.hash( + onDurationWarningThresholdExceeded, + onFailure, + onStart, + onStreamingBacklogExceeded, + onSuccess); } @Override @@ -99,6 +126,7 @@ public String toString() { 
.add("onDurationWarningThresholdExceeded", onDurationWarningThresholdExceeded) .add("onFailure", onFailure) .add("onStart", onStart) + .add("onStreamingBacklogExceeded", onStreamingBacklogExceeded) .add("onSuccess", onSuccess) .toString(); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/Listing.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/Listing.java index f73465e34..a70a16775 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/Listing.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/Listing.java @@ -17,6 +17,13 @@ public class Listing { @JsonProperty("id") private String id; + /** + * we can not use just ProviderListingSummary since we already have same name on entity side of + * the state + */ + @JsonProperty("provider_summary") + private ProviderListingSummaryInfo providerSummary; + /** Next Number: 26 */ @JsonProperty("summary") private ListingSummary summary; @@ -39,6 +46,15 @@ public String getId() { return id; } + public Listing setProviderSummary(ProviderListingSummaryInfo providerSummary) { + this.providerSummary = providerSummary; + return this; + } + + public ProviderListingSummaryInfo getProviderSummary() { + return providerSummary; + } + public Listing setSummary(ListingSummary summary) { this.summary = summary; return this; @@ -55,12 +71,13 @@ public boolean equals(Object o) { Listing that = (Listing) o; return Objects.equals(detail, that.detail) && Objects.equals(id, that.id) + && Objects.equals(providerSummary, that.providerSummary) && Objects.equals(summary, that.summary); } @Override public int hashCode() { - return Objects.hash(detail, id, summary); + return Objects.hash(detail, id, providerSummary, summary); } @Override @@ -68,6 +85,7 @@ public String toString() { return new ToStringer(Listing.class) .add("detail", detail) .add("id", id) + .add("providerSummary", providerSummary) .add("summary", 
summary) .toString(); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconFile.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconFile.java new file mode 100755 index 000000000..f5c7bb2fa --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconFile.java @@ -0,0 +1,74 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.marketplace; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +@Generated +public class ProviderIconFile { + /** */ + @JsonProperty("icon_file_id") + private String iconFileId; + + /** */ + @JsonProperty("icon_file_path") + private String iconFilePath; + + /** */ + @JsonProperty("icon_type") + private ProviderIconType iconType; + + public ProviderIconFile setIconFileId(String iconFileId) { + this.iconFileId = iconFileId; + return this; + } + + public String getIconFileId() { + return iconFileId; + } + + public ProviderIconFile setIconFilePath(String iconFilePath) { + this.iconFilePath = iconFilePath; + return this; + } + + public String getIconFilePath() { + return iconFilePath; + } + + public ProviderIconFile setIconType(ProviderIconType iconType) { + this.iconType = iconType; + return this; + } + + public ProviderIconType getIconType() { + return iconType; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + ProviderIconFile that = (ProviderIconFile) o; + return Objects.equals(iconFileId, that.iconFileId) + && Objects.equals(iconFilePath, that.iconFilePath) + && Objects.equals(iconType, that.iconType); + } + + @Override + public int hashCode() { + return Objects.hash(iconFileId, iconFilePath, iconType); + } + + @Override + 
public String toString() { + return new ToStringer(ProviderIconFile.class) + .add("iconFileId", iconFileId) + .add("iconFilePath", iconFilePath) + .add("iconType", iconType) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconType.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconType.java new file mode 100755 index 000000000..3ba80cc99 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderIconType.java @@ -0,0 +1,12 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.marketplace; + +import com.databricks.sdk.support.Generated; + +@Generated +public enum ProviderIconType { + DARK, + PRIMARY, + PROVIDER_ICON_TYPE_UNSPECIFIED, +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderListingSummaryInfo.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderListingSummaryInfo.java new file mode 100755 index 000000000..76fee974e --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/marketplace/ProviderListingSummaryInfo.java @@ -0,0 +1,79 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. 
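The new marketplace classes above (`ProviderIconFile`, `ProviderIconType`) follow the SDK's uniform generated value-object pattern: Jackson-annotated fields, fluent setters that return `this`, and `equals`/`hashCode` derived from every field via `java.util.Objects`. A minimal JDK-only sketch of that pattern; the `IconFile` class below is illustrative, not the SDK's own code:

```java
import java.util.Objects;

// Illustrative stand-in for a generated value class such as ProviderIconFile:
// fluent setters return `this` so calls can be chained, and equals/hashCode
// are computed over all fields with java.util.Objects helpers.
public class IconFile {
  private String iconFileId;
  private String iconFilePath;

  public IconFile setIconFileId(String iconFileId) {
    this.iconFileId = iconFileId;
    return this;
  }

  public String getIconFileId() {
    return iconFileId;
  }

  public IconFile setIconFilePath(String iconFilePath) {
    this.iconFilePath = iconFilePath;
    return this;
  }

  public String getIconFilePath() {
    return iconFilePath;
  }

  @Override
  public boolean equals(Object o) {
    if (this == o) return true;
    if (o == null || getClass() != o.getClass()) return false;
    IconFile that = (IconFile) o;
    return Objects.equals(iconFileId, that.iconFileId)
        && Objects.equals(iconFilePath, that.iconFilePath);
  }

  @Override
  public int hashCode() {
    return Objects.hash(iconFileId, iconFilePath);
  }

  public static void main(String[] args) {
    IconFile a = new IconFile().setIconFileId("id-1").setIconFilePath("/icons/dark.png");
    IconFile b = new IconFile().setIconFileId("id-1").setIconFilePath("/icons/dark.png");
    System.out.println(a.equals(b) && a.hashCode() == b.hashCode()); // prints "true"
  }
}
```

Because every field participates in `equals` and `hashCode`, two instances built through the same chain of setters compare equal, which is what makes the generated classes safe to use as map keys or in test assertions.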
+ +package com.databricks.sdk.service.marketplace; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Collection; +import java.util.Objects; + +/** + * we can not use just ProviderListingSummary since we already have same name on entity side of the + * state + */ +@Generated +public class ProviderListingSummaryInfo { + /** */ + @JsonProperty("description") + private String description; + + /** */ + @JsonProperty("icon_files") + private Collection iconFiles; + + /** */ + @JsonProperty("name") + private String name; + + public ProviderListingSummaryInfo setDescription(String description) { + this.description = description; + return this; + } + + public String getDescription() { + return description; + } + + public ProviderListingSummaryInfo setIconFiles(Collection iconFiles) { + this.iconFiles = iconFiles; + return this; + } + + public Collection getIconFiles() { + return iconFiles; + } + + public ProviderListingSummaryInfo setName(String name) { + this.name = name; + return this; + } + + public String getName() { + return name; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + ProviderListingSummaryInfo that = (ProviderListingSummaryInfo) o; + return Objects.equals(description, that.description) + && Objects.equals(iconFiles, that.iconFiles) + && Objects.equals(name, that.name); + } + + @Override + public int hashCode() { + return Objects.hash(description, iconFiles, name); + } + + @Override + public String toString() { + return new ToStringer(ProviderListingSummaryInfo.class) + .add("description", description) + .add("iconFiles", iconFiles) + .add("name", name) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/pipelines/PipelineLibrary.java 
b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/pipelines/PipelineLibrary.java index b01a0851a..ee047e91a 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/pipelines/PipelineLibrary.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/pipelines/PipelineLibrary.java @@ -21,7 +21,7 @@ public class PipelineLibrary { @JsonProperty("maven") private com.databricks.sdk.service.compute.MavenLibrary maven; - /** The path to a notebook that defines a pipeline and is stored in the workspace. */ + /** The path to a notebook that defines a pipeline and is stored in the Databricks workspace. */ @JsonProperty("notebook") private NotebookLibrary notebook; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/App.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/App.java index 63c23e0bc..80245e7a3 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/App.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/App.java @@ -36,6 +36,14 @@ public class App { @JsonProperty("pending_deployment") private AppDeployment pendingDeployment; + /** */ + @JsonProperty("service_principal_id") + private Long servicePrincipalId; + + /** */ + @JsonProperty("service_principal_name") + private String servicePrincipalName; + /** */ @JsonProperty("status") private AppStatus status; @@ -106,6 +114,24 @@ public AppDeployment getPendingDeployment() { return pendingDeployment; } + public App setServicePrincipalId(Long servicePrincipalId) { + this.servicePrincipalId = servicePrincipalId; + return this; + } + + public Long getServicePrincipalId() { + return servicePrincipalId; + } + + public App setServicePrincipalName(String servicePrincipalName) { + this.servicePrincipalName = servicePrincipalName; + return this; + } + + public String getServicePrincipalName() { + return servicePrincipalName; + } + public App setStatus(AppStatus status) { 
this.status = status; return this; @@ -153,6 +179,8 @@ public boolean equals(Object o) { && Objects.equals(description, that.description) && Objects.equals(name, that.name) && Objects.equals(pendingDeployment, that.pendingDeployment) + && Objects.equals(servicePrincipalId, that.servicePrincipalId) + && Objects.equals(servicePrincipalName, that.servicePrincipalName) && Objects.equals(status, that.status) && Objects.equals(updateTime, that.updateTime) && Objects.equals(updater, that.updater) @@ -168,6 +196,8 @@ public int hashCode() { description, name, pendingDeployment, + servicePrincipalId, + servicePrincipalName, status, updateTime, updater, @@ -183,6 +213,8 @@ public String toString() { .add("description", description) .add("name", name) .add("pendingDeployment", pendingDeployment) + .add("servicePrincipalId", servicePrincipalId) + .add("servicePrincipalName", servicePrincipalName) .add("status", status) .add("updateTime", updateTime) .add("updater", updater) diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsAPI.java index 175d0ee43..cdd5949d6 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsAPI.java @@ -264,6 +264,19 @@ public Iterable listDeployments(ListAppDeploymentsRequest request }); } + public AppDeployment start(String name) { + return start(new StartAppRequest().setName(name)); + } + + /** + * Start an app. + * + *
<p>
Start the last active deployment of the app in the workspace. + */ + public AppDeployment start(StartAppRequest request) { + return impl.start(request); + } + public void stop(String name) { stop(new StopAppRequest().setName(name)); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsImpl.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsImpl.java index 99f7f1bc5..e44f5e4c9 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsImpl.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsImpl.java @@ -84,6 +84,15 @@ public ListAppDeploymentsResponse listDeployments(ListAppDeploymentsRequest requ return apiClient.GET(path, request, ListAppDeploymentsResponse.class, headers); } + @Override + public AppDeployment start(StartAppRequest request) { + String path = String.format("/api/2.0/preview/apps/%s/start", request.getName()); + Map headers = new HashMap<>(); + headers.put("Accept", "application/json"); + headers.put("Content-Type", "application/json"); + return apiClient.POST(path, request, AppDeployment.class, headers); + } + @Override public void stop(StopAppRequest request) { String path = String.format("/api/2.0/preview/apps/%s/stop", request.getName()); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsService.java index 64d55a269..bd4ae037a 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AppsService.java @@ -69,6 +69,13 @@ public interface AppsService { */ ListAppDeploymentsResponse listDeployments(ListAppDeploymentsRequest listAppDeploymentsRequest); + /** + * Start an app. + * + *
<p>
Start the last active deployment of the app in the workspace. + */ + AppDeployment start(StartAppRequest startAppRequest); + /** * Stop an app. * diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AutoCaptureConfigInput.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AutoCaptureConfigInput.java index f06839ab6..e34add98e 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AutoCaptureConfigInput.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AutoCaptureConfigInput.java @@ -11,28 +11,25 @@ public class AutoCaptureConfigInput { /** * The name of the catalog in Unity Catalog. NOTE: On update, you cannot change the catalog name - * if it was already set. + * if the inference table is already enabled. */ @JsonProperty("catalog_name") private String catalogName; - /** - * If inference tables are enabled or not. NOTE: If you have already disabled payload logging - * once, you cannot enable again. - */ + /** Indicates whether the inference table is enabled. */ @JsonProperty("enabled") private Boolean enabled; /** * The name of the schema in Unity Catalog. NOTE: On update, you cannot change the schema name if - * it was already set. + * the inference table is already enabled. */ @JsonProperty("schema_name") private String schemaName; /** * The prefix of the table in Unity Catalog. NOTE: On update, you cannot change the prefix name if - * it was already set. + * the inference table is already enabled. 
*/ @JsonProperty("table_name_prefix") private String tableNamePrefix; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AutoCaptureConfigOutput.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AutoCaptureConfigOutput.java index c9b0a3f06..34ff0f488 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AutoCaptureConfigOutput.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/AutoCaptureConfigOutput.java @@ -13,7 +13,7 @@ public class AutoCaptureConfigOutput { @JsonProperty("catalog_name") private String catalogName; - /** If inference tables are enabled or not. */ + /** Indicates whether the inference table is enabled. */ @JsonProperty("enabled") private Boolean enabled; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneAPI.java new file mode 100755 index 000000000..05aef2cb6 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneAPI.java @@ -0,0 +1,41 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. +package com.databricks.sdk.service.serving; + +import com.databricks.sdk.core.ApiClient; +import com.databricks.sdk.support.Generated; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +/** + * Serving endpoints DataPlane provides a set of operations to interact with data plane endpoints + * for Serving endpoints service. 
+ */ +@Generated +public class ServingEndpointsDataPlaneAPI { + private static final Logger LOG = LoggerFactory.getLogger(ServingEndpointsDataPlaneAPI.class); + + private final ServingEndpointsDataPlaneService impl; + + /** Regular-use constructor */ + public ServingEndpointsDataPlaneAPI(ApiClient apiClient) { + impl = new ServingEndpointsDataPlaneImpl(apiClient); + } + + /** Constructor for mocks */ + public ServingEndpointsDataPlaneAPI(ServingEndpointsDataPlaneService mock) { + impl = mock; + } + + public QueryEndpointResponse query(String name) { + return query(new QueryEndpointInput().setName(name)); + } + + /** Query a serving endpoint. */ + public QueryEndpointResponse query(QueryEndpointInput request) { + return impl.query(request); + } + + public ServingEndpointsDataPlaneService impl() { + return impl; + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneImpl.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneImpl.java new file mode 100755 index 000000000..b1bf77ab9 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneImpl.java @@ -0,0 +1,26 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. 
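`ServingEndpointsDataPlaneAPI` pairs a convenience overload, `query(String name)`, with the full-request `query(QueryEndpointInput)`, and the package-local Impl posts to `/serving-endpoints/{name}/invocations` with JSON `Accept` and `Content-Type` headers. A JDK-only sketch of that path and header assembly, with a made-up endpoint name and no real `ApiClient` involved:

```java
import java.util.HashMap;
import java.util.Map;

public class DataPlanePathDemo {
  // Mirrors how the generated Impl assembles a data plane request:
  // the endpoint name is interpolated into the path with String.format.
  static String queryPath(String endpointName) {
    return String.format("/serving-endpoints/%s/invocations", endpointName);
  }

  // Both JSON headers are set before the POST is delegated to the API client.
  static Map<String, String> jsonHeaders() {
    Map<String, String> headers = new HashMap<>();
    headers.put("Accept", "application/json");
    headers.put("Content-Type", "application/json");
    return headers;
  }

  public static void main(String[] args) {
    // "my-endpoint" is a hypothetical serving endpoint name for illustration.
    System.out.println(queryPath("my-endpoint")); // prints "/serving-endpoints/my-endpoint/invocations"
    System.out.println(jsonHeaders().get("Accept")); // prints "application/json"
  }
}
```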
+package com.databricks.sdk.service.serving; + +import com.databricks.sdk.core.ApiClient; +import com.databricks.sdk.support.Generated; +import java.util.HashMap; +import java.util.Map; + +/** Package-local implementation of ServingEndpointsDataPlane */ +@Generated +class ServingEndpointsDataPlaneImpl implements ServingEndpointsDataPlaneService { + private final ApiClient apiClient; + + public ServingEndpointsDataPlaneImpl(ApiClient apiClient) { + this.apiClient = apiClient; + } + + @Override + public QueryEndpointResponse query(QueryEndpointInput request) { + String path = String.format("/serving-endpoints/%s/invocations", request.getName()); + Map headers = new HashMap<>(); + headers.put("Accept", "application/json"); + headers.put("Content-Type", "application/json"); + return apiClient.POST(path, request, QueryEndpointResponse.class, headers); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneService.java new file mode 100755 index 000000000..def58d4a2 --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/ServingEndpointsDataPlaneService.java @@ -0,0 +1,18 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. +package com.databricks.sdk.service.serving; + +import com.databricks.sdk.support.Generated; + +/** + * Serving endpoints DataPlane provides a set of operations to interact with data plane endpoints + * for Serving endpoints service. + * + *
<p>
This is the high-level interface, that contains generated methods. + * + *
<p>
Evolving: this interface is under development. Method signatures may change. + */ +@Generated +public interface ServingEndpointsDataPlaneService { + /** Query a serving endpoint. */ + QueryEndpointResponse query(QueryEndpointInput queryEndpointInput); +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/StartAppRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/StartAppRequest.java new file mode 100755 index 000000000..4386245bc --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/serving/StartAppRequest.java @@ -0,0 +1,40 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.serving; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import java.util.Objects; + +@Generated +public class StartAppRequest { + /** The name of the app. */ + private String name; + + public StartAppRequest setName(String name) { + this.name = name; + return this; + } + + public String getName() { + return name; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + StartAppRequest that = (StartAppRequest) o; + return Objects.equals(name, that.name); + } + + @Override + public int hashCode() { + return Objects.hash(name); + } + + @Override + public String toString() { + return new ToStringer(StartAppRequest.class).add("name", name).toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/settings/ComplianceStandard.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/settings/ComplianceStandard.java index ba776ddf0..0e3b7ecfa 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/settings/ComplianceStandard.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/settings/ComplianceStandard.java @@ -8,6 +8,7 @@ 
@Generated public enum ComplianceStandard { COMPLIANCE_STANDARD_UNSPECIFIED, + CYBER_ESSENTIAL_PLUS, FEDRAMP_HIGH, FEDRAMP_IL5, FEDRAMP_MODERATE, diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sharing/Privilege.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sharing/Privilege.java index e413a6d0e..87d99f5c9 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sharing/Privilege.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sharing/Privilege.java @@ -38,7 +38,6 @@ public enum Privilege { REFRESH, SELECT, SET_SHARE_PERMISSION, - SINGLE_USER_ACCESS, USAGE, USE_CATALOG, USE_CONNECTION, diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertQuery.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertQuery.java index 34800625e..f0a02211e 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertQuery.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertQuery.java @@ -16,7 +16,7 @@ public class AlertQuery { /** * Data source ID maps to the ID of the data source used by the resource and is distinct from the - * warehouse ID. [Learn more]. + * warehouse ID. [Learn more] * *
<p>
[Learn more]: https://docs.databricks.com/api/workspace/datasources/list */ diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertsAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertsAPI.java index b78b442ce..077817eb2 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertsAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertsAPI.java @@ -11,6 +11,11 @@ * object that periodically runs a query, evaluates a condition of its result, and notifies one or * more users and/or notification destinations if the condition was met. Alerts can be scheduled * using the `sql_task` type of the Jobs API, e.g. :method:jobs/create. + * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ @Generated public class AlertsAPI { @@ -38,6 +43,11 @@ public Alert create(String name, AlertOptions options, String queryId) { *
<p>
Creates an alert. An alert is a Databricks SQL object that periodically runs a query, * evaluates a condition of its result, and notifies users or notification destinations if the * condition was met. + * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Alert create(CreateAlert request) { return impl.create(request); @@ -50,8 +60,13 @@ public void delete(String alertId) { /** * Delete an alert. * - *
<p>
Deletes an alert. Deleted alerts are no longer accessible and cannot be restored. **Note:** + *
<p>
Deletes an alert. Deleted alerts are no longer accessible and cannot be restored. **Note**: * Unlike queries and dashboards, alerts cannot be moved to the trash. + * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public void delete(DeleteAlertRequest request) { impl.delete(request); @@ -65,6 +80,11 @@ public Alert get(String alertId) { * Get an alert. * *
<p>
Gets an alert. + * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Alert get(GetAlertRequest request) { return impl.get(request); @@ -74,6 +94,11 @@ public Alert get(GetAlertRequest request) { * Get alerts. * *
<p>
Gets a list of alerts. + * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Iterable list() { return impl.list(); @@ -88,6 +113,11 @@ public void update(String alertId, String name, AlertOptions options, String que * Update an alert. * *
<p>
Updates an alert. + * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public void update(EditAlert request) { impl.update(request); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertsService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertsService.java index 5d2517076..1ca26ecda 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertsService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/AlertsService.java @@ -10,6 +10,11 @@ * more users and/or notification destinations if the condition was met. Alerts can be scheduled * using the `sql_task` type of the Jobs API, e.g. :method:jobs/create. * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources + * *
<p>
This is the high-level interface, that contains generated methods. * *
<p>
Evolving: this interface is under development. Method signatures may change. @@ -22,14 +27,24 @@ public interface AlertsService { *
<p>
Creates an alert. An alert is a Databricks SQL object that periodically runs a query, * evaluates a condition of its result, and notifies users or notification destinations if the * condition was met. + * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ Alert create(CreateAlert createAlert); /** * Delete an alert. * - *
<p>
Deletes an alert. Deleted alerts are no longer accessible and cannot be restored. **Note:** + *
<p>
Deletes an alert. Deleted alerts are no longer accessible and cannot be restored. **Note**: * Unlike queries and dashboards, alerts cannot be moved to the trash. + * + *
<p>
**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *
<p>
[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ void delete(DeleteAlertRequest deleteAlertRequest); @@ -37,6 +52,11 @@ public interface AlertsService { * Get an alert. * *
<p>
Gets an alert. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ Alert get(GetAlertRequest getAlertRequest); @@ -44,6 +64,11 @@ public interface AlertsService { * Get alerts. * *

Gets a list of alerts. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ Collection list(); @@ -51,6 +76,11 @@ public interface AlertsService { * Update an alert. * *

Updates an alert. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ void update(EditAlert editAlert); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DashboardsAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DashboardsAPI.java index 26e3e7891..bbea13537 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DashboardsAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DashboardsAPI.java @@ -72,8 +72,8 @@ public Dashboard get(GetDashboardRequest request) { * *

Fetch a paginated list of dashboard objects. * - *

### **Warning: Calling this API concurrently 10 or more times could result in throttling, - * service degradation, or a temporary ban.** + *

**Warning**: Calling this API concurrently 10 or more times could result in throttling, + * service degradation, or a temporary ban. */ public Iterable list(ListDashboardsRequest request) { request.setPage(1L); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DashboardsService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DashboardsService.java index 6752f8d77..8608d16b0 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DashboardsService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DashboardsService.java @@ -40,8 +40,8 @@ public interface DashboardsService { * *

Fetch a paginated list of dashboard objects. * - *

### **Warning: Calling this API concurrently 10 or more times could result in throttling, - * service degradation, or a temporary ban.** + *

**Warning**: Calling this API concurrently 10 or more times could result in throttling, + * service degradation, or a temporary ban. */ ListResponse list(ListDashboardsRequest listDashboardsRequest); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSource.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSource.java index d4355ae9c..3dfc9f064 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSource.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSource.java @@ -12,7 +12,7 @@ public class DataSource { /** * Data source ID maps to the ID of the data source used by the resource and is distinct from the - * warehouse ID. [Learn more]. + * warehouse ID. [Learn more] * *

[Learn more]: https://docs.databricks.com/api/workspace/datasources/list */ diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSourcesAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSourcesAPI.java index 70108d2db..e8c3909d1 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSourcesAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSourcesAPI.java @@ -15,6 +15,11 @@ *

This API does not support searches. It returns the full list of SQL warehouses in your * workspace. We advise you to use any text editor, REST client, or `grep` to search the response * from this API for the name of your SQL warehouse as it appears in Databricks SQL. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ @Generated public class DataSourcesAPI { @@ -38,6 +43,11 @@ public DataSourcesAPI(DataSourcesService mock) { *

Retrieves a full list of SQL warehouses available in this workspace. All fields that appear * in this API response are enumerated for clarity. However, you need only a SQL warehouse's `id` * to create new queries against it. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Iterable list() { return impl.list(); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSourcesService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSourcesService.java index f75d45f8c..46f020f1c 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSourcesService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DataSourcesService.java @@ -14,6 +14,11 @@ * workspace. We advise you to use any text editor, REST client, or `grep` to search the response * from this API for the name of your SQL warehouse as it appears in Databricks SQL. * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources + * *

This is the high-level interface that contains generated methods. * *

Evolving: this interface is under development. Method signatures may change. @@ -26,6 +31,11 @@ public interface DataSourcesService { *

Retrieves a full list of SQL warehouses available in this workspace. All fields that appear * in this API response are enumerated for clarity. However, you need only a SQL warehouse's `id` * to create new queries against it. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ Collection list(); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DbsqlPermissionsAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DbsqlPermissionsAPI.java index ce00a1f0b..2f25572bf 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DbsqlPermissionsAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DbsqlPermissionsAPI.java @@ -19,6 +19,11 @@ * *

- `CAN_MANAGE`: Allows all actions: read, run, edit, delete, modify permissions (superset of * `CAN_RUN`) + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ @Generated public class DbsqlPermissionsAPI { @@ -44,6 +49,11 @@ public GetResponse get(ObjectTypePlural objectType, String objectId) { * Get object ACL. * *

Gets a JSON representation of the access control list (ACL) for a specified object. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public GetResponse get(GetDbsqlPermissionRequest request) { return impl.get(request); @@ -58,6 +68,11 @@ public SetResponse set(ObjectTypePlural objectType, String objectId) { * *

Sets the access control list (ACL) for a specified object. This operation will completely * rewrite the ACL. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public SetResponse set(SetRequest request) { return impl.set(request); @@ -74,6 +89,11 @@ public Success transferOwnership( * *

Transfers ownership of a dashboard, query, or alert to an active user. Requires an admin API * key. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Success transferOwnership(TransferOwnershipRequest request) { return impl.transferOwnership(request); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DbsqlPermissionsService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DbsqlPermissionsService.java index 1ed4b61a2..9e6f1ea69 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DbsqlPermissionsService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/DbsqlPermissionsService.java @@ -17,6 +17,11 @@ *

- `CAN_MANAGE`: Allows all actions: read, run, edit, delete, modify permissions (superset of * `CAN_RUN`) * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources + * *

This is the high-level interface that contains generated methods. * *

Evolving: this interface is under development. Method signatures may change. @@ -27,6 +32,11 @@ public interface DbsqlPermissionsService { * Get object ACL. * *

Gets a JSON representation of the access control list (ACL) for a specified object. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ GetResponse get(GetDbsqlPermissionRequest getDbsqlPermissionRequest); @@ -35,6 +45,11 @@ public interface DbsqlPermissionsService { * *

Sets the access control list (ACL) for a specified object. This operation will completely * rewrite the ACL. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ SetResponse set(SetRequest setRequest); @@ -43,6 +58,11 @@ public interface DbsqlPermissionsService { * *

Transfers ownership of a dashboard, query, or alert to an active user. Requires an admin API * key. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ Success transferOwnership(TransferOwnershipRequest transferOwnershipRequest); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/ExecuteStatementRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/ExecuteStatementRequest.java index 12577b7a1..1e5d98553 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/ExecuteStatementRequest.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/ExecuteStatementRequest.java @@ -176,8 +176,9 @@ public class ExecuteStatementRequest { private String waitTimeout; /** - * Warehouse upon which to execute a statement. See also [What are SQL - * warehouses?](/sql/admin/warehouse-type.html) + * Warehouse upon which to execute a statement. See also [What are SQL warehouses?] + * + *

[What are SQL warehouses?]: https://docs.databricks.com/sql/admin/warehouse-type.html */ @JsonProperty("warehouse_id") private String warehouseId; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueriesAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueriesAPI.java index 20f0e15e2..a3f468319 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueriesAPI.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueriesAPI.java @@ -11,6 +11,11 @@ * These endpoints are used for CRUD operations on query definitions. Query definitions include the * target SQL warehouse, query text, name, description, tags, parameters, and visualizations. * Queries can be scheduled using the `sql_task` type of the Jobs API, e.g. :method:jobs/create. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ @Generated public class QueriesAPI { @@ -39,6 +44,11 @@ public QueriesAPI(QueriesService mock) { * copy the `data_source_id` from an existing query. * *

**Note**: You cannot add a visualization until you create the query. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Query create(QueryPostContent request) { return impl.create(request); @@ -53,6 +63,11 @@ public void delete(String queryId) { * *

Moves a query to the trash. Trashed queries immediately disappear from searches and list * views, and they cannot be used for alerts. The trash is deleted after 30 days. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public void delete(DeleteQueryRequest request) { impl.delete(request); @@ -67,6 +82,11 @@ public Query get(String queryId) { * *

Retrieve a query object definition along with contextual permissions information about the * currently authenticated user. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Query get(GetQueryRequest request) { return impl.get(request); @@ -77,8 +97,13 @@ public Query get(GetQueryRequest request) { * *

Gets a list of queries. Optionally, this list can be filtered by a search term. * - *

### **Warning: Calling this API concurrently 10 or more times could result in throttling, - * service degradation, or a temporary ban.** + *

**Warning**: Calling this API concurrently 10 or more times could result in throttling, + * service degradation, or a temporary ban. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Iterable list(ListQueriesRequest request) { request.setPage(1L); @@ -105,6 +130,11 @@ public void restore(String queryId) { * *

Restore a query that has been moved to the trash. A restored query appears in list views and * searches. You can use restored queries for alerts. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public void restore(RestoreQueryRequest request) { impl.restore(request); @@ -120,6 +150,11 @@ public Query update(String queryId) { *

Modify this query definition. * *

**Note**: You cannot undo this operation. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ public Query update(QueryEditContent request) { return impl.update(request); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueriesService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueriesService.java index a4ecf429d..5aa07d289 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueriesService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueriesService.java @@ -8,6 +8,11 @@ * target SQL warehouse, query text, name, description, tags, parameters, and visualizations. * Queries can be scheduled using the `sql_task` type of the Jobs API, e.g. :method:jobs/create. * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources + * *

This is the high-level interface that contains generated methods. * *

Evolving: this interface is under development. Method signatures may change. @@ -25,6 +30,11 @@ public interface QueriesService { * copy the `data_source_id` from an existing query. * *

**Note**: You cannot add a visualization until you create the query. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ Query create(QueryPostContent queryPostContent); @@ -33,6 +43,11 @@ public interface QueriesService { * *

Moves a query to the trash. Trashed queries immediately disappear from searches and list * views, and they cannot be used for alerts. The trash is deleted after 30 days. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ void delete(DeleteQueryRequest deleteQueryRequest); @@ -41,6 +56,11 @@ public interface QueriesService { * *

Retrieve a query object definition along with contextual permissions information about the * currently authenticated user. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ Query get(GetQueryRequest getQueryRequest); @@ -49,8 +69,13 @@ public interface QueriesService { * *

Gets a list of queries. Optionally, this list can be filtered by a search term. * - *

### **Warning: Calling this API concurrently 10 or more times could result in throttling, - * service degradation, or a temporary ban.** + *

**Warning**: Calling this API concurrently 10 or more times could result in throttling, + * service degradation, or a temporary ban. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ QueryList list(ListQueriesRequest listQueriesRequest); @@ -59,6 +84,11 @@ public interface QueriesService { * *

Restore a query that has been moved to the trash. A restored query appears in list views and * searches. You can use restored queries for alerts. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ void restore(RestoreQueryRequest restoreQueryRequest); @@ -68,6 +98,11 @@ public interface QueriesService { *

Modify this query definition. * *

**Note**: You cannot undo this operation. + * + *

**Note**: A new version of the Databricks SQL API will soon be available. [Learn more] + * + *

[Learn more]: + * https://docs.databricks.com/en/whats-coming.html#updates-to-the-databricks-sql-api-for-managing-queries-alerts-and-data-sources */ Query update(QueryEditContent queryEditContent); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/Query.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/Query.java index f936b674c..be339a6a4 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/Query.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/Query.java @@ -20,7 +20,7 @@ public class Query { /** * Data source ID maps to the ID of the data source used by the resource and is distinct from the - * warehouse ID. [Learn more]. + * warehouse ID. [Learn more] * *

[Learn more]: https://docs.databricks.com/api/workspace/datasources/list */ diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueryEditContent.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueryEditContent.java index 187aaf93f..8d8fb1701 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueryEditContent.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueryEditContent.java @@ -12,7 +12,7 @@ public class QueryEditContent { /** * Data source ID maps to the ID of the data source used by the resource and is distinct from the - * warehouse ID. [Learn more]. + * warehouse ID. [Learn more] * *

[Learn more]: https://docs.databricks.com/api/workspace/datasources/list */ diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueryPostContent.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueryPostContent.java index 32e959521..fbc1aedcb 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueryPostContent.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/QueryPostContent.java @@ -12,7 +12,7 @@ public class QueryPostContent { /** * Data source ID maps to the ID of the data source used by the resource and is distinct from the - * warehouse ID. [Learn more]. + * warehouse ID. [Learn more] * *

[Learn more]: https://docs.databricks.com/api/workspace/datasources/list */ diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/StatementParameterListItem.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/StatementParameterListItem.java index c0d10a5a0..29dcc8b9d 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/StatementParameterListItem.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/sql/StatementParameterListItem.java @@ -16,8 +16,10 @@ public class StatementParameterListItem { /** * The data type, given as a string. For example: `INT`, `STRING`, `DECIMAL(10,2)`. If no type is * given the type is assumed to be `STRING`. Complex types, such as `ARRAY`, `MAP`, and `STRUCT` - * are not supported. For valid types, refer to the section [Data - * types](/sql/language-manual/functions/cast.html) of the SQL language reference. + * are not supported. For valid types, refer to the section [Data types] of the SQL language + * reference. + * + *

[Data types]: https://docs.databricks.com/sql/language-manual/functions/cast.html */ @JsonProperty("type") private String typeValue; diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexNextPageRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexNextPageRequest.java new file mode 100755 index 000000000..7f7407cde --- /dev/null +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexNextPageRequest.java @@ -0,0 +1,74 @@ +// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +package com.databricks.sdk.service.vectorsearch; + +import com.databricks.sdk.support.Generated; +import com.databricks.sdk.support.ToStringer; +import com.fasterxml.jackson.annotation.JsonProperty; +import java.util.Objects; + +/** Request payload for getting next page of results. */ +@Generated +public class QueryVectorIndexNextPageRequest { + /** Name of the endpoint. */ + @JsonProperty("endpoint_name") + private String endpointName; + + /** Name of the vector index to query. */ + private String indexName; + + /** Page token returned from previous `QueryVectorIndex` or `QueryVectorIndexNextPage` API. 
*/ + @JsonProperty("page_token") + private String pageToken; + + public QueryVectorIndexNextPageRequest setEndpointName(String endpointName) { + this.endpointName = endpointName; + return this; + } + + public String getEndpointName() { + return endpointName; + } + + public QueryVectorIndexNextPageRequest setIndexName(String indexName) { + this.indexName = indexName; + return this; + } + + public String getIndexName() { + return indexName; + } + + public QueryVectorIndexNextPageRequest setPageToken(String pageToken) { + this.pageToken = pageToken; + return this; + } + + public String getPageToken() { + return pageToken; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + QueryVectorIndexNextPageRequest that = (QueryVectorIndexNextPageRequest) o; + return Objects.equals(endpointName, that.endpointName) + && Objects.equals(indexName, that.indexName) + && Objects.equals(pageToken, that.pageToken); + } + + @Override + public int hashCode() { + return Objects.hash(endpointName, indexName, pageToken); + } + + @Override + public String toString() { + return new ToStringer(QueryVectorIndexNextPageRequest.class) + .add("endpointName", endpointName) + .add("indexName", indexName) + .add("pageToken", pageToken) + .toString(); + } +} diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexRequest.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexRequest.java index 245748834..ad2f364fa 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexRequest.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexRequest.java @@ -35,6 +35,10 @@ public class QueryVectorIndexRequest { @JsonProperty("query_text") private String queryText; + /** The query type to use. Choices are `ANN` and `HYBRID`. 
Defaults to `ANN`. */ + @JsonProperty("query_type") + private String queryType; + /** * Query vector. Required for Direct Vector Access Index and Delta Sync Index using self-managed * vectors. @@ -91,6 +95,15 @@ public String getQueryText() { return queryText; } + public QueryVectorIndexRequest setQueryType(String queryType) { + this.queryType = queryType; + return this; + } + + public String getQueryType() { + return queryType; + } + public QueryVectorIndexRequest setQueryVector(Collection queryVector) { this.queryVector = queryVector; return this; @@ -119,6 +132,7 @@ public boolean equals(Object o) { && Objects.equals(indexName, that.indexName) && Objects.equals(numResults, that.numResults) && Objects.equals(queryText, that.queryText) + && Objects.equals(queryType, that.queryType) && Objects.equals(queryVector, that.queryVector) && Objects.equals(scoreThreshold, that.scoreThreshold); } @@ -126,7 +140,14 @@ public boolean equals(Object o) { @Override public int hashCode() { return Objects.hash( - columns, filtersJson, indexName, numResults, queryText, queryVector, scoreThreshold); + columns, + filtersJson, + indexName, + numResults, + queryText, + queryType, + queryVector, + scoreThreshold); } @Override @@ -137,6 +158,7 @@ public String toString() { .add("indexName", indexName) .add("numResults", numResults) .add("queryText", queryText) + .add("queryType", queryType) .add("queryVector", queryVector) .add("scoreThreshold", scoreThreshold) .toString(); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexResponse.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexResponse.java index 64a126c89..c0a809cf8 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexResponse.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/QueryVectorIndexResponse.java @@ -13,6 +13,14 @@ public class 
QueryVectorIndexResponse { @JsonProperty("manifest") private ResultManifest manifest; + /** + * [Optional] Token that can be used in `QueryVectorIndexNextPage` API to get next page of + * results. If more than 1000 results satisfy the query, they are returned in groups of 1000. + * Empty value means no more results. + */ + @JsonProperty("next_page_token") + private String nextPageToken; + /** Data returned in the query result. */ @JsonProperty("result") private ResultData result; @@ -26,6 +34,15 @@ public ResultManifest getManifest() { return manifest; } + public QueryVectorIndexResponse setNextPageToken(String nextPageToken) { + this.nextPageToken = nextPageToken; + return this; + } + + public String getNextPageToken() { + return nextPageToken; + } + public QueryVectorIndexResponse setResult(ResultData result) { this.result = result; return this; @@ -40,18 +57,21 @@ public boolean equals(Object o) { if (this == o) return true; if (o == null || getClass() != o.getClass()) return false; QueryVectorIndexResponse that = (QueryVectorIndexResponse) o; - return Objects.equals(manifest, that.manifest) && Objects.equals(result, that.result); + return Objects.equals(manifest, that.manifest) + && Objects.equals(nextPageToken, that.nextPageToken) + && Objects.equals(result, that.result); } @Override public int hashCode() { - return Objects.hash(manifest, result); + return Objects.hash(manifest, nextPageToken, result); } @Override public String toString() { return new ToStringer(QueryVectorIndexResponse.class) .add("manifest", manifest) + .add("nextPageToken", nextPageToken) .add("result", result) .toString(); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesAPI.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesAPI.java index 227088e6f..0217d992f 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesAPI.java +++ 
b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesAPI.java @@ -130,6 +130,20 @@ public QueryVectorIndexResponse queryIndex(QueryVectorIndexRequest request) { return impl.queryIndex(request); } + public QueryVectorIndexResponse queryNextPage(String indexName) { + return queryNextPage(new QueryVectorIndexNextPageRequest().setIndexName(indexName)); + } + + /** + * Query next page. + * + *

Use `next_page_token` returned from previous `QueryVectorIndex` or + * `QueryVectorIndexNextPage` request to fetch next page of results. + */ + public QueryVectorIndexResponse queryNextPage(QueryVectorIndexNextPageRequest request) { + return impl.queryNextPage(request); + } + public ScanVectorIndexResponse scanIndex(String indexName) { return scanIndex(new ScanVectorIndexRequest().setIndexName(indexName)); } diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesImpl.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesImpl.java index 5daa314b2..15429adcb 100755 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesImpl.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesImpl.java @@ -66,6 +66,16 @@ public QueryVectorIndexResponse queryIndex(QueryVectorIndexRequest request) { return apiClient.POST(path, request, QueryVectorIndexResponse.class, headers); } + @Override + public QueryVectorIndexResponse queryNextPage(QueryVectorIndexNextPageRequest request) { + String path = + String.format("/api/2.0/vector-search/indexes/%s/query-next-page", request.getIndexName()); + Map headers = new HashMap<>(); + headers.put("Accept", "application/json"); + headers.put("Content-Type", "application/json"); + return apiClient.POST(path, request, QueryVectorIndexResponse.class, headers); + } + @Override public ScanVectorIndexResponse scanIndex(ScanVectorIndexRequest request) { String path = String.format("/api/2.0/vector-search/indexes/%s/scan", request.getIndexName()); diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesService.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesService.java index c71451c97..c1f1110fe 100755 --- 
a/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesService.java +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/service/vectorsearch/VectorSearchIndexesService.java @@ -62,6 +62,15 @@ DeleteDataVectorIndexResponse deleteDataVectorIndex( */ QueryVectorIndexResponse queryIndex(QueryVectorIndexRequest queryVectorIndexRequest); + /** + * Query next page. + * + *

Use `next_page_token` returned from previous `QueryVectorIndex` or + * `QueryVectorIndexNextPage` request to fetch next page of results. + */ + QueryVectorIndexResponse queryNextPage( + QueryVectorIndexNextPageRequest queryVectorIndexNextPageRequest); + /** * Scan an index. * diff --git a/examples/docs/pom.xml b/examples/docs/pom.xml index f15db3c7f..c452bd95c 100644 --- a/examples/docs/pom.xml +++ b/examples/docs/pom.xml @@ -24,7 +24,7 @@ com.databricks databricks-sdk-java - 0.26.0 + 0.27.0 diff --git a/examples/spring-boot-oauth-u2m-demo/pom.xml b/examples/spring-boot-oauth-u2m-demo/pom.xml index 2d8ceab3c..e95055e70 100644 --- a/examples/spring-boot-oauth-u2m-demo/pom.xml +++ b/examples/spring-boot-oauth-u2m-demo/pom.xml @@ -37,7 +37,7 @@ com.databricks databricks-sdk-java - 0.26.0 + 0.27.0 com.fasterxml.jackson.datatype diff --git a/pom.xml b/pom.xml index 9d05c6c21..49e82ddec 100644 --- a/pom.xml +++ b/pom.xml @@ -4,7 +4,7 @@ 4.0.0 com.databricks databricks-sdk-parent - 0.26.0 + 0.27.0 pom Databricks SDK for Java The Databricks SDK for Java includes functionality to accelerate development with Java for diff --git a/shaded/pom.xml b/shaded/pom.xml index 4e0315675..6684f68c5 100644 --- a/shaded/pom.xml +++ b/shaded/pom.xml @@ -4,7 +4,7 @@ 4.0.0 - 0.26.0 + 0.27.0 com.databricks
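The new `queryType` request field and the `queryNextPage()` method added by this patch work together to page through large vector search result sets. The following is a minimal sketch, not part of the patch: the index name and columns are hypothetical placeholders, and it assumes a `WorkspaceClient` that can authenticate against a live workspace.

```java
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.service.vectorsearch.QueryVectorIndexNextPageRequest;
import com.databricks.sdk.service.vectorsearch.QueryVectorIndexRequest;
import com.databricks.sdk.service.vectorsearch.QueryVectorIndexResponse;
import java.util.Arrays;

public class VectorSearchPagingExample {
  public static void main(String[] args) {
    // Reads authentication from the environment (e.g. DATABRICKS_HOST/TOKEN).
    WorkspaceClient w = new WorkspaceClient();

    // First page: a hybrid (text + vector) query against a placeholder index.
    QueryVectorIndexResponse page =
        w.vectorSearchIndexes()
            .queryIndex(
                new QueryVectorIndexRequest()
                    .setIndexName("main.default.my_index") // placeholder index name
                    .setColumns(Arrays.asList("id", "text")) // placeholder columns
                    .setQueryText("quarterly revenue")
                    .setQueryType("HYBRID") // new in v0.27.0; omitting it defaults to ANN
                    .setNumResults(1000L));

    // Results beyond the first 1000 are fetched with the new queryNextPage() method,
    // feeding back the next_page_token from the previous response until it is empty.
    while (page.getNextPageToken() != null && !page.getNextPageToken().isEmpty()) {
      page =
          w.vectorSearchIndexes()
              .queryNextPage(
                  new QueryVectorIndexNextPageRequest()
                      .setIndexName("main.default.my_index")
                      .setPageToken(page.getNextPageToken()));
      // process page.getResult() ...
    }
  }
}
```

Because `queryNextPage()` posts to `/query-next-page` with only the index name and page token, the original query parameters do not need to be resent on subsequent pages.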