
Commit 3b354d9

Merge pull request #17 from data-platform-hq/fix_refactor
feat: module refactor
2 parents: 0a3a93d + 63970cb

File tree

5 files changed: 4 additions, 71 deletions

README.md

Lines changed: 1 addition & 6 deletions
```diff
@@ -117,8 +117,7 @@ No modules.
 | [databricks_token.pat](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/token) | resource |
 | [databricks_user.this](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/user) | resource |
 | [azurerm_role_assignment.this](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/role_assignment) | resource |
-| [databricks_cluster_policy.this](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/cluster_policy) | resource |
-| [databricks_cluster.this](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/cluster) | resource |
+| [databricks_cluster.this](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/cluster) | resource |
 | [databricks_mount.adls](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mount) | resource |
 | [databricks_secret_scope.main](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/secret_scope) | resource |
 | [databricks_secret.main](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/secret) | resource |
@@ -134,11 +133,9 @@ No modules.
 | <a name="input_sp_key_secret_name"></a> [sp\_key\_secret\_name](#input\_sp\_key\_secret\_name) | The name of Azure Key Vault secret that contains client secret of Service Principal to access in Azure Key Vault | `string` | n/a | yes |
 | <a name="input_tenant_id_secret_name"></a> [tenant\_id\_secret\_name](#input\_tenant\_id\_secret\_name) | The name of Azure Key Vault secret that contains tenant ID secret of Service Principal to access in Azure Key Vault | `string` | n/a | yes |
 | <a name="input_key_vault_id"></a> [key\_vault\_id](#input\_key\_vault\_id) | ID of the Key Vault instance where the Secret resides | `string` | n/a | yes |
-| <a name="input_sku"></a> [sku](#input\_sku) | The sku to use for the Databricks Workspace: [standard \ premium \ trial] | `string` | "standard" | no |
 | <a name="input_pat_token_lifetime_seconds"></a> [pat\_token\_lifetime\_seconds](#input\_pat\_token\_lifetime\_seconds) | The lifetime of the token, in seconds. If no lifetime is specified, the token remains valid indefinitely | `number` | 315569520 | no |
 | <a name="input_users"></a> [users](#input\_users) | List of users to access Databricks | `list(string)` | [] | no |
 | <a name="input_permissions"></a> [permissions](#input\_permissions) | Databricks Workspace permission maps | `list(map(string))` | <pre> [{ <br> object_id = null <br> role = null <br> }] </pre> | no |
-| <a name="input_custom_cluster_policies"></a> [custom\_cluster\_policies](#input\_custom\_cluster\_policies) | Provides an ability to create custom cluster policy, assign it to cluster and grant CAN_USE permissions on it to certain custom groups | <pre>list(object({<br> name = string<br> can_use = list(string)<br> definition = any<br> assigned = bool<br>}))</pre> | <pre>[{<br> name = null<br> can_use = null<br> definition = null<br> assigned = false<br>}]</pre> | no |
 | <a name="input_cluster_nodes_availability"></a> [cluster\_nodes\_availability](#input\_cluster\_nodes\_availability) | Availability type used for all subsequent nodes past the first_on_demand ones: [SPOT_AZURE \ SPOT_WITH_FALLBACK_AZURE \ ON_DEMAND_AZURE] | `string` | null | no |
 | <a name="input_first_on_demand"></a> [first\_on\_demand](#input\_first\_on\_demand) | The first first_on_demand nodes of the cluster will be placed on on-demand instances: [[ \:number ]] | `number` | 0 | no |
 | <a name="input_spot_bid_max_price"></a> [spot\_bid\_max\_price](#input\_spot\_bid\_max\_price) | The max price for Azure spot instances. Use -1 to specify lowest price | `number` | -1 | no |
@@ -163,8 +160,6 @@ No modules.
 | ------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------- |
 | <a name="output_token"></a> [token](#output\_token) | Databricks Personal Authorization Token |
 | <a name="output_cluster_id"></a> [cluster\_id](#output\_cluster\_id) | Databricks Cluster Id |
-| <a name="output_cluster_policies_object"></a> [cluster\_policies\_object](#output\_cluster\_policies\_object) | Databricks Cluster Policies object map |
-| <a name="output_secret_scope_object"></a> [secret_scope\_object](#output\_secret_scope\_object) | Databricks-managed Secret Scope object map to create ACLs |
 <!-- END_TF_DOCS -->

 ## License
```

main.tf

Lines changed: 1 addition & 13 deletions
```diff
@@ -19,7 +19,7 @@ resource "databricks_token" "pat" {
 }

 resource "databricks_user" "this" {
-  for_each  = var.sku == "premium" ? [] : toset(var.users)
+  for_each  = toset(var.users)
   user_name = each.value
   lifecycle { ignore_changes = [external_id] }
 }
@@ -34,24 +34,12 @@ resource "azurerm_role_assignment" "this" {
   principal_id = each.value.object_id
 }

-resource "databricks_cluster_policy" "this" {
-  for_each = var.sku == "premium" ? {
-    for param in var.custom_cluster_policies : (param.name) => param.definition
-    if param.definition != null
-  } : {}
-
-  name       = each.key
-  definition = jsonencode(each.value)
-}
-
 resource "databricks_cluster" "this" {
   cluster_name   = var.custom_default_cluster_name == null ? "shared autoscaling" : var.custom_default_cluster_name
   spark_version  = var.spark_version
   spark_conf     = var.spark_conf
   spark_env_vars = var.spark_env_vars

-  policy_id = var.sku == "premium" ? one([for policy in var.custom_cluster_policies : databricks_cluster_policy.this[policy.name].id if policy.assigned]) : null
-
   data_security_mode      = var.data_security_mode
   node_type_id            = var.node_type
   autotermination_minutes = var.autotermination_minutes
```
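With the `databricks_cluster_policy` resource and the `policy_id` wiring removed from the module, a policy can still be managed alongside it in the root configuration. A minimal sketch, not part of this module; the policy name and definition are illustrative:

```hcl
# Hypothetical root-level cluster policy; after this refactor the module
# no longer creates or assigns policies itself.
resource "databricks_cluster_policy" "external" {
  name = "external-autoscaling-policy" # illustrative name

  # Databricks Policy Definition Language document, JSON-encoded
  definition = jsonencode({
    "autotermination_minutes" = {
      type  = "fixed"
      value = 30
    }
  })
}
```

The `name`/`definition = jsonencode(...)` shape matches the resource the diff removes, so existing policy definitions can be lifted out of the module call largely unchanged.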

outputs.tf

Lines changed: 0 additions & 17 deletions
```diff
@@ -7,20 +7,3 @@ output "cluster_id" {
   value       = databricks_cluster.this.id
   description = "Databricks Cluster Id"
 }
-
-output "cluster_policies_object" {
-  value = [for policy in var.custom_cluster_policies : {
-    id      = databricks_cluster_policy.this[policy.name].id
-    name    = databricks_cluster_policy.this[policy.name].name
-    can_use = policy.can_use
-  } if policy.definition != null && var.sku == "premium"]
-  description = "Databricks Cluster Policies object map"
-}
-
-output "secret_scope_object" {
-  value = [for param in var.secret_scope : {
-    scope_name = databricks_secret_scope.this[param.scope_name].name
-    acl        = param.acl
-  } if param.acl != null]
-  description = "Databricks-managed Secret Scope object map to create ACLs"
-}
```

secrets.tf

Lines changed: 2 additions & 2 deletions
```diff
@@ -14,7 +14,7 @@ locals {
 # Secret Scope with SP secrets for mounting Azure Data Lake Storage
 resource "databricks_secret_scope" "main" {
   name                     = "main"
-  initial_manage_principal = var.sku == "premium" ? null : "users"
+  initial_manage_principal = "users"
 }

 resource "databricks_secret" "main" {
@@ -33,7 +33,7 @@ resource "databricks_secret_scope" "this" {
   }

   name                     = each.key
-  initial_manage_principal = var.sku == "premium" ? null : "users"
+  initial_manage_principal = "users"
 }

 resource "databricks_secret" "this" {
```
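After this change every scope is created with `initial_manage_principal = "users"`, which grants MANAGE on the scope to all workspace users ("users" is the only value the Databricks API accepts for this attribute). A hedged sketch of adding an extra secret to the module-managed `main` scope from a root configuration; the key and variable names are assumptions:

```hcl
# Hypothetical extra secret in the "main" scope created by the module.
resource "databricks_secret" "extra" {
  scope        = "main"          # scope name hard-coded by the module
  key          = "storage-sas"   # illustrative key name
  string_value = var.storage_sas # assumed root-level variable
}
```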

variables.tf

Lines changed: 0 additions & 33 deletions
```diff
@@ -23,12 +23,6 @@ variable "key_vault_id" {
   description = "ID of the Key Vault instance where the Secret resides"
 }

-variable "sku" {
-  type        = string
-  description = "The sku to use for the Databricks Workspace: [standard|premium|trial]"
-  default     = "standard"
-}
-
 variable "pat_token_lifetime_seconds" {
   type        = number
   description = "The lifetime of the token, in seconds. If no lifetime is specified, the token remains valid indefinitely"
@@ -52,33 +46,6 @@ variable "permissions" {
   ]
 }

-# Cluster policy variables
-variable "custom_cluster_policies" {
-  type = list(object({
-    name       = string
-    can_use    = list(string)
-    definition = any
-    assigned   = bool
-  }))
-  description = <<-EOT
-    Provides an ability to create custom cluster policy, assign it to cluster and grant CAN_USE permissions on it to certain custom groups
-    name - name of custom cluster policy to create
-    can_use - list of string, where values are custom group names, there groups have to be created with Terraform;
-    definition - JSON document expressed in Databricks Policy Definition Language. No need to call 'jsonencode()' function on it when providing a value;
-    assigned - boolean flag which assigns policy to default 'shared autoscaling' cluster, only single custom policy could be assigned;
-  EOT
-  default = [{
-    name       = null
-    can_use    = null
-    definition = null
-    assigned   = false
-  }]
-  validation {
-    condition     = length([for policy in var.custom_cluster_policies : policy.assigned if policy.assigned]) <= 1
-    error_message = "Only single cluster policy assignment allowed. Please set 'assigned' parameter to 'true' for exact one or none policy"
-  }
-}
-
 # Shared autoscaling cluster config variables
 variable "cluster_nodes_availability" {
   type        = string
```
