Describe the bug
A dbt job running in a workflow terminates with the error Max retries exceeded with url: ... (Caused by ResponseError('too many 503 error responses')) just as it starts sending SQL commands to the cluster.
No changes have been made to the code or YAML files.
This occurs only with a SQL Warehouse (Pro), not with a Serverless SQL Warehouse.
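For reference, the same connection can be exercised outside dbt with a minimal sketch (not part of the original job) using the databricks-sql-connector that dbt-databricks builds on; the host and HTTP path are taken from the error message further down, and the access token is a placeholder:

from databricks import sql

# Hostname and HTTP path copied from the error message below;
# the access token is a placeholder, not a real credential.
with sql.connect(
    server_hostname="adb-130132662866554.14.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/660a2880f1cab4fb",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchall())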
Steps To Reproduce
The problem occurs in a workflow in a Databricks workspace with the following settings:
Running on Azure, Databricks Premium, no Unity Catalog
Job cluster: single node, Standard_DS3_v2
SQL Warehouse (work cluster): Pro, X-Small, Cluster count: Active 0 / Min 1 / Max 1, Channel: Current, Cost optimized
Git source: Azure DevOps
Library version setting: dbt-databricks>=1.0.0,<2.0.0
Start the workflow. After the job cluster has been created and the SQL Warehouse has started, the following error is shown in the log:
dbt deps --profiles-dir ../misc/misc/ -t prod
10:14:06 Running with dbt=1.9.1
10:14:07 Updating lock file in file path: /tmp/tmp-dbt-run-126728701395951/piab/dbt/package-lock.yml
10:14:07 Installing calogica/dbt_expectations
10:14:07 Installed from version 0.10.4
10:14:07 Up to date!
10:14:07 Installing dbt-labs/dbt_utils
10:14:08 Installed from version 1.1.1
10:14:08 Updated version available: 1.3.0
10:14:08 Installing calogica/dbt_date
10:14:08 Installed from version 0.10.1
10:14:08 Up to date!
10:14:08
10:14:08 Updates available for packages: ['dbt-labs/dbt_utils']
Update your versions in packages.yml, then run dbt deps
dbt build --profiles-dir ../misc/misc/ -t prod -f
10:14:10 Running with dbt=1.9.1
10:14:11 Registered adapter: databricks=1.9.1
10:14:12 Unable to do partial parsing because saved manifest not found. Starting full parse.
10:14:31 Found 435 models, 103 snapshots, 1 analysis, 8 seeds, 1559 data tests, 123 sources, 8 exposures, 999 macros
10:14:32
10:14:32 Concurrency: 12 threads (target='prod')
10:14:32
10:14:58
10:14:58 Finished running in 0 hours 0 minutes and 26.49 seconds (26.49s).
10:14:58 Encountered an error:
Database Error
HTTPSConnectionPool(host='adb-130132662866554.14.azuredatabricks.net', port=443): Max retries exceeded with url: /sql/1.0/warehouses/660a2880f1cab4fb (Caused by ResponseError('too many 503 error responses'))
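The 503 responses come from the HTTP layer underneath the adapter and may simply mean the warehouse is still warming up from 0 active clusters when dbt starts sending commands. A hypothetical way to check this (not part of the original run; the token handling below is an assumption) is to poll the warehouse state through the SQL Warehouses REST API while the job is connecting:

import os
import requests

host = "https://adb-130132662866554.14.azuredatabricks.net"  # from the error above
warehouse_id = "660a2880f1cab4fb"                             # from the error above
token = os.environ["DATABRICKS_TOKEN"]                        # assumed to be available

response = requests.get(
    f"{host}/api/2.0/sql/warehouses/{warehouse_id}",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
response.raise_for_status()
# Prints e.g. STARTING while the warehouse spins up, RUNNING once it is ready.
print(response.json().get("state"))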
Expected behavior
The dbt-databricks workflow starts without any error, as shown below:
dbt deps --profiles-dir ../misc/misc/ -t prod
10:20:37 Running with dbt=1.9.1
10:20:37 Updating lock file in file path: /tmp/tmp-dbt-run-355636934123336/piab/dbt/package-lock.yml
10:20:37 Installing calogica/dbt_expectations
10:20:38 Installed from version 0.10.4
10:20:38 Up to date!
10:20:38 Installing dbt-labs/dbt_utils
10:20:38 Installed from version 1.1.1
10:20:38 Updated version available: 1.3.0
10:20:38 Installing calogica/dbt_date
10:20:38 Installed from version 0.10.1
10:20:38 Up to date!
10:20:38
10:20:38 Updates available for packages: ['dbt-labs/dbt_utils']
Update your versions in packages.yml, then run dbt deps
dbt build --profiles-dir ../misc/misc/ -t prod -f
10:20:41 Running with dbt=1.9.1
10:20:42 Registered adapter: databricks=1.9.1
10:20:42 Unable to do partial parsing because saved manifest not found. Starting full parse.
10:21:01 Found 435 models, 103 snapshots, 1 analysis, 8 seeds, 1559 data tests, 123 sources, 8 exposures, 999 macros
10:21:02
10:21:02 Concurrency: 12 threads (target='prod')
10:21:02
10:21:19 1 of 1986 START sql table model staging.rollup12helper ......................... [RUN]
10:21:19 2 of 1986 START sql table model staging.rollup24helper ......................... [RUN]
Screenshots and log output
NA
System information
The output of dbt --version:
dbt=1.9.1
Registered adapter: databricks=1.9.1
The operating system you're using:
NA
The output of python --version:
NA
Additional context
NA