
Conversation


@radhikaathalye-db commented Sep 16, 2025

Changes

What does this PR do?

Adds a new method to the DashboardManager class that uploads a DuckDB extract to a Unity Catalog (UC) volume, along with unit tests for the new method.

Relevant implementation details

Caveats/things to watch out for when reviewing:

Linked issues

Resolves #..

Functionality

  • added relevant user documentation
  • added new CLI command
  • modified existing command: databricks labs lakebridge ...
  • ... +add your own

Tests

  • manually tested
  • added unit tests
  • added integration tests


github-actions bot commented Sep 16, 2025

❌ 25/27 passed, 2 failed, 1m18s total

❌ test_transpile_sql_file: AssertionError (8.189s)
AssertionError
[gw5] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
20:04 INFO [databricks.labs.lakebridge.install] Installing Databricks bladebridge transpiler (v0.1.18)
20:04 DEBUG [databricks.labs.lakebridge.install] Created virtual environment with context: namespace(env_dir=PosixPath('/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv'), env_name='.venv', prompt='(bladebridge) ', executable='/home/runner/work/lakebridge/lakebridge/.venv/bin/python', python_dir='/home/runner/work/lakebridge/lakebridge/.venv/bin', python_exe='python', inc_path='/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/include', bin_path='/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin', bin_name='bin', env_exe='/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python', env_exec_cmd='/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python')
20:04 INFO [databricks.labs.lakebridge.install] Successfully installed bladebridge transpiler (v0.1.18)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Detected virtual environment to use at: /tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Using PATH for launching LSP server: /tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin:/home/runner/work/lakebridge/lakebridge/.venv/bin:/opt/hostedtoolcache/Python/3.10.18/x64/bin:/opt/hostedtoolcache/Python/3.10.18/x64:/snap/bin:/home/runner/.local/bin:/opt/pipx_bin:/home/runner/.cargo/bin:/home/runner/.config/composer/vendor/bin:/usr/local/.ghcup/bin:/home/runner/.dotnet/tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Starting LSP engine: /tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python ['-m', 'databricks.labs.bladebridge.server', '--log_level=NOTSET'] (cwd=/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] LSP init params: InitializeParams(capabilities=ClientCapabilities(workspace=None, text_document=None, notebook_document=None, window=None, general=None, experimental=None), process_id=3674, client_info=ClientInfo(name='lakebridge', version='0.10.8+2720251023200421'), locale=None, root_path=None, root_uri='file:///home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration', initialization_options={'remorph': {'source-dialect': 'teradata'}, 'options': None, 'custom': {}}, trace=None, work_done_token=None, workspace_folders=None)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Registered capability: document/transpileToDatabricks
20:04 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: TEST_DEFAULT_WAREHOUSE_ID
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Starting to process input directory: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiling files from folder: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration -> /tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/output
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Processing next 2 files: [PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql')]
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Started processing file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished transpiling file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (result: TranspileResult(transpiled_code='select cole(hello) world from table;\n', success_count=1, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-H', 'c4fd9b82019fc13a3e052d5f0b55060d68fa8a45']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=2, character=0)))]))
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 1)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Started processing file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished transpiling file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (result: TranspileResult(transpiled_code="CREATE TABLE REF_TABLE\n    ,NO FALLBACK\n(\n    col1    BYTEINT NOT NULL,\n    col2    SMALLINT NOT NULL,\n    col3    INTEGER NOT NULL,\n    col4    BIGINT NOT NULL,\n    col5    DECIMAL(10,2) NOT NULL,\n    col6    DECIMAL(18,4) NOT NULL,\n    col7    TIMESTAMP(1) NOT NULL,\n    col8    TIME,\n    col9    TIMESTAMP(5) WITH TIME ZONE NOT NULL,\n    col10   CHAR(01) NOT NULL,\n    col11   CHAR(04) NOT NULL,\n    col12   CHAR(4),\n    col13   DECIMAL(10,0) NOT NULL,\n    col14   DECIMAL(18,6) NOT NULL,\n    col15   DECIMAL(18,1) NOT NULL DEFAULT 0.0,\n    col16   DATE FORMAT 'YY/MM/DD',\n    col17   VARCHAR(30) NOT CASESPECIFIC,\n    col18   FLOAT NOT NULL\n    )\n    UNIQUE PRIMARY INDEX (col1, col3);\n", success_count=1, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/create_ddl.sql', '-s', 'TERADATA', '-H', '4fa5245083bb70b9886aef825a472ff168fa8a46']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=24, character=0)))]))
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (errors: 1)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiler results: TranspileStatus(file_list=[PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql')], no_of_transpiled_queries=2, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-H', 'c4fd9b82019fc13a3e052d5f0b55060d68fa8a45']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=2, character=0))), TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/create_ddl.sql', '-s', 'TERADATA', '-H', '4fa5245083bb70b9886aef825a472ff168fa8a46']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=24, character=0)))])
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 0
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiler Status: {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
[gw5] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_sql_file: AssertionError (10.968s)
AssertionError
[gw5] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
20:04 DEBUG [databricks.labs.lakebridge.install] Maven metadata for com.databricks.labs:databricks-morph-plugin: b'<?xml version="1.0" encoding="UTF-8"?>\n<metadata>\n  <groupId>com.databricks.labs</groupId>\n  <artifactId>databricks-morph-plugin</artifactId>\n  <versioning>\n    <latest>0.6.9</latest>\n    <release>0.6.9</release>\n    <versions>\n      <version>0.2.0</version>\n      <version>0.3.0</version>\n      <version>0.4.0</version>\n      <version>0.5.0</version>\n      <version>0.6.0</version>\n      <version>0.6.1</version>\n      <version>0.6.2</version>\n      <version>0.6.3</version>\n      <version>0.6.4</version>\n      <version>0.6.5</version>\n      <version>0.6.6</version>\n      <version>0.6.7</version>\n      <version>0.6.8</version>\n      <version>0.6.9</version>\n    </versions>\n    <lastUpdated>20251016201151</lastUpdated>\n  </versioning>\n</metadata>\n'
20:04 INFO [databricks.labs.lakebridge.install] Installing Databricks morpheus transpiler (v0.6.9)
20:04 DEBUG [databricks.labs.lakebridge.install] Downloaded maven artefact from https://repo.maven.apache.org/maven2/com/databricks/labs/databricks-morph-plugin/0.6.9/databricks-morph-plugin-0.6.9.jar to /tmp/tmpeg6uwquv
20:04 DEBUG [databricks.labs.lakebridge.install] Moving /tmp/tmpeg6uwquv to /tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file1/labs/remorph-transpilers/morpheus/lib/databricks-morph-plugin.jar
20:04 INFO [databricks.labs.lakebridge.install] Successfully installed: com.databricks.labs:databricks-morph-plugin:0.6.9
20:04 INFO [databricks.labs.lakebridge.install] Successfully installed morpheus transpiler (v0.6.9)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Using PATH for launching LSP server: /home/runner/work/lakebridge/lakebridge/.venv/bin:/opt/hostedtoolcache/Python/3.10.18/x64/bin:/opt/hostedtoolcache/Python/3.10.18/x64:/snap/bin:/home/runner/.local/bin:/opt/pipx_bin:/home/runner/.cargo/bin:/home/runner/.config/composer/vendor/bin:/usr/local/.ghcup/bin:/home/runner/.dotnet/tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Starting LSP engine: /usr/bin/java ['-jar', 'databricks-morph-plugin.jar', '--log_level=NOTSET'] (cwd=/tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file1/labs/remorph-transpilers/morpheus/lib)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] LSP init params: InitializeParams(capabilities=ClientCapabilities(workspace=None, text_document=None, notebook_document=None, window=None, general=None, experimental=None), process_id=3674, client_info=ClientInfo(name='lakebridge', version='0.10.8+2720251023200424'), locale=None, root_path=None, root_uri='file:///home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration', initialization_options={'remorph': {'source-dialect': 'snowflake'}, 'options': None, 'custom': {}}, trace=None, work_done_token=None, workspace_folders=None)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Registered capability: document/transpileToDatabricks
20:04 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: TEST_DEFAULT_WAREHOUSE_ID
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Starting to process input directory: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiling files from folder: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration -> /tmp/pytest-of-runner/pytest-0/popen-gw5/test_transpile_sql_file1/output
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Processing next 2 files: [PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql'), PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/create_ddl.sql')]
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Started processing file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished transpiling file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql (result: TranspileResult(transpiled_code='SELECT\n  COLE(hello) AS world\nFROM\n  table;', success_count=1, error_list=[TranspileError(code='None', kind=<ErrorKind.INTERNAL: 'INTERNAL'>, severity=<ErrorSeverity.WARNING: 'WARNING'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql'), message='The call to function COLE is passed through without change. Check that it is exactly equivalent in Databricks SQL', range=CodeRange(start=CodePosition(line=1, character=2), end=CodePosition(line=1, character=7)))]))
20:04 DEBUG [databricks.labs.lakebridge.helpers.validation] Validating query with catalog catalog and schema schema
20:04 DEBUG [databricks.labs.lakebridge.helpers.validation] Analysis Exception : NOT IGNORED: Flag as Function Missing error [UNRESOLVED_ROUTINE] Cannot resolve routine `COLE` on search path [`system`.`builtin`, `system`.`session`, `catalog`.`schema`].
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished validating transpiled code for file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql (result: ValidationResult(validated_sql='-------------- Exception Start-------------------\n/*\n[UNRESOLVED_ROUTINE] Cannot resolve routine `COLE` on search path [`system`.`builtin`, `system`.`session`, `catalog`.`schema`].\n*/\nSELECT\n  COLE(hello) AS world\nFROM\n  table;\n ---------------Exception End --------------------\n', exception_msg='[UNRESOLVED_ROUTINE] Cannot resolve routine `COLE` on search path [`system`.`builtin`, `system`.`session`, `catalog`.`schema`].'))
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql (errors: 2)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Started processing file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/create_ddl.sql
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished transpiling file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/create_ddl.sql (result: TranspileResult(transpiled_code='CREATE TABLE employee (\n  employee_id DECIMAL(38, 0),\n  first_name VARCHAR(50) NOT NULL,\n  last_name VARCHAR(50) NOT NULL,\n  birth_date DATE,\n  hire_date DATE,\n  salary DECIMAL(10, 2),\n  department_id DECIMAL(38, 0),\n  remarks VARIANT\n);', success_count=1, error_list=[]))
20:04 DEBUG [databricks.labs.lakebridge.helpers.validation] Validating query with catalog catalog and schema schema
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished validating transpiled code for file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/create_ddl.sql (result: ValidationResult(validated_sql='CREATE TABLE employee (\n  employee_id DECIMAL(38, 0),\n  first_name VARCHAR(50) NOT NULL,\n  last_name VARCHAR(50) NOT NULL,\n  birth_date DATE,\n  hire_date DATE,\n  salary DECIMAL(10, 2),\n  department_id DECIMAL(38, 0),\n  remarks VARIANT\n);', exception_msg=None))
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/create_ddl.sql (errors: 0)
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiler results: TranspileStatus(file_list=[PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql'), PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/create_ddl.sql')], no_of_transpiled_queries=2, error_list=[TranspileError(code='None', kind=<ErrorKind.INTERNAL: 'INTERNAL'>, severity=<ErrorSeverity.WARNING: 'WARNING'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql'), message='The call to function COLE is passed through without change. Check that it is exactly equivalent in Databricks SQL', range=CodeRange(start=CodePosition(line=1, character=2), end=CodePosition(line=1, character=7))), TranspileError(code='VALIDATION_ERROR', kind=<ErrorKind.VALIDATION: 'VALIDATION'>, severity=<ErrorSeverity.WARNING: 'WARNING'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/snowflake/integration/dummy_function.sql'), message='[UNRESOLVED_ROUTINE] Cannot resolve routine `COLE` on search path [`system`.`builtin`, `system`.`session`, `catalog`.`schema`].', range=None)])
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 1
20:04 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiler Status: {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': None}
20:04 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
[gw5] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python

Running from acceptance #2738

@gueniai added the feat/profiler label Sep 16, 2025

# Validate inputs
if not os.path.exists(local_file_path):
    print(f"Error: Local file not found: {local_file_path}")
Contributor

We'll want to use logger.error() statements here as opposed to printing to the console. Here is a good reference: https://github.com/databrickslabs/lakebridge/blob/main/src/databricks/labs/lakebridge/assessments/configure_assessment.py#L43
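
A minimal sketch of the suggested change (the helper name and return convention are illustrative, not taken from the PR):

import logging
import os

logger = logging.getLogger(__name__)

def _validate_local_file(local_file_path: str) -> bool:
    """Return True when the local DuckDB extract exists on disk."""
    if not os.path.exists(local_file_path):
        # Route the failure through the module logger instead of print(),
        # so it lands in the standard lakebridge log output.
        logger.error("Local file not found: %s", local_file_path)
        return False
    return True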

url = f"{workspace_url}/api/2.0/fs/files{volume_path}"

with open(local_file_path, 'rb') as f:
response = requests.put(url, headers=headers, data=f)
Contributor

@radhikaathalye-db could you pull the upstream changes from the feature/add_local_dashboards branch? We'll want to use the Python SDK (WorkspaceClient) to upload files to the UC volume rather than the REST API. Thanks!
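
For reference, a minimal sketch of an SDK-based upload (the function name mirrors the PR description; the exact signature and how the client is wired into DashboardManager may differ):

from databricks.sdk import WorkspaceClient

def upload_duckdb_to_uc_volume(ws: WorkspaceClient, local_file_path: str, volume_path: str) -> None:
    # The SDK resolves authentication and the Files API endpoint,
    # so no hand-built REST URL or requests.put call is needed.
    with open(local_file_path, "rb") as f:
        ws.files.upload(volume_path, f, overwrite=True)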

@patch("os.path.exists")
@patch("builtins.open", new_callable=MagicMock)
def test_upload_duckdb_to_uc_volume_failure(mock_open, mock_exists, dashboard_manager):
mock_exists.return_value = True
Contributor

Wouldn't overwrite=True avoid this type of scenario?
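
A sketch of a test asserting that behavior (the _ws attribute and the method signature are assumptions for illustration; the real DashboardManager internals may differ):

from unittest.mock import MagicMock, mock_open, patch

@patch("os.path.exists", return_value=True)
@patch("builtins.open", new_callable=mock_open, read_data=b"duckdb bytes")
def test_upload_duckdb_to_uc_volume_overwrites(mock_file, mock_exists, dashboard_manager):
    # Hypothetical: assumes the manager keeps its WorkspaceClient on `_ws`.
    dashboard_manager._ws = MagicMock()
    dashboard_manager.upload_duckdb_to_uc_volume(
        "/tmp/extract.duckdb", "/Volumes/main/default/dashboards/extract.duckdb"
    )
    # With overwrite=True, uploading over a pre-existing file should not fail.
    _, kwargs = dashboard_manager._ws.files.upload.call_args
    assert kwargs.get("overwrite") is True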

@goodwillpunning force-pushed the feature/add_local_dashboards branch from 2dcdf50 to 25c1724 on October 29, 2025 21:22

Labels

feat/profiler (Issues related to profilers), stacked PR (Should be reviewed, but not merged)
