
Commit dfee24d

Remove WorkflowLinter

1 parent f58aa24

File tree

5 files changed: +9 −70 lines


README.md

Lines changed: 3 additions & 19 deletions
@@ -401,9 +401,8 @@ which can be used for further analysis and decision-making through the [assessme
 9. `assess_pipelines`: This task scans through all the Pipelines and identifies those pipelines that have Azure Service Principals embedded in their configurations. A list of all the pipelines with matching configurations is stored in the `$inventory.pipelines` table.
 10. `assess_azure_service_principals`: This task scans through all the clusters configurations, cluster policies, job cluster configurations, Pipeline configurations, and Warehouse configuration and identifies all the Azure Service Principals who have been given access to the Azure storage accounts via spark configurations referred in those entities. The list of all the Azure Service Principals referred in those configurations is saved in the `$inventory.azure_service_principals` table.
 11. `assess_global_init_scripts`: This task scans through all the global init scripts and identifies if there is an Azure Service Principal who has been given access to the Azure storage accounts via spark configurations referred in those scripts.
-12. `assess_dashboards`: This task scans through all the dashboards and analyzes embedded queries for migration problems. It also collects direct filesystem access patterns that require attention.
-13. `assess_workflows`: This task scans through all the jobs and tasks and analyzes notebooks and files for migration problems. It also collects direct filesystem access patterns that require attention.
-
+12. `assess_dashboards`: This task scans through all the dashboards and analyzes embedded queries for migration problems which it persists in `$inventory_database.query_problems`. It also collects direct filesystem access patterns that require attention which it persists in `$inventory_database.directfs_in_queries`.
+13. `assess_workflows`: This task scans through all the jobs and tasks and analyzes notebooks and files for migration problems which it persists in `$inventory_database.workflow_problems`. It also collects direct filesystem access patterns that require attention which it persists in `$inventory_database.directfs_in_paths`.
 
 ![report](docs/assessment-report.png)
 
@@ -726,25 +725,10 @@ in the Migration dashboard.
 
 [[back to top](#databricks-labs-ucx)]
 
-## Jobs Static Code Analysis Workflow
-
-The `workflow-linter` workflow lints accessible code from 2 sources:
-- all workflows/jobs present in the workspace
-- all dashboards/queries present in the workspace
-The linting emits problems indicating what to resolve for making the code Unity Catalog compatible.
-The linting also locates direct filesystem access that need to be migrated.
-
-Once the workflow completes:
-- problems are stored in the `$inventory_database.workflow_problems`/`$inventory_database.query_problems` table
-- direct filesystem access are stored in the `$inventory_database.directfs_in_paths`/`$inventory_database.directfs_in_queries` table
-- all the above are displayed in the Migration dashboard.
+### Linter message codes
 
 ![code compatibility problems](docs/code_compatibility_problems.png)
 
-[[back to top](#databricks-labs-ucx)]
-
-### Linter message codes
-
 Here's the detailed explanation of the linter message codes:
 
 #### `cannot-autofix-table-reference`
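The rewritten README entries above name four findings tables. As an illustrative sketch of how those persisted results could be inspected, the helper below renders COUNT queries against them; the `findings_queries` helper and the `ucx` database name are hypothetical, and only the table names come from the README text:

```python
# Hypothetical helper: renders SELECT statements against the four findings
# tables named in the README. The inventory database name is a placeholder.
FINDINGS_TABLES = (
    "query_problems",
    "workflow_problems",
    "directfs_in_queries",
    "directfs_in_paths",
)

def findings_queries(inventory_database: str) -> dict[str, str]:
    """Map each findings table to a COUNT query against it."""
    return {
        table: f"SELECT COUNT(*) AS count FROM {inventory_database}.{table}"
        for table in FINDINGS_TABLES
    }
```

Running `findings_queries("ucx")` would yield one query string per table, e.g. `SELECT COUNT(*) AS count FROM ucx.workflow_problems`.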

src/databricks/labs/ucx/assessment/workflows.py

Lines changed: 6 additions & 6 deletions
@@ -184,20 +184,20 @@ def crawl_groups(self, ctx: RuntimeContext):
         ctx.group_manager.snapshot()
 
     @job_task
-    def assess_workflows(self, ctx: RuntimeContext):
-        """Scans all jobs for migration issues in notebooks jobs.
+    def assess_dashboards(self, ctx: RuntimeContext):
+        """Scans all dashboards for migration issues in SQL code of embedded widgets.
 
         Also, stores direct filesystem accesses for display in the migration dashboard.
         """
-        ctx.workflow_linter.refresh_report(ctx.sql_backend, ctx.inventory_database)
+        ctx.query_linter.refresh_report(ctx.sql_backend, ctx.inventory_database)
 
     @job_task
-    def assess_dashboards(self, ctx: RuntimeContext):
-        """Scans all dashboards for migration issues in SQL code of embedded widgets.
+    def assess_workflows(self, ctx: RuntimeContext):
+        """Scans all jobs for migration issues in notebooks jobs.
 
         Also, stores direct filesystem accesses for display in the migration dashboard.
         """
-        ctx.query_linter.refresh_report(ctx.sql_backend, ctx.inventory_database)
+        ctx.workflow_linter.refresh_report(ctx.sql_backend, ctx.inventory_database)
 
 
 class Failing(Workflow):
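The reordered methods above rely on UCX's `@job_task` decorator to register workflow steps. A minimal sketch of that registration pattern — illustrative only, not the actual UCX implementation — might look like this:

```python
# Illustrative sketch of a job-task registry, loosely modeled on the
# @job_task pattern visible in the diff. Not the real UCX decorator.
def job_task(fn):
    fn._is_job_task = True  # mark the method so the workflow can enumerate it
    return fn

class Workflow:
    @classmethod
    def tasks(cls) -> list[str]:
        """Collect method names marked with @job_task, in definition order."""
        return [
            name for name, attr in vars(cls).items()
            if getattr(attr, "_is_job_task", False)
        ]

class Assessment(Workflow):
    @job_task
    def assess_dashboards(self, ctx):
        ctx["query_linter_ran"] = True  # stand-in for query_linter.refresh_report

    @job_task
    def assess_workflows(self, ctx):
        ctx["workflow_linter_ran"] = True  # stand-in for workflow_linter.refresh_report
```

The decorator only tags the function; the workflow class later enumerates tagged methods to schedule them as job tasks.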

src/databricks/labs/ucx/runtime.py

Lines changed: 0 additions & 2 deletions
@@ -20,7 +20,6 @@
 )
 from databricks.labs.ucx.progress.workflows import MigrationProgress
 from databricks.labs.ucx.recon.workflows import MigrationRecon
-from databricks.labs.ucx.source_code.workflows import WorkflowLinter
 from databricks.labs.ucx.workspace_access.workflows import (
     GroupMigration,
     PermissionsMigrationAPI,
@@ -58,7 +57,6 @@ def all(cls):
             ScanTablesInMounts(),
             MigrateTablesInMounts(),
             PermissionsMigrationAPI(),
-            WorkflowLinter(),
             MigrationRecon(),
             Failing(),
         ]
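The `all()` classmethod edited above is effectively the deployment registry: removing `WorkflowLinter()` from the returned list is what unregisters the workflow. A self-contained sketch of that pattern (class names other than `WorkflowLinter`, `MigrationRecon`, and `Failing` are placeholders):

```python
# Illustrative sketch of the registry pattern the diff edits: a classmethod
# returning the workflow instances to deploy. Dropping an entry from this
# list (as the commit drops WorkflowLinter()) unregisters that workflow.
class Workflow:
    @property
    def name(self) -> str:
        return type(self).__name__

class MigrationRecon(Workflow): ...
class Failing(Workflow): ...

class Workflows:
    @classmethod
    def all(cls) -> list[Workflow]:
        # WorkflowLinter() is intentionally no longer listed here
        return [MigrationRecon(), Failing()]

    @classmethod
    def names(cls) -> list[str]:
        return [w.name for w in cls.all()]
```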

src/databricks/labs/ucx/source_code/workflows.py

Lines changed: 0 additions & 23 deletions
This file was deleted.

tests/integration/source_code/test_jobs.py

Lines changed: 0 additions & 20 deletions
@@ -32,26 +32,6 @@
 from tests.unit.source_code.test_graph import _TestDependencyGraph
 
 
-@retried(on=[NotFound], timeout=timedelta(minutes=5))
-def test_running_real_workflow_linter_job(installation_ctx, make_job) -> None:
-    # Deprecated file system path in call to: /mnt/things/e/f/g
-    job = make_job(content="spark.read.table('a_table').write.csv('/mnt/things/e/f/g')\n")
-    ctx = installation_ctx.replace(config_transform=lambda wc: replace(wc, include_job_ids=[job.job_id]))
-    ctx.workspace_installation.run()
-    ctx.deployed_workflows.run_workflow("workflow-linter")
-    ctx.deployed_workflows.validate_step("workflow-linter")
-
-    # This test merely tests that the workflows produces records of the expected types; record content is not checked.
-    cursor = ctx.sql_backend.fetch(f"SELECT COUNT(*) AS count FROM {ctx.inventory_database}.workflow_problems")
-    result = next(cursor)
-    if result['count'] == 0:
-        installation_ctx.deployed_workflows.relay_logs("workflow-linter")
-        assert False, "No workflow problems found"
-    dfsa_records = installation_ctx.directfs_access_crawler_for_paths.snapshot()
-    used_table_records = installation_ctx.used_tables_crawler_for_paths.snapshot()
-    assert dfsa_records and used_table_records
-
-
 @retried(on=[NotFound], timeout=timedelta(minutes=2))
 def test_linter_from_context(simple_ctx, make_job) -> None:
     # This code is similar to test_running_real_workflow_linter_job, but it's executed on the caller side and is easier
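The deleted integration test wrapped itself in a `retried(on=[...], timeout=...)` decorator to tolerate transient failures. A minimal, self-contained sketch of how such a decorator can work — illustrative only; the real helper used by these tests comes from the Databricks labs libraries, not this code:

```python
import time
from datetime import timedelta
from functools import wraps

def retried(on: list[type[BaseException]], timeout: timedelta):
    """Retry the wrapped function while it raises one of the given exception
    types, until the timeout elapses. Illustrative sketch only."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            deadline = time.monotonic() + timeout.total_seconds()
            while True:
                try:
                    return fn(*args, **kwargs)
                except tuple(on):
                    if time.monotonic() >= deadline:
                        raise  # timeout exhausted: surface the last error
                    time.sleep(0.01)  # short fixed backoff for the sketch
        return wrapper
    return decorator
```

The pattern matches the call sites above: `@retried(on=[NotFound], timeout=timedelta(minutes=5))` keeps re-running the test body while the workspace resource is not yet visible.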

0 commit comments
