[SPARK-51178][CONNECT][PYTHON] Raise proper PySpark error instead of … #12594
build_main.yml
on: push
- Run / Check changes (38s)
- Run / Protobuf breaking change detection and Python CodeGen check (0s)
- Run / Run TPC-DS queries with SF=1 (0s)
- Run / Run Docker integration tests (0s)
- Run / Run Spark on Kubernetes Integration test (0s)
- Run / Run Spark UI tests (0s)
- Matrix: Run / build
- Run / Build modules: sparkr (0s)
- Run / Linters, licenses, and dependencies (27m 57s)
- Run / Documentation generation (0s)
- Matrix: Run / pyspark
Annotations

3 warnings

- Run / Base image build
  Failed to save: Failed to CreateCacheEntry: Received non-retryable error: Failed request: (409) Conflict: cache entry with the same key, version, and scope already exists
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger
  No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger
  The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
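The conda warning above suggests either declaring the 'defaults' channel explicitly or removing it. A minimal sketch of what that could look like in a `conda-incubator/setup-miniconda` workflow step; the step name, action version, and Python version shown here are assumptions for illustration, not taken from this run:

```yaml
# Hypothetical workflow step illustrating the warning's suggestion.
# Action version and inputs below are assumed, not from build_main.yml.
- name: Set up Miniconda
  uses: conda-incubator/setup-miniconda@v3
  with:
    python-version: "3.11"
    # Option 1: list 'defaults' explicitly if its use is intentional
    channels: conda-forge,defaults
    # Option 2: drop the implicitly added 'defaults' channel instead
    # conda-remove-defaults: "true"
```

Either option silences the warning; the two are alternatives, so only one of the commented choices should be active at a time.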
Artifacts

Produced during runtime

Name | Size | Digest
---|---|---
apache~spark~8JXUIP.dockerbuild (Expired) | 28.7 KB | sha256:a6b46a5c6ba49d31a9251300fe3670b84efc220ee359099325c41c3fd8539c4e
apache~spark~ISOP5L.dockerbuild (Expired) | 23.1 KB | sha256:aa1f9333ce699f296d3788466adc7f06d88c5ed5696314a52615ba84069cc5f7
test-results-pyspark-connect--17-hadoop3-hive2.3-python3.11 (Expired) | 188 KB | sha256:90d0b2f5d305ad5d4623f83e398a88d4c861eaf62c092c65bbe160838bb9ebb4
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3-python3.11 (Expired) | 196 KB | sha256:4512c8d21aca8752a250f7cfa4f43b20c96fb703d40663e83565cc3bacbd60c7
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.11 (Expired) | 278 KB | sha256:d6e02c6e43bff67b324cafda40f7bb0588b365e7404008f09e44389870864a12