[SPARK-51242][CONNECT][PYTHON] Improve Column performance when DQC is… #12595
build_main.yml
on: push

Jobs:
- Run / Check changes (38s)
- Run / Protobuf breaking change detection and Python CodeGen check (0s)
- Run / Run TPC-DS queries with SF=1 (0s)
- Run / Run Docker integration tests (0s)
- Run / Run Spark on Kubernetes Integration test (0s)
- Run / Run Spark UI tests (0s)
- Matrix: Run / build
- Run / Build modules: sparkr (0s)
- Run / Linters, licenses, and dependencies (28m 51s)
- Run / Documentation generation (0s)
- Matrix: Run / pyspark
Annotations
3 warnings

- Run / Base image build:
  Failed to save: Failed to CreateCacheEntry: Received non-retryable error: Failed request: (409) Conflict: cache entry with the same key, version, and scope already exists
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger:
  No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger:
  The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
Artifacts
Produced during runtime

Name | Size | Digest
---|---|---
apache~spark~EVNLQR.dockerbuild (Expired) | 28.8 KB | sha256:f3d2a5f307da53a03328f737a09ae780dc905259e7db24914805fa392a6567d5
apache~spark~LB6VE4.dockerbuild (Expired) | 24.2 KB | sha256:2356922196e7d62e8fd9c90427bc295a544f185814436301b2b8a1428da0be17
test-results-pyspark-connect--17-hadoop3-hive2.3-python3.11 (Expired) | 189 KB | sha256:7cc63c2ee47ec691addf07b0060f91aac7c334068edc2ee9ad4e32dd4b35339b
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3-python3.11 (Expired) | 196 KB | sha256:6e412e69c483ee5fd7f114953f3b7dc102f7bfdf1dd34b1a7ea4fbb8a25291d2
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.11 (Expired) | 279 KB | sha256:13784e4a151dc6bd8ac67a7628b0f5fc8e11d6536e06ec4c31e47c5c8aa17afe