Merge remote-tracking branch 'upstream/master' into scripting-for-imp… #170
build_main.yml
on: push
Run / Check changes (35s)
Run / Protobuf breaking change detection and Python CodeGen check (1m 17s)
Run / Run TPC-DS queries with SF=1 (1h 20m)
Run / Run Docker integration tests (1h 27m)
Run / Run Spark on Kubernetes Integration test (0s)
Run / Run Spark UI tests (18s)
Matrix: Run / build
Run / Build modules: sparkr (25m 46s)
Run / Linters, licenses, and dependencies (28m 10s)
Run / Documentation generation (34m 53s)
Matrix: Run / pyspark
Annotations
3 warnings
Run / Base image build
Failed to save: Failed to CreateCacheEntry: Received non-retryable error: Failed request: (409) Conflict: cache entry with the same key, version, and scope already exists
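This 409 usually means another job in the same run already saved a cache entry under the identical key, version, and scope, so the duplicate save is rejected. A hedged sketch of one way to avoid the noise, with an explicit restore/save pair that skips the save on a hit; the step names, paths, and key are illustrative, not the actual `build_main.yml` configuration:

```yaml
# Illustrative only: restore first, remember the computed key, and only save
# when nothing was restored, so parallel jobs do not race on the same entry.
- name: Restore base image cache
  id: image-cache                      # id, path, and key are assumptions
  uses: actions/cache/restore@v4
  with:
    path: ~/.docker-image-cache
    key: base-image-${{ runner.os }}-${{ hashFiles('dev/spark-test-image/**') }}

# ... image build steps ...

- name: Save base image cache
  if: steps.image-cache.outputs.cache-hit != 'true'
  uses: actions/cache/save@v4
  with:
    path: ~/.docker-image-cache
    key: ${{ steps.image-cache.outputs.cache-primary-key }}
```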
Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
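This warning is emitted when an artifact upload step matches no files. A hedged sketch, assuming the message comes from an `actions/upload-artifact` step; the artifact name and path here are illustrative. The default `if-no-files-found: warn` is what produces the message above, and `ignore` suppresses it when an empty match is expected:

```yaml
# Illustrative upload step; only if-no-files-found differs from the default behavior.
- name: Upload test results
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: test-results-pyspark-core--17-hadoop3-hive2.3   # assumed name
    path: '**/target/test-reports/*.xml'
    if-no-files-found: ignore   # or keep 'warn' if a missing report should stay visible
```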
Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger
The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
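A hedged sketch of the remedy the message itself suggests, assuming it comes from a `conda-incubator/setup-miniconda` step; the Python version and channel list are illustrative:

```yaml
# Illustrative only: either list 'defaults' explicitly under 'channels',
# or remove it as below, which silences the implicit-'defaults' warning.
- name: Install Conda
  uses: conda-incubator/setup-miniconda@v3
  with:
    python-version: '3.11'
    channels: conda-forge
    conda-remove-defaults: 'true'
```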
Artifacts
Produced during runtime
Name | Size | Digest
---|---|---
test-results-docker-integration--17-hadoop3-hive2.3 (Expired) | 42.5 KB | sha256:f42712ec654726f092ee3f2858b80f0d2a81e1f52f9cd6e347266c00335de28a
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3-python3.11 (Expired) | 172 KB | sha256:6e05804f8cea8bc3d65f4ea3559124fa57f1c087488acc2b50d48491a7257d18