[MINOR][DOCS] Add missing backticks in `Upgrading from PySpark 3.5 to… #12599
build_main.yml
on: push
Run / Check changes (36s)
Run / Protobuf breaking change detection and Python CodeGen check (0s)
Run / Run TPC-DS queries with SF=1 (0s)
Run / Run Docker integration tests (0s)
Run / Run Spark on Kubernetes Integration test (0s)
Run / Run Spark UI tests (0s)
Matrix: Run / build
Run / Build modules: sparkr (0s)
Run / Linters, licenses, and dependencies (28m 15s)
Run / Documentation generation (0s)
Matrix: Run / pyspark
Annotations
3 warnings
Run / Base image build
Failed to save: Failed to CreateCacheEntry: Received non-retryable error: Failed request: (409) Conflict: cache entry with the same key, version, and scope already exists

Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.

Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger
The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
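The last warning above comes from the conda setup in the PySpark build job and can be silenced by either listing `defaults` explicitly in the channels or dropping the implicitly added channel. Below is a minimal sketch of the second option, assuming the job configures conda via `conda-incubator/setup-miniconda`; the step is illustrative and is not copied from `build_main.yml`.

```yaml
# Hypothetical workflow step, not taken from build_main.yml.
- name: Set up conda
  uses: conda-incubator/setup-miniconda@v3
  with:
    python-version: "3.11"
    channels: conda-forge
    # Drop the implicitly added 'defaults' channel, as the warning suggests:
    conda-remove-defaults: "true"
```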
Artifacts
Produced during runtime
| Name | Size | Digest |
|---|---|---|
| `apache~spark~G0HMM4.dockerbuild` (Expired) | 24.5 KB | sha256:6226358396c0c8489587109766386b68f48a66fab3fe0f304cb75d88d4e1e55a |
| `apache~spark~LRUXTY.dockerbuild` (Expired) | 29.2 KB | sha256:865b7a91f743eb3d1c28482c8897ef947ebf3fd392ada3b9100bb7ac5f78914f |
| `test-results-pyspark-connect--17-hadoop3-hive2.3-python3.11` (Expired) | 188 KB | sha256:a01ad6c99ab62851af778eb06f7b5d1c625bb0b90ecbc6968cce70133ef433b9 |
| `test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3-python3.11` (Expired) | 197 KB | sha256:d39846800c1dc47b7e9e332c1dd064723905e0442e0e8c1cd615260b53322c7a |
| `test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.11` (Expired) | 279 KB | sha256:b79f0096ca90cbe4ff7d3b80fe1b268d4e9f5bd167bd2442f068691d89136d8b |