[SPARK-52308][CORE] Fix the bug that the actual configs of the driver are overwritten #51019
What changes were proposed in this pull request?
After the Spark driver has started, it is no longer allowed to overwrite driver-related configurations (e.g. spark.driver.memory, spark.driver.memoryOverhead) from application code.
Why are the changes needed?
Fixes the problem where driver-related configurations were mistakenly overwritten by the user, causing the Spark UI to display incorrect values.
Does this PR introduce any user-facing change?
Yes. Driver-related configurations on the Spark UI will now show the values that actually took effect.
How was this patch tested?
Tested manually as follows.

Test command:

./bin/spark-submit --conf spark.driver.memory=10g --conf spark.driver.memoryOverhead=2g --queue dev test_pyspark.py
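The contents of test_pyspark.py are not included in this description. As a rough illustration only (the 20g/4g override values below are invented for the example), a script along these lines reproduces the issue: it overrides the driver memory settings in application code even though the driver JVM is already running with the values passed to spark-submit.

```python
# Hypothetical sketch of test_pyspark.py; the original script is not shown.
# The driver JVM was launched with spark.driver.memory=10g and
# spark.driver.memoryOverhead=2g via spark-submit, so the overrides below
# cannot change the running driver. Before this fix, they nevertheless
# overwrote the values reported by the Spark UI and the event log.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("driver-config-overwrite-test")
    .config("spark.driver.memory", "20g")         # invented override value
    .config("spark.driver.memoryOverhead", "4g")  # invented override value
    .getOrCreate()
)

# Run a trivial job so the application appears in the UI and event log.
spark.range(10).count()
spark.stop()
```

With the fix, the Environment tab should show 10g and 2g (the values the driver actually started with) rather than the in-code overrides.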
Spark UI display before the fix: (screenshot not included)

Spark UI display after the fix: (screenshot not included)
By the way, Spark does not accept double (fractional) values for memory configurations by default, so when we used Spark code to parse the event log, we ran into exactly this problem. This PR avoids that class of problem at the source.
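For context, memory configurations such as spark.driver.memory must be a whole number with an optional unit suffix; the actual parsing happens in JavaUtils.byteStringAs on the JVM side. The snippet below is a simplified Python re-creation of that rule (an illustration, not Spark's code) showing why a fractional value like 2.5g is rejected:

```python
# Simplified illustration (not Spark's actual parser) of the
# integer-plus-unit-suffix rule used for byte-size configurations.
import re

_TO_MB = {"": 1.0, "k": 1 / 1024, "m": 1.0, "g": 1024.0, "t": 1024.0 ** 2}

def byte_string_as_mb(value: str) -> float:
    """Parse '512', '512m', '10g', ... into MiB; reject fractional values."""
    m = re.fullmatch(r"([0-9]+)([kmgt]?)", value.strip().lower())
    if m is None:
        # Hit by values such as '2.5g': only whole numbers are accepted.
        raise ValueError(f"Size must be an integer with an optional unit, got {value!r}")
    number, unit = m.groups()
    return int(number) * _TO_MB[unit]

for v in ("10g", "2.5g"):
    try:
        print(v, "->", byte_string_as_mb(v), "MiB")
    except ValueError as err:
        print(v, "->", err)
```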

Was this patch authored or co-authored using generative AI tooling?
No