[SPARK-51430][PYTHON] Stop PySpark context logger from propagating logs to stdout
### What changes were proposed in this pull request?
This PR stops the PySpark context logger from propagating its log records to stdout.
### Why are the changes needed?
To improve the user experience. Currently, this logging message is printed in the PySpark shell:
```
In [1]: from pyspark.sql.functions import col
In [2]: spark.range(10).select(col("id2")).show()
{"ts": "2025-03-06 16:55:36.678", "level": "ERROR", "logger": "DataFrameQueryContextLogger", "msg": "[UNRESOLVED_COLUMN.WITH_SUGGESTION] A column, variable, or function parameter with name `id2` cannot be resolved. Did you mean one of the following? [`id`]. SQLSTATE: 42703", "context": {"file": "<ipython-input-2-bac6211f25a7>", "line": "1", "fragment": "col", "errorClass": "UNRESOLVED_COLUMN.WITH_SUGGESTION"}, "exception": {"class": "Py4JJavaError", "msg": "An error occurred while calling o34.select...
```
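The fix boils down to the standard Python `logging` pattern for keeping a dedicated logger's records out of the root logger's stdout handler. A minimal sketch (the logger name matches the one in the output above; the exact PySpark internals may differ):

```python
import logging

# Get the logger that emits the query-context error records shown above.
logger = logging.getLogger("DataFrameQueryContextLogger")

# Attach a NullHandler so records are still accepted without a
# "No handlers could be found" warning when no other handler is set.
logger.addHandler(logging.NullHandler())

# Stop records from bubbling up to ancestor loggers (and thus to any
# stdout handler configured on the root logger).
logger.propagate = False
```

With `propagate = False`, any handlers explicitly attached to this logger still receive records, so users who opt in to PySpark's structured logging keep their output; only the unsolicited duplication to stdout goes away.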
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Existing tests
### Was this patch authored or co-authored using generative AI tooling?
No
Closes #50198 from allisonwang-db/spark-51430-logging.
Authored-by: Allison Wang <allison.wang@databricks.com>
Signed-off-by: Allison Wang <allison.wang@databricks.com>