Commit 8043304
[SPARK-51430][PYTHON] Stop PySpark context logger from propagating logs to stdout
### What changes were proposed in this pull request?

This PR stops the PySpark context logger from propagating logs to stdout.

### Why are the changes needed?

To improve the user experience. Currently this logging message appears in the PySpark shell:

```
In [1]: from pyspark.sql.functions import col

In [2]: spark.range(10).select(col("id2")).show()
{"ts": "2025-03-06 16:55:36.678", "level": "ERROR", "logger": "DataFrameQueryContextLogger", "msg": "[UNRESOLVED_COLUMN.WITH_SUGGESTION] A column, variable, or function parameter with name `id2` cannot be resolved. Did you mean one of the following? [`id`]. SQLSTATE: 42703", "context": {"file": "<ipython-input-2-bac6211f25a7>", "line": "1", "fragment": "col", "errorClass": "UNRESOLVED_COLUMN.WITH_SUGGESTION"}, "exception": {"class": "Py4JJavaError", "msg": "An error occurred while calling o34.select...
```

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Existing tests

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #50198 from allisonwang-db/spark-51430-logging.

Authored-by: Allison Wang <allison.wang@databricks.com>
Signed-off-by: Allison Wang <allison.wang@databricks.com>
1 parent 1434d68 commit 8043304

File tree

1 file changed: +2 −0 lines changed
  • python/pyspark/errors/exceptions

python/pyspark/errors/exceptions/base.py

Lines changed: 2 additions & 0 deletions

```diff
@@ -159,6 +159,7 @@ def _log_exception(self) -> None:
         if context:
             if context.contextType().name == "DataFrame":
                 logger = PySparkLogger.getLogger("DataFrameQueryContextLogger")
+                logger.propagate = False
                 call_site = context.callSite().split(":")
                 line = call_site[1] if len(call_site) == 2 else ""
                 logger.exception(
@@ -170,6 +171,7 @@ def _log_exception(self) -> None:
                 )
             else:
                 logger = PySparkLogger.getLogger("SQLQueryContextLogger")
+                logger.propagate = False
                 logger.exception(
                     self.getMessage(),
                     errorClass=self.getCondition(),
```
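The fix works because of how the standard-library `logging` module (which PySpark's logger builds on) routes records: a logger with `propagate = True` hands each record up to its ancestors' handlers, which is how these JSON messages were reaching the shell's stdout handler. A minimal self-contained sketch of that mechanism, using in-memory streams in place of stdout (the logger name is reused from the diff purely for illustration):

```python
import io
import logging

# A handler on the root logger stands in for the shell's stdout handler.
root_stream = io.StringIO()
logging.getLogger().addHandler(logging.StreamHandler(root_stream))

# The named logger also has its own handler, as PySparkLogger does.
own_stream = io.StringIO()
logger = logging.getLogger("DataFrameQueryContextLogger")
logger.addHandler(logging.StreamHandler(own_stream))
logger.setLevel(logging.ERROR)

logger.error("before")   # propagate is True: record reaches BOTH handlers
logger.propagate = False
logger.error("after")    # record now stays on the logger's own handler

print("before" in root_stream.getvalue())  # True  (leaked to "stdout")
print("after" in root_stream.getvalue())   # False (leak stopped)
print("after" in own_stream.getvalue())    # True  (own handler still logs)
```

Note that `propagate = False` only suppresses the duplicate output on ancestor handlers; the logger's own handlers keep working, so structured logging consumers attached directly to the context loggers are unaffected.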
