Posted to reviews@spark.apache.org by "gerashegalov (via GitHub)" <gi...@apache.org> on 2023/03/14 01:50:56 UTC

[GitHub] [spark] gerashegalov commented on a diff in pull request #40372: [SPARK-42752][PYSPARK][SQL] Make PySpark exceptions printable during initialization

gerashegalov commented on code in PR #40372:
URL: https://github.com/apache/spark/pull/40372#discussion_r1134801751


##########
python/pyspark/errors/exceptions/captured.py:
##########
@@ -65,8 +65,15 @@ def __str__(self) -> str:
         assert SparkContext._jvm is not None
 
         jvm = SparkContext._jvm
-        sql_conf = jvm.org.apache.spark.sql.internal.SQLConf.get()
-        debug_enabled = sql_conf.pysparkJVMStacktraceEnabled()
+
+        # SPARK-42752: default to True to see issues with initialization
+        debug_enabled = True
+        try:
+            sql_conf = jvm.org.apache.spark.sql.internal.SQLConf.get()
+            debug_enabled = sql_conf.pysparkJVMStacktraceEnabled()
+        except BaseException:

Review Comment:
   I advocate for keeping the likelihood of an unhelpful, unprintable exception during initialization to a minimum. I would not want to revisit this issue for other runtime exceptions.
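
   For context, here is a minimal sketch of the defensive pattern the diff above is moving toward: default the debug flag to True, and treat any failure while reading the JVM-side conf as "still initializing". This is not the exact Spark source; the surrounding CapturedException class and its `desc`/`stacktrace` attributes are assumed here for illustration.

       class CapturedException(Exception):
           def __init__(self, desc: str, stacktrace: str):
               self.desc = desc
               self.stacktrace = stacktrace

           def __str__(self) -> str:
               # SPARK-42752: default to True so that exceptions raised
               # while the JVM/SQLConf is still initializing stay printable.
               debug_enabled = True
               try:
                   from pyspark import SparkContext

                   jvm = SparkContext._jvm
                   sql_conf = jvm.org.apache.spark.sql.internal.SQLConf.get()
                   debug_enabled = sql_conf.pysparkJVMStacktraceEnabled()
               except BaseException:
                   # Reading the conf can itself fail during initialization
                   # (e.g. _jvm is still None); fall back to the verbose
                   # default rather than raise a second, unprintable error.
                   pass

               desc = self.desc
               if debug_enabled:
                   desc = desc + "\n\nJVM stacktrace:\n" + self.stacktrace
               return str(desc)

   The key design point is that the fallback path swallows the secondary failure entirely, so str(exc) always returns something useful even before the SQLConf is reachable.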



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

