Posted to reviews@spark.apache.org by "heyihong (via GitHub)" <gi...@apache.org> on 2023/09/21 00:02:38 UTC

[GitHub] [spark] heyihong commented on pull request #43017: [SPARK-45239] Reduce default spark.connect.jvmStacktrace.maxSize

heyihong commented on PR #43017:
URL: https://github.com/apache/spark/pull/43017#issuecomment-1728573383

   > Hi @heyihong , since you already have this error enrichment PR: #42987 Does it make sense to still support this `spark.connect.jvmStacktrace.maxSize` in spark connect? Also, just wondering, why do we want to enable this `spark.sql.pyspark.jvmStacktrace.enabled` config?
   
   In the long term, no. But in the short term, we need to remain backward compatible with old clients. We should turn off `spark.sql.pyspark.jvmStacktrace.enabled` once the error enrichment feature is done, but we still need to keep the config for a while, since it is the only way for old clients to get the JVM stacktrace in error messages.
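   To illustrate the effect of a `maxSize`-style config, here is a rough sketch of truncating a JVM stacktrace string before attaching it to an error message. This is a hypothetical illustration, not Spark's actual implementation; the function name and truncation marker are made up:

   ```python
   def truncate_jvm_stacktrace(trace: str, max_size: int) -> str:
       """Truncate a stacktrace to at most max_size characters,
       appending a marker when content was dropped (illustrative only)."""
       if len(trace) <= max_size:
           return trace
       return trace[:max_size] + "\n... (truncated)"


   # Example: a long stacktrace gets cut down to the configured size.
   trace = "java.lang.RuntimeException: boom\n" + "\tat org.example.Foo.bar(Foo.java:42)\n" * 100
   short = truncate_jvm_stacktrace(trace, max_size=256)
   ```

   Lowering the default size bounds how much of the error payload is spent on stacktraces sent to old clients.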


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org
