Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/11/05 07:03:03 UTC

[GitHub] [spark] wzhfy commented on pull request #30248: [SPARK-33339][PYTHON] Pyspark application maybe hangup because of wor…

wzhfy commented on pull request #30248:
URL: https://github.com/apache/spark/pull/30248#issuecomment-722185651


    @li36909 Could you update the description? Currently it's misleading.
    @HyukjinKwon Actually, `SystemExit` is thrown because the user writes incorrect code (e.g. trying to create a SparkContext in an executor task while no SPARK_HOME is available on that node, or the udf case @li36909 mentioned). We have seen several such cases from less experienced Spark customers.
    It would be better for Spark to throw an exception to the user, rather than hanging and leaving users not knowing what to do.
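    For context, here is a minimal sketch (names are hypothetical, not taken from the PR) of the kind of incorrect user code described above: a UDF that tries to create a SparkContext on an executor. Depending on the node's setup, the worker can die with a non-Exception error such as `SystemExit` instead of surfacing a normal Python exception to the user:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import LongType

spark = SparkSession.builder.appName("bad-udf-sketch").getOrCreate()

def create_context_in_udf(x):
    # SparkContext is driver-only; attempting to build one inside a task
    # fails on the worker (e.g. when SPARK_HOME is not set on that node),
    # and the worker exits rather than raising a regular Python exception.
    from pyspark import SparkContext
    sc = SparkContext.getOrCreate()
    return x + 1

df = spark.range(3)
df.select(udf(create_context_in_udf, LongType())("id")).show()
```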




