Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/05/22 02:24:07 UTC

[GitHub] [spark] rezasafi opened a new pull request #24670: [SPARK-25139][SPARK-18406][CORE][BRANCH-2.3] Avoid NonFatals to kill the Executor in PythonRunner

URL: https://github.com/apache/spark/pull/24670
 
 
   
   ## What changes were proposed in this pull request?
   
   PySpark uses a prefetch approach: results are read from upstream and served in a separate thread. As a result, if the child operator does not consume all of the data, task cleanup may run before the Python-side read finishes. This creates a race condition: the block read locks are freed during task cleanup, and when the reader later tries to release the read lock it holds, it finds the lock already released and hits an AssertionError.
   
   We should catch the AssertionError in PythonRunner and prevent it from killing the Executor.
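   The fix can be sketched as a guard around the writer-thread body. This is a hedged illustration only: `WriterThreadSketch`, `runWriter`, and `checkForFailure` are made-up names, not the actual PythonRunner code from this patch.
   
   ```scala
   import scala.util.control.NonFatal
   
   // Hypothetical sketch (not the actual Spark patch): run the writer-thread
   // body and record any non-fatal failure, including the AssertionError from
   // the block-lock release race, so the task thread can surface it later
   // instead of letting the throwable propagate and kill the executor.
   object WriterThreadSketch {
     @volatile private var failure: Option[Throwable] = None
   
     def runWriter(body: => Unit): Unit = {
       try {
         body
       } catch {
         // NonFatal matches AssertionError: it only excludes truly fatal
         // throwables (VirtualMachineError, ThreadDeath, InterruptedException,
         // LinkageError, ControlThrowable), which still propagate.
         case NonFatal(t) =>
           failure = Some(t) // remembered; rethrown on the task thread later
       }
     }
   
     def checkForFailure(): Option[Throwable] = failure
   }
   ```
   
   The key design point is that the failure is stored rather than swallowed: the task can still fail with the original error, but the executor JVM survives.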
   
   ## How was this patch tested?
   
   It is hard to write a unit test for this scenario; the fix was manually verified with a previously failing job.
   
   Closes #24542 from jiangxb1987/pyError.
   
   Authored-by: Xingbo Jiang <xi...@databricks.com>
   Signed-off-by: HyukjinKwon <gu...@apache.org>
   (cherry picked from commit e63fbfcf206b8c98396668736fd880fb6787a4f2)
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org