Posted to commits@spark.apache.org by jo...@apache.org on 2015/02/17 02:57:22 UTC

spark git commit: [SPARK-5788] [PySpark] capture the exception in python write thread

Repository: spark
Updated Branches:
  refs/heads/master 1294a6e01 -> b1bd1dd32


[SPARK-5788] [PySpark] capture the exception in python write thread

An exception thrown from the Python writer thread (e.g. while closing the worker's output stream) escapes to the thread's uncaught exception handler, which shuts down the whole executor. Wrap `worker.shutdownOutput()` in `Utils.tryLog` so such failures are logged instead of propagated.

Author: Davies Liu <da...@databricks.com>

Closes #4577 from davies/exception and squashes the following commits:

eb0ceff [Davies Liu] Update PythonRDD.scala
139b0db [Davies Liu] capture the exception in python write thread


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b1bd1dd3
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/b1bd1dd3
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/b1bd1dd3

Branch: refs/heads/master
Commit: b1bd1dd3228ef50fa7310d466afd834b8cb1f22e
Parents: 1294a6e
Author: Davies Liu <da...@databricks.com>
Authored: Mon Feb 16 17:57:14 2015 -0800
Committer: Josh Rosen <jo...@databricks.com>
Committed: Mon Feb 16 17:57:14 2015 -0800

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/b1bd1dd3/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala b/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala
index b89effc..2527211 100644
--- a/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala
+++ b/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala
@@ -248,13 +248,13 @@ private[spark] class PythonRDD(
       } catch {
         case e: Exception if context.isCompleted || context.isInterrupted =>
           logDebug("Exception thrown after task completion (likely due to cleanup)", e)
-          worker.shutdownOutput()
+          Utils.tryLog(worker.shutdownOutput())
 
         case e: Exception =>
           // We must avoid throwing exceptions here, because the thread uncaught exception handler
           // will kill the whole executor (see org.apache.spark.executor.Executor).
           _exception = e
-          worker.shutdownOutput()
+          Utils.tryLog(worker.shutdownOutput())
       } finally {
         // Release memory used by this thread for shuffles
         env.shuffleMemoryManager.releaseMemoryForThisThread()

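To illustrate the pattern the patch relies on, here is a minimal, self-contained sketch of a `tryLog`-style helper (hypothetical stand-in for Spark's `Utils.tryLog`, which may differ in signature and logging): it runs a side-effecting cleanup action and converts any thrown exception into a logged `Failure` rather than letting it reach the thread's uncaught exception handler.

```scala
import scala.util.{Try, Success, Failure}

object TryLogSketch {
  // Hypothetical sketch: evaluate the by-name action, log on failure,
  // and never rethrow — mirroring what the patch wants for cleanup code.
  def tryLog[T](f: => T): Try[T] = {
    val result = Try(f)
    result match {
      case Failure(e) =>
        // In Spark this would go through the logging framework.
        System.err.println(s"Exception during cleanup (swallowed): ${e.getMessage}")
      case Success(_) => // nothing to log
    }
    result
  }

  def main(args: Array[String]): Unit = {
    // A cleanup call that throws no longer escapes the thread:
    val failed = tryLog { throw new RuntimeException("socket already closed") }
    assert(failed.isFailure)

    // A successful action passes its value through unchanged:
    val ok = tryLog { 42 }
    assert(ok == Success(42))
    println("ok")
  }
}
```

With the writer thread's `shutdownOutput()` wrapped this way, a second failure during cleanup cannot mask the original `_exception` or kill the executor; the caller still sees the first error via `_exception`.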

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org