Posted to issues@spark.apache.org by "Sameer Agarwal (JIRA)" <ji...@apache.org> on 2016/04/07 08:40:25 UTC

[jira] [Created] (SPARK-14454) Better exception handling while marking tasks as failed

Sameer Agarwal created SPARK-14454:
--------------------------------------

             Summary: Better exception handling while marking tasks as failed
                 Key: SPARK-14454
                 URL: https://issues.apache.org/jira/browse/SPARK-14454
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
            Reporter: Sameer Agarwal


Add support for better handling of exceptions inside catch blocks when the cleanup code within the block itself throws an exception. For instance, here is the code in a catch block in WriterContainer.scala before this change:

{code}
logError("Aborting task.", cause)
// Call failure callbacks first, so we have a chance to clean up the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
{code}

If markTaskFailed or currentWriter.close throws an exception, we currently lose the original cause. This PR fixes the problem by introducing a utility function, Utils.tryWithSafeCatch, that suppresses (via Throwable.addSuppressed) any exceptions thrown within the catch block and rethrows the original exception.
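
For illustration, a minimal sketch of what such a helper could look like (the name follows the JIRA description, but the exact signature and placement in the eventual patch may differ):

{code}
// Hypothetical sketch of Utils.tryWithSafeCatch; not necessarily the
// committed implementation. The idea: if the cleanup logic in the catch
// block throws, attach that exception to the original cause via
// Throwable.addSuppressed and rethrow the original, instead of letting
// the cleanup failure mask it.
def tryWithSafeCatch[T](block: => T)(catchBlock: Throwable => Unit): T = {
  try {
    block
  } catch {
    case cause: Throwable =>
      try {
        catchBlock(cause)
      } catch {
        case t: Throwable =>
          // Don't lose the original cause: record the secondary failure
          // as suppressed on the original exception.
          cause.addSuppressed(t)
      }
      // Always rethrow the original exception.
      throw cause
  }
}
{code}

With a helper like this, the WriterContainer catch block above could run its cleanup (markTaskFailed, currentWriter.close, abortTask) inside the safe catch, so a failure during cleanup no longer replaces the original cause and instead shows up as a suppressed exception in the stack trace.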



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org