Posted to issues@spark.apache.org by "jin xing (JIRA)" <ji...@apache.org> on 2018/05/24 11:51:00 UTC
[jira] [Updated] (SPARK-24379) BroadcastExchangeExec should catch SparkOutOfMemory and re-throw SparkFatalException, which wraps SparkOutOfMemory inside.
[ https://issues.apache.org/jira/browse/SPARK-24379?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
jin xing updated SPARK-24379:
-----------------------------
Description:
After SPARK-22827, Spark no longer fails the entire executor but only the task that hit a SparkOutOfMemoryError. The current BroadcastExchangeExec, however, try-catches the generic OutOfMemoryError. Consider the following scenario:
# A SparkOutOfMemoryError (a subclass of OutOfMemoryError) is thrown inside scala.concurrent.Future;
# the SparkOutOfMemoryError is caught, and a new generic OutOfMemoryError is wrapped in a SparkFatalException and re-thrown;
# ThreadUtils.awaitResult catches the SparkFatalException and re-throws the wrapped OutOfMemoryError;
# the OutOfMemoryError goes to SparkUncaughtExceptionHandler.uncaughtException and the executor fails.
So it makes more sense to catch SparkOutOfMemoryError and re-throw a SparkFatalException that wraps the SparkOutOfMemoryError inside.
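To make the flow concrete, below is a minimal, self-contained Scala sketch of the problem and of the proposed fix. This is not the actual Spark source: the two nested classes are stand-ins that only mimic org.apache.spark.memory.SparkOutOfMemoryError and org.apache.spark.util.SparkFatalException, and awaitResult here is a simplification of ThreadUtils.awaitResult.

{code:scala}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

object BroadcastOomSketch {

  // Stand-ins so the sketch compiles on its own; the real classes are
  // org.apache.spark.memory.SparkOutOfMemoryError and
  // org.apache.spark.util.SparkFatalException.
  class SparkOutOfMemoryError(msg: String) extends OutOfMemoryError(msg)
  class SparkFatalException(val throwable: Throwable) extends Exception(throwable)

  // Steps 1-2, current behavior: the SparkOutOfMemoryError thrown inside the
  // Future is caught as a generic OutOfMemoryError, and a *new* generic
  // OutOfMemoryError is wrapped into the SparkFatalException.
  def currentBroadcast(): Future[Unit] = Future {
    try {
      throw new SparkOutOfMemoryError("cannot build broadcast relation")
    } catch {
      case _: OutOfMemoryError =>
        throw new SparkFatalException(
          new OutOfMemoryError("not enough memory to broadcast"))
    }
  }

  // Proposed behavior: catch SparkOutOfMemoryError specifically and keep the
  // original error as the wrapped cause.
  def proposedBroadcast(): Future[Unit] = Future {
    try {
      throw new SparkOutOfMemoryError("cannot build broadcast relation")
    } catch {
      case oe: SparkOutOfMemoryError => throw new SparkFatalException(oe)
    }
  }

  // Step 3, simplified ThreadUtils.awaitResult: unwrap the SparkFatalException
  // and re-throw whatever it carries.
  def awaitResult[T](f: Future[T]): T =
    try Await.result(f, Duration.Inf)
    catch { case e: SparkFatalException => throw e.throwable }

  def main(args: Array[String]): Unit = {
    try awaitResult(currentBroadcast()) catch {
      // Step 4: a plain OutOfMemoryError surfaces; on an executor it would
      // reach SparkUncaughtExceptionHandler.uncaughtException and kill the JVM.
      case e: OutOfMemoryError =>
        println(s"current:  unwrapped ${e.getClass.getSimpleName}")
    }
    try awaitResult(proposedBroadcast()) catch {
      // The SparkOutOfMemoryError survives unwrapping, so the failure stays
      // task-level and the executor keeps running.
      case e: SparkOutOfMemoryError =>
        println(s"proposed: unwrapped ${e.getClass.getSimpleName}")
    }
  }
}
{code}

Run as-is, the first call surfaces a plain OutOfMemoryError after unwrapping, which an executor would treat as fatal, while the second surfaces the original SparkOutOfMemoryError, which after SPARK-22827 fails only the task.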
> BroadcastExchangeExec should catch SparkOutOfMemory and re-throw SparkFatalException, which wraps SparkOutOfMemory inside.
> --------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-24379
> URL: https://issues.apache.org/jira/browse/SPARK-24379
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.0
> Reporter: jin xing
> Priority: Major