Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/12/11 07:26:11 UTC
[jira] [Assigned] (SPARK-12276) Prevent RejectedExecutionException by checking if ThreadPoolExecutor is shutdown and its capacity
[ https://issues.apache.org/jira/browse/SPARK-12276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-12276:
------------------------------------
Assignee: Apache Spark
> Prevent RejectedExecutionException by checking if ThreadPoolExecutor is shutdown and its capacity
> -------------------------------------------------------------------------------------------------
>
> Key: SPARK-12276
> URL: https://issues.apache.org/jira/browse/SPARK-12276
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Reporter: Liang-Chi Hsieh
> Assignee: Apache Spark
> Priority: Minor
>
> We noticed that it is possible to hit a RejectedExecutionException when submitting a task to the thread pool in AppClient. The error looks like the following. We should add some checks to prevent it.
> java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask@2077082c rejected from java.util.concurrent.ThreadPoolExecutor@66b9915a[Running, pool size = 1, active threads = 0, queued tasks = 0, completed tasks = 1]
> at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
> at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
> at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
> at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
> at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:96)
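The guard the issue title describes can be sketched in plain Java against the `java.util.concurrent` API. This is a hypothetical illustration of the general pattern, not the actual Spark patch: check `isShutdown()` before submitting, and still catch `RejectedExecutionException`, since a shutdown can race in between the check and the `submit` call.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.TimeUnit;

public class SafeSubmit {
    // Submit a task only if the pool is still accepting work.
    // Returns true if the task was accepted, false otherwise.
    static boolean trySubmit(ExecutorService pool, Runnable task) {
        if (pool.isShutdown()) {
            return false; // pool already shut down; skip submission
        }
        try {
            pool.submit(task);
            return true;
        } catch (RejectedExecutionException e) {
            // Shutdown raced in after the check, or the queue is full
            // and the pool's AbortPolicy rejected the task.
            return false;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(1);
        System.out.println(trySubmit(pool, () -> {})); // true: pool is live
        pool.shutdown();
        System.out.println(trySubmit(pool, () -> {})); // false: pool shut down
        pool.awaitTermination(1, TimeUnit.SECONDS);
    }
}
```

Note that `isShutdown()` alone is not sufficient, because another thread can call `shutdown()` between the check and `submit`; the try/catch is what makes the guard safe.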
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org