Posted to issues@spark.apache.org by "KaiXinXIaoLei (JIRA)" <ji...@apache.org> on 2015/07/11 02:20:04 UTC

[jira] [Comment Edited] (SPARK-8974) The spark-dynamic-executor-allocation may be not supported

    [ https://issues.apache.org/jira/browse/SPARK-8974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14623115#comment-14623115 ] 

KaiXinXIaoLei edited comment on SPARK-8974 at 7/11/15 12:19 AM:
----------------------------------------------------------------

I tested this case. When the state of the ApplicationMaster is dead or disconnected and tasks are submitted, executors are requested to register. But the new ApplicationMaster has not started yet, so the spark-dynamic-executor-allocation thread throws an exception and retries three times (the default). By the time the new ApplicationMaster starts, the spark-dynamic-executor-allocation thread is already dead and is never recovered. So executor allocation is no longer supported.
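The failure mode above can be sketched as a small simulation. This is not Spark's actual ExecutorAllocationManager code; the class and parameter names are hypothetical, and the only assumption taken from the report is that the allocation thread dies permanently after a fixed number of consecutive failures (three by default) and is not restarted when a new ApplicationMaster comes up:

```java
// Hypothetical simulation of the reported bug (not Spark source code):
// a periodic allocation thread gives up for good after maxFailures
// consecutive failed requests to a dead ApplicationMaster.
public class AllocationThreadSim {
    /**
     * @param maxFailures consecutive failures tolerated before the thread
     *                    dies (the report says the default is three)
     * @param amUpAtTick  tick at which a new ApplicationMaster is reachable
     * @return true if the thread is still alive when the AM comes back
     */
    public static boolean survives(int maxFailures, int amUpAtTick) {
        int consecutiveFailures = 0;
        for (int tick = 0; tick < amUpAtTick; tick++) {
            consecutiveFailures++;                   // request fails: AM is down
            if (consecutiveFailures >= maxFailures) {
                return false;                        // thread dies, never restarted
            }
        }
        return true;                                 // AM came back in time
    }

    public static void main(String[] args) {
        System.out.println(survives(3, 2));   // AM returns within the retry budget -> true
        System.out.println(survives(3, 10));  // AM returns too late: thread already dead -> false
    }
}
```

If the AM outage outlasts the retry budget, the thread is gone before the new ApplicationMaster registers, which matches the observed behavior that dynamic allocation silently stops working.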



> The spark-dynamic-executor-allocation may be not supported
> ----------------------------------------------------------
>
>                 Key: SPARK-8974
>                 URL: https://issues.apache.org/jira/browse/SPARK-8974
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: KaiXinXIaoLei
>             Fix For: 1.5.0
>
>
> In yarn-client mode with the config option "spark.dynamicAllocation.enabled" set to true, when the state of the ApplicationMaster is dead or disconnected and tasks are submitted before the new ApplicationMaster starts, the spark-dynamic-executor-allocation thread throws an exception. After that, even when the ApplicationMaster is running again and no tasks are running, the number of executors does not drop to zero. So the dynamicAllocation feature is no longer supported.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org