Posted to issues@spark.apache.org by "KaiXinXIaoLei (JIRA)" <ji...@apache.org> on 2015/07/27 14:36:04 UTC

[jira] [Issue Comment Deleted] (SPARK-9097) Tasks are not completed but there is no executors

     [ https://issues.apache.org/jira/browse/SPARK-9097?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

KaiXinXIaoLei updated SPARK-9097:
---------------------------------
    Comment: was deleted

(was: [~srowen], I have been analyzing the code for this problem recently, and I have a question.

Executor 52 was lost because of "no recent heartbeats" at 2015-07-08 15:03:30, so the log shows "2015-07-08 15:03:30,585 | INFO  | Removed 52 successfully in removeExecutor".
But the log later shows:

2015-07-08 16:56:31,741 | ERROR | [sparkDriver-akka.actor.default-dispatcher-45] | Lost executor 52 on linux-174: remote Rpc client disassociated 
2015-07-08 16:56:31,741 | WARN  | [sparkDriver-akka.actor.default-dispatcher-6] | Association with remote system [akka.tcp://sparkExecutor@linux-174:23326] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
2015-07-08 16:56:31,741 | INFO  | [sparkDriver-akka.actor.default-dispatcher-45] | Re-queueing tasks for 52 from TaskSet 167.0 
2015-07-08 16:56:31,742 | INFO  | [dag-scheduler-event-loop] | Executor lost: 52 (epoch 29) 
2015-07-08 16:56:31,742 | INFO  | [sparkDriver-akka.actor.default-dispatcher-48] | Trying to remove executor 52 from BlockManagerMaster. 
2015-07-08 16:56:31,742 | INFO  | [dag-scheduler-event-loop] | Removed 52 successfully in removeExecutor 

So I want to know why executor 52 is removed again at 16:56:31. Was it not removed successfully at 15:03:30? Thanks.
I hope I have made this clear. Thanks.)
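The question above is essentially whether executor removal is idempotent. As a minimal illustration (not Spark's actual code; `ExecutorTracker` and its methods are hypothetical names), a scheduler that tracked which executors are still live would turn a second loss event for the same id into a no-op:

```scala
import scala.collection.mutable

// Hypothetical sketch: if removal state is tracked, a second loss event
// (e.g. an RPC disassociation arriving after a heartbeat timeout) for the
// same executor id becomes a no-op instead of a second removal.
class ExecutorTracker {
  private val live = mutable.Set.empty[String]

  def register(id: String): Unit = live += id

  // Returns true only the first time an executor is actually removed;
  // false if an earlier event (such as a heartbeat timeout) already removed it.
  def removeExecutor(id: String, reason: String): Boolean =
    live.remove(id)
}
```

Under a guard like this, the 16:56:31 events would find executor 52 already gone; the duplicate "Removed 52 successfully" line in the original log suggests the second removal path did not consult such state.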

> Tasks are not completed but there is no executors
> ---------------------------------------------------
>
>                 Key: SPARK-9097
>                 URL: https://issues.apache.org/jira/browse/SPARK-9097
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: KaiXinXIaoLei
>         Attachments: number of executor is zero.png, runing tasks.png
>
>
> I set "spark.dynamicAllocation.enabled" to true and "spark.dynamicAllocation.minExecutors" to 0, then submit tasks to run. The tasks are not completed, but the number of executors is zero.
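For context, dynamic allocation is expected to scale the executor count with the pending workload, so zero executors while tasks are still unfinished is surprising. A simplified sketch of that target calculation (illustrative only; the function name and signature are not Spark's actual internals):

```scala
// Hypothetical sketch of how a dynamic-allocation target might be chosen:
// enough executors for the pending tasks, clamped between min and max.
def targetExecutors(pendingTasks: Int,
                    tasksPerExecutor: Int,
                    minExecutors: Int,
                    maxExecutors: Int): Int = {
  val needed = math.ceil(pendingTasks.toDouble / tasksPerExecutor).toInt
  math.max(minExecutors, math.min(needed, maxExecutors))
}
```

With minExecutors = 0 and no pending tasks a target of zero is legitimate, but with tasks still pending the target should be positive, which is why the reported state looks like a bug rather than expected scale-down.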



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org