Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2015/04/03 05:28:52 UTC
[jira] [Updated] (SPARK-6640) Executor may connect to HeartbeatReceiver before it's set up on the driver side
[ https://issues.apache.org/jira/browse/SPARK-6640?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Or updated SPARK-6640:
-----------------------------
Assignee: Shixiong Zhu
> Executor may connect to HeartbeatReceiver before it's set up on the driver side
> --------------------------------------------------------------------------------
>
> Key: SPARK-6640
> URL: https://issues.apache.org/jira/browse/SPARK-6640
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.3.0
> Reporter: Shixiong Zhu
> Assignee: Shixiong Zhu
>
> Here is the current code that starts LocalBackend and creates HeartbeatReceiver:
> {code}
>   // Create and start the scheduler
>   private[spark] var (schedulerBackend, taskScheduler) =
>     SparkContext.createTaskScheduler(this, master)
>
>   private val heartbeatReceiver = env.actorSystem.actorOf(
>     Props(new HeartbeatReceiver(this, taskScheduler)), "HeartbeatReceiver")
> {code}
> Creating LocalBackend starts `LocalActor`, and `LocalActor` creates an Executor whose constructor retrieves `HeartbeatReceiver`.
> So we must make sure that this line:
> {code}
>   private val heartbeatReceiver = env.actorSystem.actorOf(
>     Props(new HeartbeatReceiver(this, taskScheduler)), "HeartbeatReceiver")
> {code}
> runs before `LocalActor` is created.
> However, the current code cannot guarantee that ordering, so creating the Executor sometimes crashes. The issue was reported by sparkdi <sh...@dubna.us> in http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-td22265.html#a22324
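The required happens-before relationship can be illustrated with a minimal, self-contained sketch. The `ActorSystem` and `Executor` below are hypothetical stand-ins, not Spark's actual classes: an executor that looks up "HeartbeatReceiver" by name at construction time fails unless the receiver was registered first.

```scala
// Hypothetical minimal registry (not Spark's real ActorSystem or
// Executor) showing why creation order matters.
object OrderingSketch {
  class ActorSystem {
    private val actors = scala.collection.mutable.Map[String, AnyRef]()

    // Register an actor under a name.
    def actorOf(name: String, actor: AnyRef): AnyRef = {
      actors(name) = actor
      actor
    }

    // Look up an actor; fails if it has not been registered yet.
    def actorSelection(name: String): AnyRef =
      actors.getOrElse(name, throw new IllegalStateException(s"Actor not found: $name"))
  }

  // Mirrors how Executor's constructor retrieves HeartbeatReceiver.
  class Executor(system: ActorSystem) {
    val heartbeatReceiver: AnyRef = system.actorSelection("HeartbeatReceiver")
  }

  def main(args: Array[String]): Unit = {
    val system = new ActorSystem

    // Wrong order: constructing the Executor before the receiver
    // exists raises the "Actor not found" error from the report.
    try { new Executor(system) }
    catch { case e: IllegalStateException => println(e.getMessage) }

    // Right order: register the receiver first, then create the Executor.
    system.actorOf("HeartbeatReceiver", new AnyRef)
    println(new Executor(system).heartbeatReceiver != null)
  }
}
```

The fix direction follows the same idea: ensure the `actorOf` call that creates `HeartbeatReceiver` executes before the scheduler backend is allowed to construct an Executor.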
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org