Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/07/31 22:16:04 UTC

[jira] [Commented] (SPARK-9439) ExternalShuffleService should be robust to NodeManager restarts in yarn

    [ https://issues.apache.org/jira/browse/SPARK-9439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14649763#comment-14649763 ] 

Apache Spark commented on SPARK-9439:
-------------------------------------

User 'squito' has created a pull request for this issue:
https://github.com/apache/spark/pull/7839

> ExternalShuffleService should be robust to NodeManager restarts in yarn
> -----------------------------------------------------------------------
>
>                 Key: SPARK-9439
>                 URL: https://issues.apache.org/jira/browse/SPARK-9439
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle
>    Affects Versions: 1.4.1
>            Reporter: Imran Rashid
>            Priority: Critical
>
> YARN applications should be robust to NodeManager restarts in general. However, if you run with the external shuffle service enabled, then after an NM restart you will observe failures like:
> {noformat}
> 2015-07-22 18:30:18,212 ERROR org.apache.spark.network.server.TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 5405054848584757735
> java.lang.RuntimeException: Executor is not registered (appId=application_1437612356649_0008, execId=73)
>         at org.apache.spark.network.shuffle.ExternalShuffleBlockManager.getBlockData(ExternalShuffleBlockManager.java:105)
> ...
> {noformat}
> This is because executor registrations are kept only in memory: when the NM restarts (restarting the ExternalShuffleService with it), nothing calls {{ExternalShuffleBlockResolver#registerExecutor}} again for executors that registered before the restart, so the service can no longer locate their shuffle files.
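> A rough sketch of one way to address this (all class and file names here are hypothetical; the actual fix may differ): write each executor registration to a recovery file on disk as it arrives, and reload that file when the shuffle service starts, so registrations survive an NM restart.
> {noformat}
> import java.io.*;
> import java.util.Properties;
> import java.util.concurrent.ConcurrentHashMap;
>
> // Hypothetical registry: registrations survive a shuffle service restart
> // because they are persisted to a recovery file under a stable directory.
> public class PersistentExecutorRegistry {
>   private final File recoveryFile;
>   private final ConcurrentHashMap<String, String> executors = new ConcurrentHashMap<>();
>
>   public PersistentExecutorRegistry(File recoveryFile) throws IOException {
>     this.recoveryFile = recoveryFile;
>     if (recoveryFile.exists()) {
>       // On restart, reload the registrations recorded before the NM went down.
>       Properties props = new Properties();
>       try (InputStream in = new FileInputStream(recoveryFile)) {
>         props.load(in);
>       }
>       for (String key : props.stringPropertyNames()) {
>         executors.put(key, props.getProperty(key));
>       }
>     }
>   }
>
>   // Record a registration and persist the full map immediately.
>   public synchronized void registerExecutor(String appId, String execId,
>       String localDirs) throws IOException {
>     executors.put(appId + ":" + execId, localDirs);
>     Properties props = new Properties();
>     executors.forEach(props::setProperty);
>     try (OutputStream out = new FileOutputStream(recoveryFile)) {
>       props.store(out, "registered executors");
>     }
>   }
>
>   // Returns null when the executor is unknown -- the condition behind the
>   // "Executor is not registered" error shown above.
>   public String getLocalDirs(String appId, String execId) {
>     return executors.get(appId + ":" + execId);
>   }
> }
> {noformat}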



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org