Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/10/07 01:59:26 UTC
[jira] [Commented] (SPARK-10964) YARN + dynamic allocation not working with new RPC backend
[ https://issues.apache.org/jira/browse/SPARK-10964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14946036#comment-14946036 ]
Apache Spark commented on SPARK-10964:
--------------------------------------
User 'vanzin' has created a pull request for this issue:
https://github.com/apache/spark/pull/9005
> YARN + dynamic allocation not working with new RPC backend
> ----------------------------------------------------------
>
> Key: SPARK-10964
> URL: https://issues.apache.org/jira/browse/SPARK-10964
> Project: Spark
> Issue Type: Bug
> Components: YARN
> Affects Versions: 1.6.0
> Reporter: Marcelo Vanzin
> Priority: Blocker
>
> With dynamic allocation enabled, the AM fails to register with the driver, and the driver logs a stream of exceptions like this:
> {noformat}
> 15/10/06 14:41:16 ERROR YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(8,0,Map()) to AM was unsuccessful
> java.lang.NullPointerException
> at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$3.apply$mcV$sp(YarnSchedulerBackend.scala:188)
> at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$3.apply(YarnSchedulerBackend.scala:186)
> at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$3.apply(YarnSchedulerBackend.scala:186)
> {noformat}
> This probably affects more than just dynamic allocation, but I haven't tested that. I'm currently testing a fix.
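For context, the NullPointerException above comes from the driver-side YarnSchedulerEndpoint trying to forward a RequestExecutors message before the AM has registered. Below is a minimal, self-contained Scala sketch of that failure mode and the obvious guard; the RequestExecutors fields mirror the log line above, while the endpoint class and the Option-wrapped AM reference are simplified stand-ins, not Spark's actual code.
{code:scala}
// Simplified stand-in for the driver-side scheduler endpoint (not Spark source).
case class RequestExecutors(requested: Int, pending: Int, hostToLocalTaskCount: Map[String, Int])

class SchedulerEndpointSketch {
  // The AM's endpoint reference only becomes available once the AM registers
  // with the driver; until then there is nothing to send to.
  @volatile private var amEndpoint: Option[String] = None // stand-in for Option[RpcEndpointRef]

  def onAmRegistered(ref: String): Unit = { amEndpoint = Some(ref) }

  def receiveAndReply(msg: RequestExecutors): Unit = amEndpoint match {
    case Some(am) =>
      // In Spark this would be an asynchronous ask on the AM's endpoint ref.
      println(s"Forwarding $msg to $am")
    case None =>
      // Without a guard like this, code that assumed the AM had already
      // registered dereferenced an unset reference and threw the NPE above.
      println(s"Sending $msg to AM was unsuccessful: AM not registered yet")
  }
}

object Demo extends App {
  val ep = new SchedulerEndpointSketch
  ep.receiveAndReply(RequestExecutors(8, 0, Map())) // AM never registered: guarded path taken
}
{code}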
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)