Posted to issues@spark.apache.org by "Shixiong Zhu (JIRA)" <ji...@apache.org> on 2015/04/02 04:20:53 UTC

[jira] [Commented] (SPARK-6653) New configuration property to specify port for sparkYarnAM actor system

    [ https://issues.apache.org/jira/browse/SPARK-6653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14391993#comment-14391993 ] 

Shixiong Zhu commented on SPARK-6653:
-------------------------------------

Could you send a pull request to https://github.com/apache/spark ?

And because this is a YARN configuration, I recommend "spark.yarn.am.port".
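
For illustration, here is a minimal sketch of what the change might look like with that name. Note that "spark.yarn.am.port" is only my suggestion at this point; the final property name would be settled in the pull request.

    // Sketch only, assuming the property is named "spark.yarn.am.port".
    // A default of 0 preserves the current behavior (bind to a random port).
    val amPort = sparkConf.getInt("spark.yarn.am.port", 0)
    actorSystem = AkkaUtils.createActorSystem("sparkYarnAM", Utils.localHostName, amPort,
      conf = sparkConf, securityManager = securityMgr)._1

A user could then pin the port at submit time (for example, spark-submit --conf spark.yarn.am.port=10000 ...) and open only that port in the firewall.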

> New configuration property to specify port for sparkYarnAM actor system
> -----------------------------------------------------------------------
>
>                 Key: SPARK-6653
>                 URL: https://issues.apache.org/jira/browse/SPARK-6653
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 1.3.0
>         Environment: Spark On Yarn
>            Reporter: Manoj Samel
>
> In the 1.3.0 code line, the sparkYarnAM actor system is started on a random port. See ApplicationMaster.scala:282 in org.apache.spark.deploy.yarn:
> actorSystem = AkkaUtils.createActorSystem("sparkYarnAM", Utils.localHostName, 0, conf = sparkConf, securityManager = securityMgr)._1
> This may be an issue when the ports between the Spark client and the YARN cluster are restricted by a firewall and not all ports are open between them.
> The proposal is to introduce a new property, spark.am.actor.port, and change the code to:
> val port = sparkConf.getInt("spark.am.actor.port", 0)
>     actorSystem = AkkaUtils.createActorSystem("sparkYarnAM", Utils.localHostName, port,
>       conf = sparkConf, securityManager = securityMgr)._1



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org