Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2014/12/30 13:18:13 UTC

[jira] [Commented] (SPARK-5006) spark.port.maxRetries doesn't work

    [ https://issues.apache.org/jira/browse/SPARK-5006?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14261052#comment-14261052 ] 

Apache Spark commented on SPARK-5006:
-------------------------------------

User 'WangTaoTheTonic' has created a pull request for this issue:
https://github.com/apache/spark/pull/3841

> spark.port.maxRetries doesn't work
> ----------------------------------
>
>                 Key: SPARK-5006
>                 URL: https://issues.apache.org/jira/browse/SPARK-5006
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>            Reporter: WangTaoTheTonic
>
> We normally configure spark.port.maxRetries in a properties file or via SparkConf, but Utils.scala reads it from SparkEnv's conf. Since SparkEnv is a singleton object whose env must be set after the JVM is launched, and Utils.scala is also an object, in most cases portMaxRetries ends up with the default value of 16.
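
The initialization-order problem described above can be illustrated with a minimal sketch. This is not Spark's actual code; Conf and Utils here are simplified stand-ins for SparkEnv's conf and Utils.scala. The point is that a val in a Scala object is evaluated once, when the object is first initialized, so if that happens before the configuration is populated, the stale default sticks:

```scala
import scala.collection.mutable

// Stand-in for a mutable configuration store (hypothetical, for illustration).
object Conf {
  val settings = mutable.Map[String, String]()
  def getInt(key: String, default: Int): Int =
    settings.get(key).map(_.toInt).getOrElse(default)
}

object Utils {
  // Eager val: evaluated the first time Utils is referenced. If the
  // config has not been populated yet, this captures the default (16)
  // permanently -- the bug pattern described in this issue.
  val portMaxRetriesEager: Int = Conf.getInt("spark.port.maxRetries", 16)

  // Reading the config on every call picks up later changes.
  def portMaxRetries: Int = Conf.getInt("spark.port.maxRetries", 16)
}
```

Touching Utils before the config is set freezes the eager val at 16 even after spark.port.maxRetries is configured, while the def correctly returns the configured value; reading the config lazily (or at call time) is the usual way to avoid this.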



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org