Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2015/01/08 20:59:35 UTC

[jira] [Updated] (SPARK-5006) spark.port.maxRetries doesn't work

     [ https://issues.apache.org/jira/browse/SPARK-5006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-5006:
-----------------------------
    Affects Version/s: 1.1.0

> spark.port.maxRetries doesn't work
> ----------------------------------
>
>                 Key: SPARK-5006
>                 URL: https://issues.apache.org/jira/browse/SPARK-5006
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.1.0
>            Reporter: WangTaoTheTonic
>
> We normally configure spark.port.maxRetries in a properties file or through SparkConf, but Utils.scala reads it from SparkEnv's conf. Because SparkEnv is an object whose env is only set after the JVM is launched, and Utils.scala is also an object, in most cases portMaxRetries ends up with the default value of 16.
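
A minimal sketch (not the actual Spark source) of the initialization-order problem described above. The names EnvHolder, UtilsSketch, and Demo are made up for illustration; EnvHolder stands in for SparkEnv, and the val in UtilsSketch stands in for portMaxRetries in Utils.scala. The point is that a val inside a Scala object is evaluated once, when the object is first referenced, so a setting read from a holder that is populated later silently falls back to the default:

  object EnvHolder {
    // Stands in for SparkEnv: populated some time after JVM startup.
    @volatile var conf: Option[Map[String, String]] = None
  }

  object UtilsSketch {
    // Evaluated once, when UtilsSketch is first touched. If EnvHolder.conf
    // has not been set yet, this permanently captures the default of 16,
    // even if the user later configures spark.port.maxRetries.
    val portMaxRetries: Int = EnvHolder.conf
      .flatMap(_.get("spark.port.maxRetries"))
      .map(_.toInt)
      .getOrElse(16)
  }

  object Demo extends App {
    // Touch UtilsSketch before the "env" is populated (e.g. early in startup).
    println(UtilsSketch.portMaxRetries)   // prints 16 -- default captured

    // User-provided configuration arrives afterwards...
    EnvHolder.conf = Some(Map("spark.port.maxRetries" -> "100"))

    // ...but the already-initialized val never sees it.
    println(UtilsSketch.portMaxRetries)   // still prints 16
  }

Reading the setting at call time (e.g. from a method that takes a SparkConf) instead of caching it in an object-level val would avoid this, which is the direction the report points at.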



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org