Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/09/18 07:00:37 UTC

[jira] [Resolved] (SPARK-3565) make code consistent with document

     [ https://issues.apache.org/jira/browse/SPARK-3565?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell resolved SPARK-3565.
------------------------------------
    Resolution: Fixed
      Assignee: WangTaoTheTonic

Fixed by: https://github.com/apache/spark/pull/2427
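
Going by the issue title, the fix makes Utils.scala read the documented key "spark.port.maxRetries". A minimal sketch of the consistent lookup (an assumption based on the title and the description quoted below, not code copied from the PR):

    import org.apache.spark.SparkEnv

    /**
     * Default number of retries in binding to a port.
     */
    val portMaxRetries: Int = {
      if (sys.props.contains("spark.testing")) {
        // Tests get a larger retry budget.
        sys.props.get("spark.port.maxRetries").map(_.toInt).getOrElse(100)
      } else {
        // Otherwise read the documented key from the active SparkConf, defaulting to 16.
        Option(SparkEnv.get)
          .flatMap(_.conf.getOption("spark.port.maxRetries"))
          .map(_.toInt)
          .getOrElse(16)
      }
    }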

> make code consistent with document
> ----------------------------------
>
>                 Key: SPARK-3565
>                 URL: https://issues.apache.org/jira/browse/SPARK-3565
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: WangTaoTheTonic
>            Assignee: WangTaoTheTonic
>            Priority: Minor
>
> The configuration item representing the "Default number of retries in binding to a port" is "spark.ports.maxRetries" in the code but "spark.port.maxRetries" in the documentation (configuration.md). We need to make them consistent.
>  
> In org.apache.spark.util.Utils.scala:
>   /**
>    * Default number of retries in binding to a port.
>    */
>   val portMaxRetries: Int = {
>     if (sys.props.contains("spark.testing")) {
>       // Set a higher number of retries for tests...
>       sys.props.get("spark.ports.maxRetries").map(_.toInt).getOrElse(100)
>     } else {
>       Option(SparkEnv.get)
>         .flatMap(_.conf.getOption("spark.ports.maxRetries"))
>         .map(_.toInt)
>         .getOrElse(16)
>     }
>   }
> In configuration.md:
> <tr>
>   <td><code>spark.port.maxRetries</code></td>
>   <td>16</td>
>   <td>
>     Maximum number of retries when binding to a port before giving up.
>   </td>
> </tr>
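
For completeness, a minimal usage sketch (not part of the JIRA) of raising the retry budget once the key is unified as "spark.port.maxRetries"; the application name is hypothetical:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setAppName("port-retry-demo")       // hypothetical app name
      .set("spark.port.maxRetries", "32")  // retry up to 32 times instead of the default 16

    // Equivalent at spark-submit time:
    //   spark-submit --conf spark.port.maxRetries=32 ...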



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
