Posted to reviews@spark.apache.org by vanzin <gi...@git.apache.org> on 2015/04/15 00:53:34 UTC

[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

GitHub user vanzin opened a pull request:

    https://github.com/apache/spark/pull/5514

     [SPARK-6046] [core] Reorganize deprecated config support in SparkConf. 

    This change tries to follow the chosen way for handling deprecated
    configs in SparkConf: all values (old and new) are kept in the conf
    object, and newer names take precedence over older ones when
    retrieving the value.
    
    Warnings are logged when config options are set, which generally happens
    on the driver node (where the logs are most visible).
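
    The precedence rule above can be sketched in a few lines of Scala (a
    simplified, hypothetical model for illustration, not the actual SparkConf
    code; the alias map uses the history-server keys discussed later in this
    review):

    ```scala
    // Simplified, hypothetical model of the lookup described above: all values
    // (old and new) are kept in the settings map, and the new name takes
    // precedence over the deprecated ones on retrieval.
    object ConfSketch {
      // New key -> deprecated keys that may still hold its value, newest first.
      val alternatives: Map[String, Seq[String]] = Map(
        "spark.history.fs.update.interval" -> Seq(
          "spark.history.fs.update.interval.seconds",
          "spark.history.fs.updateInterval",
          "spark.history.updateInterval"))

      def get(settings: Map[String, String], key: String): Option[String] =
        settings.get(key).orElse {
          // Fall back to the first deprecated alias that is actually set.
          alternatives.getOrElse(key, Nil).collectFirst {
            case alt if settings.contains(alt) => settings(alt)
          }
        }
    }
    ```

    For example, looking up `spark.history.fs.update.interval` when only the
    deprecated `spark.history.updateInterval` is set falls back to the old
    value, while setting both makes the new key win.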

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/vanzin/spark SPARK-6046

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/5514.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #5514
    
----
commit 2c93209871ae64d5a081c05ed752e121fbb736bf
Author: Marcelo Vanzin <va...@cloudera.com>
Date:   2015-04-14T20:24:51Z

    [SPARK-6046] [core] Reorganize deprecated config support in SparkConf.
    
    This change tries to follow the chosen way for handling deprecated
    configs in SparkConf: all values (old and new) are kept in the conf
    object, and newer names take precedence over older ones when
    retrieving the value.
    
    Warnings are logged when config options are set, which generally happens
    on the driver node (where the logs are most visible).

commit ab203511f2fb5e30dfb1fd07b5b387ad63de186c
Author: Marcelo Vanzin <va...@cloudera.com>
Date:   2015-04-14T22:44:48Z

    Update FsHistoryProvider to only retrieve new config key.
    
    In the process, deprecate the name that mentions the units
    now that we can define the unit in the config value itself.
    
    Also tweak a couple of messages.

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93533582
  
      [Test build #30367 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/30367/consoleFull) for   PR 5514 at commit [`9371529`](https://github.com/apache/spark/commit/9371529f5770ecf569f6d25240ec504526e73284).




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28450144
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -443,61 +461,55 @@ private[spark] object SparkConf extends Logging {
       }
     
       /**
    -   * Translate the configuration key if it is deprecated and has a replacement, otherwise just
    -   * returns the provided key.
    -   *
    -   * @param userKey Configuration key from the user / caller.
    -   * @param warn Whether to print a warning if the key is deprecated. Warnings will be printed
    -   *             only once for each key.
    +   * Looks for available deprecated keys for the given config option, and returns the first
    +   * value available.
        */
    -  private def translateConfKey(userKey: String, warn: Boolean = false): String = {
    -    deprecatedConfigs.get(userKey)
    -      .map { deprecatedKey =>
    -        if (warn) {
    -          deprecatedKey.warn()
    -        }
    -        deprecatedKey.newName.getOrElse(userKey)
    -      }.getOrElse(userKey)
    +  def getDeprecatedConfig(key: String, conf: SparkConf): Option[String] = {
    +    configsWithAlternatives.get(key).flatMap(
    +      _.collectFirst {
    +        case x if conf.contains(x.key) =>
    +          val value = conf.get(x.key)
    +          x.translation.map(_(value)).getOrElse(value)
    --- End diff --
    
    Your code looks a little cleaner, so I'll use that, but I'd like to avoid adding a body to `AlternateConfig`.




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93697367
  
    LGTM




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28399679
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -183,36 +187,36 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
         Utils.timeStringAsSeconds(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as seconds, falling back to a default if not set. If no 
    +  /**
    +   * Get a time parameter as seconds, falling back to a default if not set. If no
        * suffix is provided then seconds are assumed.
    -   * 
    +   *
        */
       def getTimeAsSeconds(key: String, defaultValue: String): Long = {
         Utils.timeStringAsSeconds(get(key, defaultValue))
       }
     
    -  /** 
    -   * Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no 
    -   * suffix is provided then milliseconds are assumed. 
    +  /**
    +   * Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no
    +   * suffix is provided then milliseconds are assumed.
        * @throws NoSuchElementException
        */
       def getTimeAsMs(key: String): Long = {
         Utils.timeStringAsMs(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as milliseconds, falling back to a default if not set. If no 
    -   * suffix is provided then milliseconds are assumed. 
    +  /**
    +   * Get a time parameter as milliseconds, falling back to a default if not set. If no
    +   * suffix is provided then milliseconds are assumed.
        */
       def getTimeAsMs(key: String, defaultValue: String): Long = {
         Utils.timeStringAsMs(get(key, defaultValue))
       }
    -  
    +
     
       /** Get a parameter as an Option */
       def getOption(key: String): Option[String] = {
    -    Option(settings.get(key))
    +    Option(settings.get(key)).orElse(getDeprecatedConfig(key, this))
    --- End diff --
    
    Open question, not necessarily demanding any change -- is it easier to just set a value under all aliases, when any of them is set? (and of course warn when anything deprecated is retrieved or set?)




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28444130
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -183,36 +187,36 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
         Utils.timeStringAsSeconds(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as seconds, falling back to a default if not set. If no 
    +  /**
    +   * Get a time parameter as seconds, falling back to a default if not set. If no
        * suffix is provided then seconds are assumed.
    -   * 
    +   *
    --- End diff --
    
    remove




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28437452
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
    @@ -49,11 +49,8 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
       private val NOT_STARTED = "<Not Started>"
     
       // Interval between each check for event log updates
    -  private val UPDATE_INTERVAL_MS = conf.getOption("spark.history.fs.update.interval.seconds")
    -    .orElse(conf.getOption("spark.history.fs.updateInterval"))
    -    .orElse(conf.getOption("spark.history.updateInterval"))
    -    .map(_.toInt)
    -    .getOrElse(10) * 1000
    +  private val UPDATE_INTERVAL_MS = Utils.timeStringAsSeconds(
    --- End diff --
    
    Ah good catch.




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93560457
  
      [Test build #30367 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/30367/consoleFull) for   PR 5514 at commit [`9371529`](https://github.com/apache/spark/commit/9371529f5770ecf569f6d25240ec504526e73284).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.
     * This patch does not change any dependencies.




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28447612
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
    @@ -49,11 +49,7 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
       private val NOT_STARTED = "<Not Started>"
     
       // Interval between each check for event log updates
    -  private val UPDATE_INTERVAL_MS = conf.getOption("spark.history.fs.update.interval.seconds")
    -    .orElse(conf.getOption("spark.history.fs.updateInterval"))
    -    .orElse(conf.getOption("spark.history.updateInterval"))
    -    .map(_.toInt)
    -    .getOrElse(10) * 1000
    +  private val UPDATE_INTERVAL_MS = conf.getTimeAsMs("spark.history.fs.update.interval", "10s")
    --- End diff --
    
    Fair point, for backwards compatibility




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93481783
  
      [Test build #30355 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/30355/consoleFull) for   PR 5514 at commit [`2445d48`](https://github.com/apache/spark/commit/2445d483811a7a93ac0dae36a4d19d1c5bb2e9a1).




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93104320
  
      [Test build #30282 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/30282/consoleFull) for   PR 5514 at commit [`ab20351`](https://github.com/apache/spark/commit/ab203511f2fb5e30dfb1fd07b5b387ad63de186c).




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28437640
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -183,36 +187,36 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
         Utils.timeStringAsSeconds(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as seconds, falling back to a default if not set. If no 
    +  /**
    +   * Get a time parameter as seconds, falling back to a default if not set. If no
        * suffix is provided then seconds are assumed.
    -   * 
    +   *
        */
       def getTimeAsSeconds(key: String, defaultValue: String): Long = {
         Utils.timeStringAsSeconds(get(key, defaultValue))
       }
     
    -  /** 
    -   * Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no 
    -   * suffix is provided then milliseconds are assumed. 
    +  /**
    +   * Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no
    +   * suffix is provided then milliseconds are assumed.
        * @throws NoSuchElementException
        */
       def getTimeAsMs(key: String): Long = {
         Utils.timeStringAsMs(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as milliseconds, falling back to a default if not set. If no 
    -   * suffix is provided then milliseconds are assumed. 
    +  /**
    +   * Get a time parameter as milliseconds, falling back to a default if not set. If no
    +   * suffix is provided then milliseconds are assumed.
        */
       def getTimeAsMs(key: String, defaultValue: String): Long = {
         Utils.timeStringAsMs(get(key, defaultValue))
       }
    -  
    +
     
       /** Get a parameter as an Option */
       def getOption(key: String): Option[String] = {
    -    Option(settings.get(key))
    +    Option(settings.get(key)).orElse(getDeprecatedConfig(key, this))
    --- End diff --
    
    I think that would be more complicated. Because then you'd also have to figure out that when the user sets, for example, `spark.history.fs.updateInterval`, you should also set `spark.history.updateInterval`, and that information is not readily available in the current data structures. Also, it would use a tiny bit more memory for no particular gain.
    
    It's doable if I reorganize things, but I don't think it's worth it.




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93530712
  
      [Test build #30366 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/30366/consoleFull) for   PR 5514 at commit [`6cf3f11`](https://github.com/apache/spark/commit/6cf3f1165529568a9eb974897df2ecbaa1bca788).




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28447131
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -443,61 +461,55 @@ private[spark] object SparkConf extends Logging {
       }
     
       /**
    -   * Translate the configuration key if it is deprecated and has a replacement, otherwise just
    -   * returns the provided key.
    -   *
    -   * @param userKey Configuration key from the user / caller.
    -   * @param warn Whether to print a warning if the key is deprecated. Warnings will be printed
    -   *             only once for each key.
    +   * Looks for available deprecated keys for the given config option, and returns the first
    +   * value available.
        */
    -  private def translateConfKey(userKey: String, warn: Boolean = false): String = {
    -    deprecatedConfigs.get(userKey)
    -      .map { deprecatedKey =>
    -        if (warn) {
    -          deprecatedKey.warn()
    -        }
    -        deprecatedKey.newName.getOrElse(userKey)
    -      }.getOrElse(userKey)
    +  def getDeprecatedConfig(key: String, conf: SparkConf): Option[String] = {
    +    configsWithAlternatives.get(key).flatMap(
    +      _.collectFirst {
    +        case x if conf.contains(x.key) =>
    +          val value = conf.get(x.key)
    +          x.translation.map(_(value)).getOrElse(value)
    +      })
    +  }
    +
    +  /**
    +   * Logs a warning message if the given config key is deprecated.
    +   */
    +  def logDeprecationWarning(key: String): Unit = {
    +    deprecatedConfigs.get(key).foreach { cfg =>
    +      logWarning(
    +        s"The configuration key '$key' has been deprecated as of Spark ${cfg.version} and " +
    +        s"may be removed in the future. ${cfg.deprecationMessage}")
    +    }
    +
    +    allAlternatives.get(key).foreach { case (newKey, cfg) =>
    +      logWarning(
    +        s"The configuration key '$key' has been deprecated as of Spark ${cfg.version} and " +
    +        s"may be removed in the future. Please use the new key '$newKey' instead.")
    +    }
       }
     
       /**
    -   * Holds information about keys that have been deprecated or renamed.
    +   * Holds information about keys that have been deprecated and do not have a replacement.
    --- End diff --
    
    minor, but do you think it's more natural for `DeprecatedConfig` to hold the key itself as one of its fields? Then we'll construct a map around a list of such configs for easier access. Similarly for `AlternativeConfig`. I would think that this class contains all the information I need to translate the values (including the new key). Not a big deal if you don't change this.




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93125691
  
      [Test build #30282 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/30282/consoleFull) for   PR 5514 at commit [`ab20351`](https://github.com/apache/spark/commit/ab203511f2fb5e30dfb1fd07b5b387ad63de186c).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.
     * This patch **adds the following new dependencies:**
       * `snappy-java-1.1.1.7.jar`
    
     * This patch **removes the following dependencies:**
       * `snappy-java-1.1.1.6.jar`





[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93563349
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/30366/
    Test PASSed.




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/5514




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28446358
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -443,61 +461,55 @@ private[spark] object SparkConf extends Logging {
       }
     
       /**
    -   * Translate the configuration key if it is deprecated and has a replacement, otherwise just
    -   * returns the provided key.
    -   *
    -   * @param userKey Configuration key from the user / caller.
    -   * @param warn Whether to print a warning if the key is deprecated. Warnings will be printed
    -   *             only once for each key.
    +   * Looks for available deprecated keys for the given config option, and returns the first
    +   * value available.
        */
    -  private def translateConfKey(userKey: String, warn: Boolean = false): String = {
    -    deprecatedConfigs.get(userKey)
    -      .map { deprecatedKey =>
    -        if (warn) {
    -          deprecatedKey.warn()
    -        }
    -        deprecatedKey.newName.getOrElse(userKey)
    -      }.getOrElse(userKey)
    +  def getDeprecatedConfig(key: String, conf: SparkConf): Option[String] = {
    +    configsWithAlternatives.get(key).flatMap(
    +      _.collectFirst {
    +        case x if conf.contains(x.key) =>
    +          val value = conf.get(x.key)
    +          x.translation.map(_(value)).getOrElse(value)
    --- End diff --
    
    how about
    ```
    configsWithAlternatives.get(key).flatMap { alternatives =>
      alternatives.collectFirst { case alt if conf.contains(alt.key) =>
        val value = conf.get(alt.key)
        alt.translate(value)
      }
    }
    ```
    and then define a `translate` method in `AlternateConfig`
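
    The `translate` method proposed above could look something like this (a
    hypothetical sketch of the reviewer's suggestion, not the final code):

    ```scala
    // Hypothetical sketch: AlternateConfig applies its optional translation
    // function in one place instead of at every call site.
    case class AlternateConfig(key: String, translation: Option[String => String] = None) {
      // Returns the value unchanged when no translation is defined.
      def translate(value: String): String = translation.map(_(value)).getOrElse(value)
    }

    // Example: a key whose old value was a bare number of seconds, translated
    // to the new unit-suffixed form.
    val secondsAlias = AlternateConfig(
      "spark.history.fs.update.interval.seconds", Some(v => v + "s"))
    ```

    With this, `secondsAlias.translate("10")` yields `"10s"`, while an alias
    without a translation passes values through unchanged.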




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93563335
  
      [Test build #30366 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/30366/consoleFull) for   PR 5514 at commit [`6cf3f11`](https://github.com/apache/spark/commit/6cf3f1165529568a9eb974897df2ecbaa1bca788).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.
     * This patch does not change any dependencies.




[GitHub] spark pull request: [SPARK-6046] [core] Reorganize deprecated conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93524543
  
      [Test build #30355 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/30355/consoleFull) for   PR 5514 at commit [`2445d48`](https://github.com/apache/spark/commit/2445d483811a7a93ac0dae36a4d19d1c5bb2e9a1).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.
     * This patch does not change any dependencies.





Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28399546
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
    @@ -49,11 +49,8 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
       private val NOT_STARTED = "<Not Started>"
     
       // Interval between each check for event log updates
    -  private val UPDATE_INTERVAL_MS = conf.getOption("spark.history.fs.update.interval.seconds")
    -    .orElse(conf.getOption("spark.history.fs.updateInterval"))
    -    .orElse(conf.getOption("spark.history.updateInterval"))
    -    .map(_.toInt)
    -    .getOrElse(10) * 1000
    +  private val UPDATE_INTERVAL_MS = Utils.timeStringAsSeconds(
    --- End diff --
    
    get as milliseconds not seconds?





Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93521865
  
    yeah, let me fix those (especially the seconds vs. ms one).





Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28389478
  
    --- Diff: docs/monitoring.md ---
    @@ -86,11 +86,12 @@ follows:
         </td>
       </tr>
       <tr>
    -    <td>spark.history.fs.update.interval.seconds</td>
    -    <td>10</td>
    +    <td>spark.history.fs.update.interval</td>
    --- End diff --
    
    Actually, not a catch. I'm adding the new config name (dropping the ".seconds" suffix) since now that we can set the value with units, the suffix looks weird.





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28446291
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -443,61 +461,55 @@ private[spark] object SparkConf extends Logging {
       }
     
       /**
    -   * Translate the configuration key if it is deprecated and has a replacement, otherwise just
    -   * returns the provided key.
    -   *
    -   * @param userKey Configuration key from the user / caller.
    -   * @param warn Whether to print a warning if the key is deprecated. Warnings will be printed
    -   *             only once for each key.
    +   * Looks for available deprecated keys for the given config option, and return the first
    +   * value available.
        */
    -  private def translateConfKey(userKey: String, warn: Boolean = false): String = {
    -    deprecatedConfigs.get(userKey)
    -      .map { deprecatedKey =>
    -        if (warn) {
    -          deprecatedKey.warn()
    -        }
    -        deprecatedKey.newName.getOrElse(userKey)
    -      }.getOrElse(userKey)
    +  def getDeprecatedConfig(key: String, conf: SparkConf): Option[String] = {
    +    configsWithAlternatives.get(key).flatMap(
    +      _.collectFirst {
    +        case x if conf.contains(x.key) =>
    +          val value = conf.get(x.key)
    +          x.translation.map(_(value)).getOrElse(value)
    +      })
    --- End diff --
    
    how about
    ```
    configsWithAlternatives.get(key).flatMap { alternatives =>
      alternatives.collectFirst { case alt if conf.contains(alt.key) =>
        alt.translate(conf.get(alt.key))
      }
    }
    ```
    and then define a `translate` method in `AlternateConfig`
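
    A minimal, self-contained sketch of that suggestion (not the PR's actual code; the `AlternateConfig(key, version, translation)` field names follow the shapes quoted in this thread):

    ```scala
    object ConfSketch {
      // An old config name, the version it was deprecated in, and an optional
      // function that rewrites the old value into the new key's format.
      final case class AlternateConfig(
          key: String,
          version: String,
          translation: Option[String => String] = None) {
        // Apply the translation if one is defined, else pass the value through.
        def translate(value: String): String =
          translation.map(_(value)).getOrElse(value)
      }

      val configsWithAlternatives: Map[String, Seq[AlternateConfig]] = Map(
        "spark.history.fs.update.interval" -> Seq(
          AlternateConfig("spark.history.fs.update.interval.seconds", "1.4",
            Some(v => v + "s"))))

      // First alternate key present in `settings` wins, with its value translated.
      def getDeprecatedConfig(key: String, settings: Map[String, String]): Option[String] =
        configsWithAlternatives.get(key).flatMap { alternatives =>
          alternatives.collectFirst { case alt if settings.contains(alt.key) =>
            alt.translate(settings(alt.key))
          }
        }
    }
    ```

    Moving the `map(...).getOrElse(...)` dance into `translate` keeps the `collectFirst` body to a single call, which is the readability win the comment is after.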





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28447253
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
    @@ -49,11 +49,7 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
       private val NOT_STARTED = "<Not Started>"
     
       // Interval between each check for event log updates
    -  private val UPDATE_INTERVAL_MS = conf.getOption("spark.history.fs.update.interval.seconds")
    -    .orElse(conf.getOption("spark.history.fs.updateInterval"))
    -    .orElse(conf.getOption("spark.history.updateInterval"))
    -    .map(_.toInt)
    -    .getOrElse(10) * 1000
    +  private val UPDATE_INTERVAL_MS = conf.getTimeAsMs("spark.history.fs.update.interval", "10s")
    --- End diff --
    
    actually, this needs to be `conf.getTimeAsSeconds(...) * 1000`, because the user can pass in a value of "5" and what they really mean is seconds.
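
    The distinction is the default unit applied to a bare number. A toy sketch of that semantics (this is not Spark's `Utils` implementation, just an illustration of why `getTimeAsMs` alone would misread legacy values):

    ```scala
    object TimeSketch {
      private val Suffixed = """(\d+)(ms|s)""".r

      // Parse a time string into milliseconds; a bare number is interpreted
      // in `defaultUnit` ("s" or "ms").
      def timeStringAsMs(str: String, defaultUnit: String): Long = str match {
        case Suffixed(n, "ms") => n.toLong
        case Suffixed(n, "s")  => n.toLong * 1000
        case bare              => timeStringAsMs(bare + defaultUnit, defaultUnit)
      }

      def timeStringAsSeconds(str: String): Long = timeStringAsMs(str, "s") / 1000
    }
    ```

    With a legacy value of "5", `timeStringAsMs("5", "ms")` yields 5 milliseconds, while `timeStringAsSeconds("5") * 1000` yields the intended 5000 milliseconds; explicit suffixes like "5s" behave the same either way.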





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28447418
  
    --- Diff: docs/monitoring.md ---
    @@ -86,11 +86,12 @@ follows:
         </td>
       </tr>
       <tr>
    -    <td>spark.history.fs.update.interval.seconds</td>
    -    <td>10</td>
    +    <td>spark.history.fs.update.interval</td>
    +    <td>10s</td>
         <td>
    -      The period, in seconds, at which information displayed by this history server is updated.
    +      The period at which information displayed by this history server is updated.
           Each update checks for any changes made to the event logs in persisted storage.
    +      The default unit is seconds.
    --- End diff --
    
    I actually think we should just leave this one out. We want to encourage the user to use the new notation.





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28389311
  
    --- Diff: docs/monitoring.md ---
    @@ -86,11 +86,12 @@ follows:
         </td>
       </tr>
       <tr>
    -    <td>spark.history.fs.update.interval.seconds</td>
    -    <td>10</td>
    +    <td>spark.history.fs.update.interval</td>
    --- End diff --
    
    oops, good catch.





Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93560487
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/30367/
    Test PASSed.





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28445180
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -400,19 +397,40 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
     
     private[spark] object SparkConf extends Logging {
     
    -  private val deprecatedConfigs: Map[String, DeprecatedConfig] = {
    -    val configs = Seq(
    -      DeprecatedConfig("spark.files.userClassPathFirst", "spark.executor.userClassPathFirst",
    -        "1.3"),
    -      DeprecatedConfig("spark.yarn.user.classpath.first", null, "1.3",
    -        "Use spark.{driver,executor}.userClassPathFirst instead."),
    -      DeprecatedConfig("spark.history.fs.updateInterval",
    -        "spark.history.fs.update.interval.seconds",
    -        "1.3", "Use spark.history.fs.update.interval.seconds instead"),
    -      DeprecatedConfig("spark.history.updateInterval",
    -        "spark.history.fs.update.interval.seconds",
    -        "1.3", "Use spark.history.fs.update.interval.seconds instead"))
    -    configs.map { x => (x.oldName, x) }.toMap
    +  /**
    +   * Maps deprecated config keys to information about the deprecation.
    +   *
    +   * The extra information is logged as a warning when the config is present in the user's
    +   * configuration.
    +   */
    +  private val deprecatedConfigs = Map[String, DeprecatedConfig](
    +    "spark.yarn.user.classpath.first" ->
    +      DeprecatedConfig("1.3", "Please use spark.{driver,executor}.userClassPathFirst instead.")
    +    )
    +
    +  /**
    +   * Maps a current config key to alternate keys that were used in previous version of Spark.
    +   *
    +   * The alternates are used in the order defined in this map. If deprecated configs are
    +   * present in the user's configuration, a warning is logged.
    +   */
    +  private val configsWithAlternatives = Map[String, Seq[AlternateConfig]](
    +    "spark.executor.userClassPathFirst" -> Seq(
    +      AlternateConfig("spark.files.userClassPathFirst", "1.3")),
    +    "spark.history.fs.update.interval" -> Seq(
    +      AlternateConfig("spark.history.fs.update.interval.seconds", "1.4"),
    +      AlternateConfig("spark.history.fs.updateInterval", "1.3"),
    +      AlternateConfig("spark.history.updateInterval", "1.3"))
    +    )
    +
    +  /**
    +   * A view of `configsWithAlternatives` that makes it more efficient to look up deprecated
    +   * config keys.
    +   */
    +  private val allAlternatives: Map[String, (String, AlternateConfig)] = {
    --- End diff --
    
    can you add a description of what the respective strings represent
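
    For illustration, a sketch of how that inverted index could be built and what each string in it would mean (hypothetical simplification of the PR's structures):

    ```scala
    object AltIndexSketch {
      final case class AlternateConfig(key: String, version: String)

      val configsWithAlternatives: Map[String, Seq[AlternateConfig]] = Map(
        "spark.executor.userClassPathFirst" -> Seq(
          AlternateConfig("spark.files.userClassPathFirst", "1.3")))

      // Map key: the deprecated (old) config name.
      // Tuple _1: the current config name that replaces it.
      // Tuple _2: the AlternateConfig entry (old name plus deprecation version).
      val allAlternatives: Map[String, (String, AlternateConfig)] =
        configsWithAlternatives.flatMap { case (newKey, alts) =>
          alts.map(alt => alt.key -> (newKey, alt))
        }
    }
    ```

    Flipping the map this way turns "was this key deprecated, and what replaced it?" into a single O(1) lookup instead of a scan over every alternatives list.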





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93521119
  
    This is great. I think this is mergeable as is aside from one functional comment, though I'd like to see the rest of the code style comments addressed as well.





Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93125698
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/30282/
    Test PASSed.





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28447169
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
    @@ -49,11 +49,7 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
       private val NOT_STARTED = "<Not Started>"
     
       // Interval between each check for event log updates
    -  private val UPDATE_INTERVAL_MS = conf.getOption("spark.history.fs.update.interval.seconds")
    -    .orElse(conf.getOption("spark.history.fs.updateInterval"))
    -    .orElse(conf.getOption("spark.history.updateInterval"))
    -    .map(_.toInt)
    -    .getOrElse(10) * 1000
    +  private val UPDATE_INTERVAL_MS = conf.getTimeAsMs("spark.history.fs.update.interval", "10s")
    --- End diff --
    
    yay!





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28446729
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -183,36 +187,36 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
         Utils.timeStringAsSeconds(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as seconds, falling back to a default if not set. If no 
    +  /**
    +   * Get a time parameter as seconds, falling back to a default if not set. If no
        * suffix is provided then seconds are assumed.
    -   * 
    +   *
        */
       def getTimeAsSeconds(key: String, defaultValue: String): Long = {
         Utils.timeStringAsSeconds(get(key, defaultValue))
       }
     
    -  /** 
    -   * Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no 
    -   * suffix is provided then milliseconds are assumed. 
    +  /**
    +   * Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no
    +   * suffix is provided then milliseconds are assumed.
        * @throws NoSuchElementException
        */
       def getTimeAsMs(key: String): Long = {
         Utils.timeStringAsMs(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as milliseconds, falling back to a default if not set. If no 
    -   * suffix is provided then milliseconds are assumed. 
    +  /**
    +   * Get a time parameter as milliseconds, falling back to a default if not set. If no
    +   * suffix is provided then milliseconds are assumed.
        */
       def getTimeAsMs(key: String, defaultValue: String): Long = {
         Utils.timeStringAsMs(get(key, defaultValue))
       }
    -  
    +
     
       /** Get a parameter as an Option */
       def getOption(key: String): Option[String] = {
    -    Option(settings.get(key))
    +    Option(settings.get(key)).orElse(getDeprecatedConfig(key, this))
    --- End diff --
    
    I think this is simpler
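
    The resulting lookup precedence can be sketched in isolation like so (a hypothetical standalone version, with `settings` as a plain map rather than SparkConf's internal state):

    ```scala
    object GetOptionSketch {
      // Alternates listed newest-first, matching the ordering in the PR.
      val alternatives: Map[String, Seq[String]] = Map(
        "spark.history.fs.update.interval" -> Seq(
          "spark.history.fs.update.interval.seconds",
          "spark.history.fs.updateInterval",
          "spark.history.updateInterval"))

      // The current key wins outright; deprecated keys are consulted in order
      // only when the current key is absent.
      def getOption(key: String, settings: Map[String, String]): Option[String] =
        settings.get(key).orElse(
          alternatives.getOrElse(key, Nil).collectFirst {
            case alt if settings.contains(alt) => settings(alt)
          })
    }
    ```

    Because the fallback lives in `getOption`, every typed getter (`getTimeAsMs`, `getInt`, and so on) inherits the deprecated-key handling for free.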





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28442714
  
    --- Diff: docs/monitoring.md ---
    @@ -86,11 +86,12 @@ follows:
         </td>
       </tr>
       <tr>
    -    <td>spark.history.fs.update.interval.seconds</td>
    -    <td>10</td>
    +    <td>spark.history.fs.update.interval</td>
    --- End diff --
    
    good catch in the sense that a previous PR #5236 missed this time config





Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/5514#issuecomment-93524561
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/30355/
    Test PASSed.





Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28444051
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -134,13 +135,16 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
     
       /** Set multiple parameters together */
       def setAll(settings: Traversable[(String, String)]): SparkConf = {
    +    settings.foreach { case (k, v) => logDeprecationWarning(k) }
    --- End diff --
    
    Not your change, but I wonder if this one should just keep calling `set`. This currently doesn't handle null keys or values
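
    A sketch of the alternative being suggested (hypothetical, standalone object rather than the real SparkConf class): routing `setAll` through `set` keeps the null checks and the deprecation warning in a single place.

    ```scala
    import scala.collection.mutable

    object SetAllSketch {
      private val settings = mutable.Map[String, String]()

      def set(key: String, value: String): this.type = {
        require(key != null, "null key")
        require(value != null, s"null value for $key")
        // a logDeprecationWarning(key) call would also live here
        settings(key) = value
        this
      }

      // Delegates to set(), so every entry gets the same validation.
      def setAll(kvs: Seq[(String, String)]): this.type = {
        kvs.foreach { case (k, v) => set(k, v) }
        this
      }

      def get(key: String): Option[String] = settings.get(key)
    }
    ```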

