Posted to reviews@spark.apache.org by knusbaum <gi...@git.apache.org> on 2014/07/01 23:09:02 UTC

[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

GitHub user knusbaum opened a pull request:

    https://github.com/apache/spark/pull/1279

    [SPARK-2165] spark on yarn: add support for setting maxAppAttempts in the ApplicationSubmissionContext

    

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/knusbaum/spark master

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/1279.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1279
    
----
commit 41e8a394cd74e42f2228eb880442cb0d6902f275
Author: Kyle Nusbaum <kn...@yahoo-inc.com>
Date:   2014-06-24T20:19:16Z

    Testing

commit c2a2b69b623a792bc3e7e1e278a2be2668573632
Author: Kyle Nusbaum <kn...@yahoo-inc.com>
Date:   2014-07-01T20:46:35Z

    Preparing for pull

commit b69955080537bebccc1f2e4bf05ee584a1e429f9
Author: Kyle Nusbaum <kn...@yahoo-inc.com>
Date:   2014-07-01T20:48:44Z

    Merge remote-tracking branch 'community/master'

commit 2532b6755ff2876516679b0c90e97fd031a111df
Author: Kyle Nusbaum <kn...@yahoo-inc.com>
Date:   2014-07-01T21:05:15Z

    Cleanup

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-52653259
  
    So one thing I just realized we need to update: in the ApplicationMaster we are using maxAppAttempts to determine if it's the last AM retry. Currently it's just grabbing the cluster maximum. We need to update that to handle this config.
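The check described above could look roughly like this. This is a minimal, self-contained sketch, not the actual ApplicationMaster code: `isLastAttempt`, `userMax`, and `clusterMax` are illustrative names standing in for the SparkConf lookup and the YARN cluster maximum.

```scala
// Hypothetical sketch: deciding whether this is the last AM attempt.
// `userMax` stands in for the proposed spark.yarn.maxappattempts setting,
// `clusterMax` for the cluster-wide YARN maximum the AM currently grabs.
def isLastAttempt(attemptId: Int, userMax: Option[Int], clusterMax: Int): Boolean = {
  // Prefer the user-supplied limit when present; otherwise fall back to
  // the cluster maximum (today's behavior, which the comment says to extend).
  val effectiveMax = userMax.getOrElse(clusterMax)
  attemptId >= effectiveMax
}
```

With a user-configured max of 2 on a cluster allowing 4, attempt 2 is the last one; without the config, the cluster maximum of 4 governs.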



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by sryza <gi...@git.apache.org>.
Github user sryza commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r16387993
  
    --- Diff: docs/running-on-yarn.md ---
    @@ -125,6 +125,14 @@ Most of the configs are the same for Spark on YARN as for other deployment modes
          the environment of the executor launcher. 
       </td>
     </tr>
    +<tr>
    +  <td><code>spark.yarn.maxappattempts</code></td>
    +  <td>YARN Default</td>
    +  <td>
    +  The maximum number of attempts that will be made to submit the application.
    +  See <a href="https://hadoop.apache.org/docs/current/api/org/apache/hadoop/yarn/api/records/ApplicationSubmissionContext.html#setMaxAppAttempts%28int%29">this YARN Doc.</a>
    --- End diff --
    
    Can we make this spark.yarn.maxAppAttempts to fit with the camel case that other Spark properties use? 




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-62331405
  
    Okay let's close this issue for now and he can reopen it if he has time.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r14821748
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -124,6 +124,14 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
         set("spark.home", home)
       }
     
    +  /**
    +   * Set the max number of submission retries the Spark client will attempt
    +   * before giving up
    +   */
    --- End diff --
    
    We haven't been adding specific routines to set the configs. The user can just set it using the existing SparkConf.set routines, so I think we should remove this.



[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r14822197
  
    --- Diff: yarn/stable/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -81,6 +81,10 @@ class Client(clientArgs: ClientArguments, hadoopConf: Configuration, spConf: Spa
         appContext.setQueue(args.amQueue)
         appContext.setAMContainerSpec(amContainer)
         appContext.setApplicationType("SPARK")
    +    sparkConf.getIntOption("spark.maxappattempts") match {
    +      case Some(v) => appContext.setMaxAppAttempts(v)
    +      case None => logDebug("Not setting max app attempts.")
    --- End diff --
    
    Can you add something like "cluster default setting will be used" to the log statement?



[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-47710371
  
    Can one of the admins verify this patch?



[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r14822032
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -167,6 +175,8 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
         getOption(key).map(_.toInt).getOrElse(defaultValue)
       }
     
    +  def getIntOption(key: String): Option[Int] = getOption(key).map(_.toInt)
    --- End diff --
    
    To keep things consistent (these APIs are public) I don't think we should add the getIntOption without adding other routines like getLongOption, etc. For now can you just use getOption and then make it an Int.
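The suggested replacement, done inline at the call site, might look like the following sketch. A plain `Map` stands in for SparkConf here so the snippet is self-contained; the property name follows the rest of the thread.

```scala
// Sketch of the reviewer's suggestion: use the existing getOption plus
// .toInt instead of adding a new public getIntOption API.
val settings = Map("spark.yarn.maxappattempts" -> "3") // stand-in for SparkConf

def getOption(key: String): Option[String] = settings.get(key)

// Equivalent of the proposed getIntOption, composed from existing pieces:
val maxAttempts: Option[Int] = getOption("spark.yarn.maxappattempts").map(_.toInt)
```

Note that `.map(_.toInt)` will throw on a non-numeric value, matching how SparkConf's other getInt-style accessors behave on bad input.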



[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r16386222
  
    --- Diff: docs/running-on-yarn.md ---
    @@ -125,6 +125,14 @@ Most of the configs are the same for Spark on YARN as for other deployment modes
          the environment of the executor launcher. 
       </td>
     </tr>
    +<tr>
    +  <td><code>spark.yarn.maxappattempts</code></td>
    +  <td>YARN Default</td>
    +  <td>
    +  The maximum number of attempts that will be made to submit the application.
    +  See <a href="https://hadoop.apache.org/docs/current/api/org/apache/hadoop/yarn/api/records/ApplicationSubmissionContext.html#setMaxAppAttempts%28int%29">this YARN Doc.</a>
    --- End diff --
    
    I'd rather not link to the yarn documentation here in case it changes locations. Perhaps add a statement about it not being allowed to be bigger than the cluster-configured max.
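The behavior the reviewer wants documented, that a per-application value cannot exceed the cluster-configured maximum, can be sketched as follows. The function name and the clamping rule for non-positive values are illustrative, not the actual ResourceManager code.

```scala
// Illustrative sketch: the effective max-attempts value is capped by the
// cluster-wide maximum (yarn.resourcemanager.am.max-attempts), so a
// per-app request larger than the cluster max is silently reduced.
def effectiveMaxAttempts(requested: Int, clusterMax: Int): Int =
  if (requested <= 0) clusterMax        // invalid request: use cluster default
  else math.min(requested, clusterMax)  // never exceed the cluster maximum
```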




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-52559454
  
    Something must be wrong with the QA box, as this patch doesn't add any classes. The test failure is also unrelated to this patch.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-52549664
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18777/consoleFull) for   PR 1279 at commit [`f848797`](https://github.com/apache/spark/commit/f84879792caba054df5c6804582dcd494998e060).
     * This patch merges cleanly.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-52630875
  
    Jenkins, test this please.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-52631256
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18843/consoleFull) for   PR 1279 at commit [`f848797`](https://github.com/apache/spark/commit/f84879792caba054df5c6804582dcd494998e060).
     * This patch merges cleanly.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r17459926
  
    --- Diff: yarn/stable/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -81,6 +81,10 @@ class Client(clientArgs: ClientArguments, hadoopConf: Configuration, spConf: Spa
         appContext.setQueue(args.amQueue)
         appContext.setAMContainerSpec(amContainer)
         appContext.setApplicationType("SPARK")
    +    sparkConf.getOption("spark.yarn.maxappattempts").map(_.toInt) match {
    --- End diff --
    
    Ah, and I see where you got the submissions from. I'm fine with any of those also. We'll see if Kyle responds; otherwise I might take this over. We should just make sure to document it well so the user understands.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-55349034
  
    This has merge conflicts, @knusbaum can you rebase on master?




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r16371320
  
    --- Diff: yarn/stable/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -81,6 +81,10 @@ class Client(clientArgs: ClientArguments, hadoopConf: Configuration, spConf: Spa
         appContext.setQueue(args.amQueue)
         appContext.setAMContainerSpec(amContainer)
         appContext.setApplicationType("SPARK")
    +    sparkConf.getOption("spark.maxappattempts").map(_.toInt) match {
    --- End diff --
    
    Let's make the config name YARN-specific: spark.yarn.maxappattempts.
    
    Also, can you update the YARN documentation to include the new config and a description? Look in docs/running-on-yarn.md.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r17459005
  
    --- Diff: yarn/stable/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -81,6 +81,10 @@ class Client(clientArgs: ClientArguments, hadoopConf: Configuration, spConf: Spa
         appContext.setQueue(args.amQueue)
         appContext.setAMContainerSpec(amContainer)
         appContext.setApplicationType("SPARK")
    +    sparkConf.getOption("spark.yarn.maxappattempts").map(_.toInt) match {
    --- End diff --
    
    It's not the number of application submission attempts, though; this is how many times the resource manager will retry the application for you. To me, application submission is done by the client, not a retry from the RM. What isn't clear about the current name, so we can come up with something clearer? We could do something like ApplicationMasterMaxAttempts, am-maxattempts, or rmappattempts.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-52555880
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18777/consoleFull) for   PR 1279 at commit [`f848797`](https://github.com/apache/spark/commit/f84879792caba054df5c6804582dcd494998e060).
     * This patch **fails** unit tests.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `case class OutputFaker(output: Seq[Attribute], child: SparkPlan) extends SparkPlan `
      * `    implicit class LogicalPlanHacks(s: SchemaRDD) `
      * `    implicit class PhysicalPlanHacks(originalPlan: SparkPlan) `
      * `class FakeParquetSerDe extends SerDe `





[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-54694631
  
    Can one of the admins verify this patch?




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/1279




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r17458674
  
    --- Diff: yarn/stable/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -81,6 +81,10 @@ class Client(clientArgs: ClientArguments, hadoopConf: Configuration, spConf: Spa
         appContext.setQueue(args.amQueue)
         appContext.setAMContainerSpec(amContainer)
         appContext.setApplicationType("SPARK")
    +    sparkConf.getOption("spark.yarn.maxappattempts").map(_.toInt) match {
    --- End diff --
    
    I would prefer this to be called `spark.yarn.applicationSubmissionMaxAttempts` to be more specific




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r17459758
  
    --- Diff: yarn/stable/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -81,6 +81,10 @@ class Client(clientArgs: ClientArguments, hadoopConf: Configuration, spConf: Spa
         appContext.setQueue(args.amQueue)
         appContext.setAMContainerSpec(amContainer)
         appContext.setApplicationType("SPARK")
    +    sparkConf.getOption("spark.yarn.maxappattempts").map(_.toInt) match {
    --- End diff --
    
    I see, our client only submits it once, but it's up to the RM how many attempts it makes to set up state for the application or launch the AM container etc. (The docs at https://hadoop.apache.org/docs/current/api/org/apache/hadoop/yarn/api/records/ApplicationSubmissionContext.html#setMaxAppAttempts(int) are pretty misleading...)
    
    Then maybe it makes sense to call this `spark.yarn.application.rmMaxAttempts` or something?




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r14822125
  
    --- Diff: yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -108,6 +108,10 @@ class Client(clientArgs: ClientArguments, hadoopConf: Configuration, spConf: Spa
         val appContext = Records.newRecord(classOf[ApplicationSubmissionContext])
         appContext.setApplicationId(appId)
         appContext.setApplicationName(args.appName)
    +    sparkConf.getIntOption("spark.maxappattempts") match {
    +      case Some(v) => appContext.setMaxAppAttempts(v)
    --- End diff --
    
    hadoop 0.23 (yarn alpha) doesn't have a setMaxAppAttempts routine.  Just remove this and only do it in the yarn stable version.



[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1279#discussion_r16385617
  
    --- Diff: yarn/stable/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -81,6 +81,10 @@ class Client(clientArgs: ClientArguments, hadoopConf: Configuration, spConf: Spa
         appContext.setQueue(args.amQueue)
         appContext.setAMContainerSpec(amContainer)
         appContext.setApplicationType("SPARK")
    +    sparkConf.getOption("spark.yarn.maxappattempts").map(_.toInt) match {
    +      case Some(v) => appContext.setMaxAppAttempts(v)
    +      case None => logDebug("Not setting spark.yarn.maxappattempts. Cluster default will be used.")
    --- End diff --
    
    we should change the debug statement to say something like spark.yarn.maxappattempts is not set, using cluster default... 
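The revised branch could read roughly as follows. This is a sketch: the `Option` parameter stands in for the `sparkConf.getOption(...).map(_.toInt)` lookup, the callback stands in for `appContext.setMaxAppAttempts`, and `println` replaces `logDebug` so the snippet is self-contained.

```scala
// Sketch of the Client.scala branch with the reworded debug message.
def applyMaxAppAttempts(configured: Option[Int], setMaxAppAttempts: Int => Unit): Unit =
  configured match {
    case Some(v) => setMaxAppAttempts(v)
    case None =>
      // Reviewer's suggested wording for the None branch:
      println("spark.yarn.maxappattempts is not set. Cluster default will be used.")
  }
```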




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-55351694
  
    Also note Kyle was an intern at Yahoo but has gone back to school, so I'm not sure if he will have time to continue this. We can wait to see if he responds.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-52549316
  
    Jenkins, test this please.




[GitHub] spark pull request: [SPARK-2165] spark on yarn: add support for se...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1279#issuecomment-52639449
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18843/consoleFull) for   PR 1279 at commit [`f848797`](https://github.com/apache/spark/commit/f84879792caba054df5c6804582dcd494998e060).
     * This patch **passes** unit tests.
     * This patch merges cleanly.
     * This patch adds no public classes.

