Posted to reviews@spark.apache.org by tgravescs <gi...@git.apache.org> on 2016/07/20 18:00:02 UTC

[GitHub] spark pull request #14287: [SPARK-16650] Improve documentation of spark.task...

GitHub user tgravescs opened a pull request:

    https://github.com/apache/spark/pull/14287

    [SPARK-16650] Improve documentation of spark.task.maxFailures

    Clarify documentation on spark.task.maxFailures
    
    No tests were run, as this is a documentation-only change.
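
    For reference, the property under discussion can be overridden at submit
    time; a minimal sketch (the application name `my_app.py` is a placeholder,
    not part of this PR):

    ```shell
    # Raise the per-task attempt limit from the default of 4 to 8.
    spark-submit --conf spark.task.maxFailures=8 my_app.py
    ```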

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/tgravescs/spark SPARK-16650

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14287.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #14287
    
----
commit 6ce0179b13dc1b3c0db56c00b43580e9fba3738e
Author: Tom Graves <tg...@yahoo-inc.com>
Date:   2016-07-20T17:57:25Z

    [SPARK-16650] Improve documentation of spark.task.maxFailures

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62615/
    Test PASSed.


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    @squito @kayousterhout 


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62622/
    Test PASSed.


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    **[Test build #62622 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62622/consoleFull)** for PR 14287 at commit [`ffef4ab`](https://github.com/apache/spark/commit/ffef4abb3a69ef791705f6811fc525a5b5ed8865).


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    Merged build finished. Test PASSed.


[GitHub] spark pull request #14287: [SPARK-16650] Improve documentation of spark.task...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/14287


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    Oops, I didn't see Kay's suggestion; that sounds better to me.


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    **[Test build #62615 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62615/consoleFull)** for PR 14287 at commit [`6ce0179`](https://github.com/apache/spark/commit/6ce0179b13dc1b3c0db56c00b43580e9fba3738e).


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    I find the description "it isn't affected by the number of failures across different tasks." a little bit confusing.


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    **[Test build #62622 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62622/consoleFull)** for PR 14287 at commit [`ffef4ab`](https://github.com/apache/spark/commit/ffef4abb3a69ef791705f6811fc525a5b5ed8865).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark pull request #14287: [SPARK-16650] Improve documentation of spark.task...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14287#discussion_r71585682
  
    --- Diff: docs/configuration.md ---
    @@ -1188,7 +1188,9 @@ Apart from these, the following properties are also available, and may be useful
       <td><code>spark.task.maxFailures</code></td>
       <td>4</td>
       <td>
    -    Number of individual task failures before giving up on the job.
    +    Number of failures of any particular task before giving up on the job. The same
    +    task has to fail this number of attempts, it isn't affected by the number of failures
    --- End diff --
    
    Sounds good. I am always open to grammar suggestions.  I will update shortly.


[GitHub] spark pull request #14287: [SPARK-16650] Improve documentation of spark.task...

Posted by kayousterhout <gi...@git.apache.org>.
Github user kayousterhout commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14287#discussion_r71582023
  
    --- Diff: docs/configuration.md ---
    @@ -1188,7 +1188,9 @@ Apart from these, the following properties are also available, and may be useful
       <td><code>spark.task.maxFailures</code></td>
       <td>4</td>
       <td>
    -    Number of individual task failures before giving up on the job.
    +    Number of failures of any particular task before giving up on the job. The same
    +    task has to fail this number of attempts, it isn't affected by the number of failures
    --- End diff --
    
    "The total number of failures spread across different tasks will not cause the job to fail; a particular task has to fail this number of attempts."? (just to make it grammatically correct)


[GitHub] spark pull request #14287: [SPARK-16650] Improve documentation of spark.task...

Posted by kayousterhout <gi...@git.apache.org>.
Github user kayousterhout commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14287#discussion_r71582143
  
    --- Diff: docs/configuration.md ---
    @@ -1188,7 +1188,9 @@ Apart from these, the following properties are also available, and may be useful
       <td><code>spark.task.maxFailures</code></td>
       <td>4</td>
       <td>
    -    Number of individual task failures before giving up on the job.
    +    Number of failures of any particular task before giving up on the job. The same
    +    task has to fail this number of attempts, it isn't affected by the number of failures
    --- End diff --
    
    (or your version is fine too if you don't think the grammar matters; either one is clear enough to the user I think)


[GitHub] spark pull request #14287: [SPARK-16650] Improve documentation of spark.task...

Posted by kayousterhout <gi...@git.apache.org>.
Github user kayousterhout commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14287#discussion_r71576631
  
    --- Diff: docs/configuration.md ---
    @@ -1188,7 +1188,9 @@ Apart from these, the following properties are also available, and may be useful
       <td><code>spark.task.maxFailures</code></td>
       <td>4</td>
       <td>
    -    Number of individual task failures before giving up on the job.
    +    Number of failures of any particular task before giving up on the job. The same
    +    task has to fail this number of attempts, it isn't affected by the number of failures
    --- End diff --
    
    for 2nd sentence maybe "Failures spread across different tasks will only cause the job to fail if one particular task has failed this number of times."? 


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    **[Test build #62615 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62615/consoleFull)** for PR 14287 at commit [`6ce0179`](https://github.com/apache/spark/commit/6ce0179b13dc1b3c0db56c00b43580e9fba3738e).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark pull request #14287: [SPARK-16650] Improve documentation of spark.task...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14287#discussion_r71581337
  
    --- Diff: docs/configuration.md ---
    @@ -1188,7 +1188,9 @@ Apart from these, the following properties are also available, and may be useful
       <td><code>spark.task.maxFailures</code></td>
       <td>4</td>
       <td>
    -    Number of individual task failures before giving up on the job.
    +    Number of failures of any particular task before giving up on the job. The same
    +    task has to fail this number of attempts, it isn't affected by the number of failures
    --- End diff --
    
    I can see the 2nd sentence not being clear, but the second part of your suggestion about "if one particular task has failed" sounds a bit out of place to me. How about:
    
    "The total number of failures spread across different tasks will not cause the job to fail, it has to be a particular task failing this number of attempts."
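
    To make the semantics under discussion concrete, here is a small
    standalone Python sketch (illustrative only, not Spark code; the
    function name and signature are made up for this example): the job
    fails only when one particular task accumulates the maximum number of
    failed attempts, and failures spread across different tasks do not
    add up toward that limit.

    ```python
    # Toy model of the spark.task.maxFailures rule (not Spark source code).
    from collections import Counter

    def job_fails(failed_attempts, max_failures=4):
        """failed_attempts: list of task ids, one entry per failed attempt.

        Returns True only if some single task has failed max_failures times.
        """
        counts = Counter(failed_attempts)
        return any(n >= max_failures for n in counts.values())

    # Eight failures spread over eight different tasks: the job survives.
    print(job_fails([0, 1, 2, 3, 4, 5, 6, 7]))  # False
    # Four failed attempts of task 2: the job is aborted.
    print(job_fails([2, 2, 1, 2, 2]))           # True
    ```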


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by tgravescs <gi...@git.apache.org>.
Github user tgravescs commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    @rxin Any other suggestions, or should I just remove that?


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by squito <gi...@git.apache.org>.
Github user squito commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    lgtm


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    Merged to master/2.0


[GitHub] spark issue #14287: [SPARK-16650] Improve documentation of spark.task.maxFai...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14287
  
    Merged build finished. Test PASSed.

