Posted to reviews@spark.apache.org by srowen <gi...@git.apache.org> on 2014/09/29 12:32:48 UTC

[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

GitHub user srowen opened a pull request:

    https://github.com/apache/spark/pull/2575

    SPARK-2626 [DOCS] Stop SparkContext in all examples

    Call SparkContext.stop() in all examples (and touch up minor nearby code style issues while at it)
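    (For context, a minimal sketch of the pattern the patch applies, with hypothetical class and variable names and no claim to match any one example file: build the context, run the job, print the result, then call stop() explicitly rather than relying on JVM exit.)

        import java.util.ArrayList;
        import java.util.List;

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaRDD;
        import org.apache.spark.api.java.JavaSparkContext;

        // Hypothetical minimal example; illustrates the pattern only.
        public final class StopContextExample {
          public static void main(String[] args) {
            SparkConf sparkConf = new SparkConf().setAppName("StopContextExample");
            JavaSparkContext jsc = new JavaSparkContext(sparkConf);

            List<Integer> data = new ArrayList<Integer>();
            for (int i = 1; i <= 100; i++) {
              data.add(i);
            }
            JavaRDD<Integer> rdd = jsc.parallelize(data);
            System.out.println("Count: " + rdd.count());

            jsc.stop();  // the change this PR makes at the end of each example
          }
        }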

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/srowen/spark SPARK-2626

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2575.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2575
    
----
commit 5b2baaede1896725292808c0b25042374a64ef79
Author: Sean Owen <so...@cloudera.com>
Date:   2014-09-29T09:12:59Z

    Call SparkContext.stop() in all examples (and touch up minor nearby code style issues while at it)

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by mattf <gi...@git.apache.org>.
Github user mattf commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2575#discussion_r18148735
  
    --- Diff: examples/src/main/java/org/apache/spark/examples/sql/JavaSparkSQL.java ---
    @@ -61,7 +61,8 @@ public static void main(String[] args) throws Exception {
         // Load a text file and convert each line to a Java Bean.
         JavaRDD<Person> people = ctx.textFile("examples/src/main/resources/people.txt").map(
           new Function<String, Person>() {
    -        public Person call(String line) throws Exception {
    +        @Override
    --- End diff --
    
    Are these additions of "@Override" intentional?


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/2575#issuecomment-57396627
  
    Thanks for doing this. LGTM.


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by mattf <gi...@git.apache.org>.
Github user mattf commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2575#discussion_r18148709
  
    --- Diff: examples/src/main/java/org/apache/spark/examples/JavaSparkPi.java ---
    @@ -31,7 +31,6 @@
      * Usage: JavaSparkPi [slices]
      */
     public final class JavaSparkPi {
    -  
    --- End diff --
    
    It's just whitespace.


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2575#issuecomment-57149547
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/20971/


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by mattf <gi...@git.apache.org>.
Github user mattf commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2575#discussion_r18152078
  
    --- Diff: examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala ---
    @@ -44,11 +44,11 @@ object GroupByTest {
             arr1(i) = (ranGen.nextInt(Int.MaxValue), byteArr)
           }
           arr1
    -    }.cache
    +    }.cache()
    --- End diff --
    
    Thanks, I learned something new, and it's early on a Monday. For others interested: http://docs.scala-lang.org/style/method-invocation.html


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by mattf <gi...@git.apache.org>.
Github user mattf commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2575#discussion_r18148749
  
    --- Diff: examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala ---
    @@ -44,11 +44,11 @@ object GroupByTest {
             arr1(i) = (ranGen.nextInt(Int.MaxValue), byteArr)
           }
           arr1
    -    }.cache
    +    }.cache()
    --- End diff --
    
    It's Scala; was it intentional to add the parens?


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by mattf <gi...@git.apache.org>.
Github user mattf commented on the pull request:

    https://github.com/apache/spark/pull/2575#issuecomment-57157929
  
    +1, lgtm


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2575#issuecomment-57143424
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/20971/consoleFull) for PR 2575 at commit [`5b2baae`](https://github.com/apache/spark/commit/5b2baaede1896725292808c0b25042374a64ef79).
     * This patch merges cleanly.


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by mattf <gi...@git.apache.org>.
Github user mattf commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2575#discussion_r18152125
  
    --- Diff: examples/src/main/java/org/apache/spark/examples/JavaSparkPi.java ---
    @@ -61,5 +60,7 @@ public Integer call(Integer integer, Integer integer2) {
         });
     
         System.out.println("Pi is roughly " + 4.0 * count / n);
    +
    +    jsc.stop();
    --- End diff --
    
    Thanks for the sad reminder that Java 6 is still an anchor for Spark.


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/2575#issuecomment-57512487
  
    LGTM, too.  I've merged this into master.  Thanks for fixing this!


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/2575


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2575#discussion_r18148901
  
    --- Diff: examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala ---
    @@ -44,11 +44,11 @@ object GroupByTest {
             arr1(i) = (ranGen.nextInt(Int.MaxValue), byteArr)
           }
           arr1
    -    }.cache
    +    }.cache()
    --- End diff --
    
    Yes, all intentional. Parens should be used in Scala when methods have side effects, and @Override should be used in Java where an override is intended. These are just small matters of style, but I think they're worth standardizing to match the rest of the source and common practice while changing nearby code.
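    (The Scala half of that convention is visible in the diff above. For the Java half, a rough sketch of what the @Override annotation buys on these anonymous Function implementations — names hypothetical, not taken from the patch:)

        import org.apache.spark.api.java.JavaRDD;
        import org.apache.spark.api.java.function.Function;

        // Hypothetical fragment; jsc is assumed to be an existing JavaSparkContext.
        // @Override documents the intent and makes the compiler confirm that
        // call really implements Function.call, so the code stays in sync if
        // the interface ever changes.
        JavaRDD<Integer> lengths = jsc.textFile("examples/src/main/resources/people.txt").map(
          new Function<String, Integer>() {
            @Override
            public Integer call(String line) {
              return line.length();
            }
          });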


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2575#issuecomment-57149539
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/20971/consoleFull) for PR 2575 at commit [`5b2baae`](https://github.com/apache/spark/commit/5b2baaede1896725292808c0b25042374a64ef79).
     * This patch **passes** unit tests.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `class RandomForestModel(val trees: Array[DecisionTreeModel], val algo: Algo) extends Serializable `



[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by mattf <gi...@git.apache.org>.
Github user mattf commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2575#discussion_r18148726
  
    --- Diff: examples/src/main/java/org/apache/spark/examples/JavaSparkPi.java ---
    @@ -61,5 +60,7 @@ public Integer call(Integer integer, Integer integer2) {
         });
     
         System.out.println("Pi is roughly " + 4.0 * count / n);
    +
    +    jsc.stop();
    --- End diff --
    
    How about using the Closeable feature?


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/2575#issuecomment-57513128
  
    I also cherry-picked this into branch-1.1.


[GitHub] spark pull request: SPARK-2626 [DOCS] Stop SparkContext in all exa...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2575#discussion_r18148941
  
    --- Diff: examples/src/main/java/org/apache/spark/examples/JavaSparkPi.java ---
    @@ -61,5 +60,7 @@ public Integer call(Integer integer, Integer integer2) {
         });
     
         System.out.println("Pi is roughly " + 4.0 * count / n);
    +
    +    jsc.stop();
    --- End diff --
    
    You mean not just calling close(), but using try-with-resources? That would require Java 7.
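    (For the curious, a rough sketch of what that variant would look like if the examples could assume Java 7 — hypothetical code, not part of this patch, and it assumes JavaSparkContext implements java.io.Closeable with close() delegating to stop(), as the Closeable suggestion above implies:)

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaSparkContext;

        // Hypothetical Java 7 variant, not used by this PR.
        public final class JavaSparkPiAutoClose {
          public static void main(String[] args) {
            SparkConf sparkConf = new SparkConf().setAppName("JavaSparkPiAutoClose");
            // try-with-resources closes the context even if the body throws,
            // assuming JavaSparkContext implements java.io.Closeable.
            try (JavaSparkContext jsc = new JavaSparkContext(sparkConf)) {
              System.out.println("Pi is roughly ...");  // computation elided
            }
          }
        }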

