Posted to reviews@spark.apache.org by phalodi <gi...@git.apache.org> on 2016/07/30 14:00:30 UTC

[GitHub] spark pull request #14421: [Spark-16916] Add api to get JavaSparkContext fro...

GitHub user phalodi opened a pull request:

    https://github.com/apache/spark/pull/14421

    [Spark-16916] Add api to get JavaSparkContext from SparkSession

    ## What changes were proposed in this pull request?
    Adds an API to get a JavaSparkContext from a SparkSession.
    
    
    ## How was this patch tested?
    Ran the existing Spark test cases.
    
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/phalodi/spark SPARK-16816

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14421.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #14421
    
----
commit b9134baa3ab8b4154e09f24ea04600395e44a928
Author: sandy <ph...@gmail.com>
Date:   2016-07-30T13:57:02Z

    [Spark-16916] Add api to get JavaSparkContext from SparkSession

----


[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by petermaxlee <gi...@git.apache.org>.
Github user petermaxlee commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    Isn't this just
    
    ```
    new JavaSparkContext(session.sparkContext)
    ```
    ?
    
    Perhaps we should just update the documentation to say that.



[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14421#discussion_r72888602
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala ---
    @@ -122,6 +122,14 @@ class SparkSession private(
       val sqlContext: SQLContext = new SQLContext(this)
     
       /**
    +   * This is the interface through which the user can get [[JavaSparkContext]]
    +   *
    +   * @since 2.0.0
    +   */
    +  @transient
    +  val javaSparkContext: JavaSparkContext = new JavaSparkContext(sparkContext)
    --- End diff --
    
    But when a user writes code in Java and wants a JavaSparkContext, they have to get the sparkContext first and then create the Java context with the JavaSparkContext method fromSparkContext(sc), so it would be better to provide a JavaSparkContext directly from the SparkSession.
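
For reference, a minimal Java sketch of the workflow described above, showing both the constructor and the static fromSparkContext factory on JavaSparkContext; the app name and local master below are assumptions for the example, not part of the proposal:

```java
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class JavaContextFromSession {
  public static void main(String[] args) {
    // Assumption: a local session purely for illustration.
    SparkSession spark = SparkSession.builder()
        .appName("JavaContextFromSession")
        .master("local[*]")
        .getOrCreate();

    // Either wrap the session's SparkContext in the constructor...
    JavaSparkContext viaConstructor = new JavaSparkContext(spark.sparkContext());

    // ...or use the static factory method on JavaSparkContext.
    JavaSparkContext viaFactory = JavaSparkContext.fromSparkContext(spark.sparkContext());

    spark.stop();
  }
}
```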


[GitHub] spark issue #14421: [Spark-16916] Add api to get JavaSparkContext from Spark...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    Can one of the admins verify this patch?


[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    @srowen @rxin @petermaxlee I have made some changes as I thought best; please suggest something else if you have a better idea, or just merge it if it looks good.


[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    @srowen @rxin @petermaxlee I will close this pull request, create a new one with the documentation changes, and also update the JIRA issue.



[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    @petermaxlee @rxin OK, so I will go ahead and make the documentation changes. Where do you suggest is the right place to add this?


[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    I meant update the sparkContext field in SparkSession.



[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    @rxin OK, I will look into it soon and make the changes there.



[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14421#discussion_r72888685
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala ---
    @@ -122,6 +122,14 @@ class SparkSession private(
       val sqlContext: SQLContext = new SQLContext(this)
     
       /**
    +   * This is the interface through which the user can get [[JavaSparkContext]]
    +   *
    +   * @since 2.0.0
    +   */
    +  @transient
    +  val javaSparkContext: JavaSparkContext = new JavaSparkContext(sparkContext)
    --- End diff --
    
    @srowen Yeah, you are right, but it is a pain for Java developers: when they get the sparkContext from the SparkSession and call the parallelize method on it, it fails because that method takes a Scala Seq argument. So I think having this API directly makes things easier for Java developers and users.
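
To make the pain point concrete, here is a minimal Java sketch (the app name, local master, and sample data are assumptions for the example): calling parallelize on the raw SparkContext from Java would require passing a scala.collection.Seq plus a ClassTag, whereas wrapping it in a JavaSparkContext exposes the List-based overload.

```java
import java.util.Arrays;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class ParallelizeFromSession {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("ParallelizeFromSession")
        .master("local[*]")          // assumption: local mode for the example
        .getOrCreate();

    // spark.sparkContext().parallelize(...) is awkward from Java because the
    // Scala method expects a scala.collection.Seq and an implicit ClassTag.
    // Wrapping the SparkContext once gives the Java-friendly API instead.
    JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());
    JavaRDD<Integer> rdd = jsc.parallelize(Arrays.asList(1, 2, 3, 4));

    System.out.println(rdd.count());  // prints 4
    spark.stop();
  }
}
```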


[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    Yeah, it seems like we should just update the documentation for this one.



[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    Can we update the doc for sparkContext?



[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi closed the pull request at:

    https://github.com/apache/spark/pull/14421


[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14421#discussion_r72888442
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala ---
    @@ -122,6 +122,14 @@ class SparkSession private(
       val sqlContext: SQLContext = new SQLContext(this)
     
       /**
    +   * This is the interface through which the user can get [[JavaSparkContext]]
    +   *
    +   * @since 2.0.0
    +   */
    +  @transient
    +  val javaSparkContext: JavaSparkContext = new JavaSparkContext(sparkContext)
    --- End diff --
    
    This always makes a JavaSparkContext. Why not just have the caller do this?


[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    @rxin But users read the docs from the start, and when they read about sparkContext they do not yet know about SparkSession. So I think we should just add a single line below the example where we extract the sparkContext from the SparkSession, showing that from it you can also create a JavaSparkContext, e.g. new JavaSparkContext(session.sparkContext) or JavaSparkContext.fromSparkContext(spark.sparkContext). What do you think, does that look better?


[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14421#discussion_r72888618
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala ---
    @@ -122,6 +122,14 @@ class SparkSession private(
       val sqlContext: SQLContext = new SQLContext(this)
     
       /**
    +   * This is the interface through which the user can get [[JavaSparkContext]]
    +   *
    +   * @since 2.0.0
    +   */
    +  @transient
    +  val javaSparkContext: JavaSparkContext = new JavaSparkContext(sparkContext)
    --- End diff --
    
    Yeah but this exposes a Java-specific API and always creates the object for all callers. It's minor either way, but that seems even less nice.


[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

Posted by phalodi <gi...@git.apache.org>.
Github user phalodi commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14421#discussion_r72888636
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala ---
    @@ -122,6 +122,14 @@ class SparkSession private(
       val sqlContext: SQLContext = new SQLContext(this)
     
       /**
    +   * This is the interface through which the user can get [[JavaSparkContext]]
    +   *
    +   * @since 2.0.0
    +   */
    +  @transient
    +  val javaSparkContext: JavaSparkContext = new JavaSparkContext(sparkContext)
    --- End diff --
    
    Or we could expose this as spark.javaSparkContext, which would be nicer for users, since the context is the primary entry point.



[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    @phalodi these changes don't seem to be what Reynold suggested. You have your old change plus some example change (?), but you didn't update the documentation.


[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the issue:

    https://github.com/apache/spark/pull/14421
  
    (hence the lower case)


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org