Posted to reviews@spark.apache.org by cloud-fan <gi...@git.apache.org> on 2016/05/04 07:11:43 UTC

[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

GitHub user cloud-fan opened a pull request:

    https://github.com/apache/spark/pull/12890

    [SQL] revert 2 REPL changes in SPARK-15073

    ## What changes were proposed in this pull request?
    
    see https://github.com/apache/spark/pull/12873#discussion_r61993910
    
    ## How was this patch tested?
    
    Verified it locally.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/cloud-fan/spark repl

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/12890.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #12890
    
----
commit 49d653ba672fe42c133fe0ea294033c7ed819ceb
Author: Wenchen Fan <we...@databricks.com>
Date:   2016-05-04T06:59:33Z

    revert 2 REPL changes in SPARK-15073

----




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216763223
  
    cc @andrewor14 @rxin 




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12890#discussion_r62084188
  
    --- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala ---
    @@ -71,35 +71,32 @@ object Main extends Logging {
         }
       }
     
    -  def createSparkContext(): SparkContext = {
    +  def createSparkSession(): SparkSession = {
         val execUri = System.getenv("SPARK_EXECUTOR_URI")
         conf.setIfMissing("spark.app.name", "Spark shell")
    -      // SparkContext will detect this configuration and register it with the RpcEnv's
    -      // file server, setting spark.repl.class.uri to the actual URI for executors to
    -      // use. This is sort of ugly but since executors are started as part of SparkContext
    -      // initialization in certain cases, there's an initialization order issue that prevents
    -      // this from being set after SparkContext is instantiated.
    -      .set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
    +    // SparkContext will detect this configuration and register it with the RpcEnv's
    +    // file server, setting spark.repl.class.uri to the actual URI for executors to
    +    // use. This is sort of ugly but since executors are started as part of SparkContext
    +    // initialization in certain cases, there's an initialization order issue that prevents
    +    // this from being set after SparkContext is instantiated.
    +    conf.set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
         if (execUri != null) {
           conf.set("spark.executor.uri", execUri)
         }
         if (System.getenv("SPARK_HOME") != null) {
           conf.setSparkHome(System.getenv("SPARK_HOME"))
         }
    -    sparkContext = new SparkContext(conf)
    -    logInfo("Created spark context..")
    -    Signaling.cancelOnInterrupt(sparkContext)
    -    sparkContext
    -  }
     
    -  def createSparkSession(): SparkSession = {
    +    val builder = SparkSession.builder.config(conf)
         if (SparkSession.hiveClassesArePresent) {
    -      sparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
    +      sparkSession = builder.enableHiveSupport().getOrCreate()
           logInfo("Created Spark session with Hive support")
         } else {
    -      sparkSession = SparkSession.builder.getOrCreate()
    +      sparkSession = builder.getOrCreate()
    --- End diff --
    
    Hm, I think it's better to keep that flag contained rather than duplicating it everywhere.




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216825153
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/57742/
    Test PASSed.




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216825000
  
    **[Test build #57742 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/57742/consoleFull)** for PR 12890 at commit [`2681e79`](https://github.com/apache/spark/commit/2681e7957e8117829bec73a057e9594ca88505a4).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216763752
  
    Yeah, but look at the logic of `Builder.getOrCreate`:
    ```
    def getOrCreate(): SparkSession = synchronized {
      // Step 1. Create a SparkConf
      // Step 2. Get a SparkContext
      // Step 3. Get a SparkSession
      val sparkConf = new SparkConf()
      options.foreach { case (k, v) => sparkConf.set(k, v) }
      val sparkContext = SparkContext.getOrCreate(sparkConf)

      SQLContext.getOrCreate(sparkContext).sparkSession
    }
    ```
    
    In the REPL, we create the `SparkContext` first and then the `SparkSession`, so `getOrCreate` won't build the `SparkSession` from a fresh context: `SparkContext.getOrCreate(sparkConf)` returns the existing context, and the options set on the builder never reach its conf.
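
    For illustration, here is a minimal, self-contained sketch of that code path (hypothetical master/app name, assuming Hive classes are on the classpath and the `getOrCreate` implementation quoted above):
    ```
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SparkSession

    // The REPL creates the context first; its conf says nothing about
    // spark.sql.catalogImplementation.
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("repl"))

    // enableHiveSupport() only records an option on the builder. Inside
    // getOrCreate(), SparkContext.getOrCreate(sparkConf) returns the REPL's
    // existing context, so the option never reaches the context's conf and the
    // resulting session is built without Hive support.
    val spark = SparkSession.builder.enableHiveSupport().getOrCreate()
    ```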




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-217012064
  
    Merging into master and 2.0.




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216763651
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/57732/
    Test FAILed.




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216930532
  
    Is the problem that `val sparkContext = SparkContext.getOrCreate(sparkConf)` will give us a `sparkContext` that was already created by the REPL, and whose conf does not set `spark.sql.catalogImplementation` to `hive`?




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216763650
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216763255
  
    Aren't these called only once?





[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216763191
  
    **[Test build #57732 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/57732/consoleFull)** for PR 12890 at commit [`49d653b`](https://github.com/apache/spark/commit/49d653ba672fe42c133fe0ea294033c7ed819ceb).




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216806829
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216949848
  
    By the way, I think the comment https://github.com/apache/spark/pull/12873#discussion_r61993910 captures the point but is a little misleading. The issue here is that if there's already a `SparkContext`, we just throw away all the conf set through the builder, including `spark.sql.catalogImplementation`!
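
    A tiny repro sketch of what "throw away" means here (hypothetical config key, assuming the `Builder.getOrCreate` implementation quoted earlier in the thread):
    ```
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SparkSession

    // A context already exists, as in the REPL.
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("demo"))

    // The builder copies this option onto a fresh SparkConf, but
    // SparkContext.getOrCreate returns the pre-existing context, so the
    // option is silently dropped.
    val spark = SparkSession.builder.config("spark.myapp.flag", "true").getOrCreate()

    println(spark.sparkContext.getConf.contains("spark.myapp.flag"))  // false
    ```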




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12890#discussion_r62119745
  
    --- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala ---
    +    val builder = SparkSession.builder.config(conf)
         if (SparkSession.hiveClassesArePresent) {
    -      sparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
    +      sparkSession = builder.enableHiveSupport().getOrCreate()
    --- End diff --
    
    Oh, right. We still have this method.




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12890#discussion_r62076215
  
    --- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala ---
    +    val builder = SparkSession.builder.config(conf)
         if (SparkSession.hiveClassesArePresent) {
    -      sparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
    +      sparkSession = builder.enableHiveSupport().getOrCreate()
           logInfo("Created Spark session with Hive support")
         } else {
    -      sparkSession = SparkSession.builder.getOrCreate()
    +      sparkSession = builder.getOrCreate()
    --- End diff --
    
    Here, maybe it is better to explicitly set `spark.sql.catalogImplementation` to `in-memory`?
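
    As a standalone sketch of that suggestion (hypothetical master/app name; not code from this PR):
    ```
    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // Make the in-memory catalog explicit instead of relying on the implicit
    // default, mirroring the explicit flag in the Hive branch.
    val spark = SparkSession.builder
      .config(new SparkConf().setMaster("local").setAppName("repl-sketch"))
      .config("spark.sql.catalogImplementation", "in-memory")
      .getOrCreate()
    ```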




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12890#discussion_r62076169
  
    --- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala ---
    +    val builder = SparkSession.builder.config(conf)
         if (SparkSession.hiveClassesArePresent) {
    -      sparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
    +      sparkSession = builder.enableHiveSupport().getOrCreate()
    --- End diff --
    
    I guess we want to use `builder.config("spark.sql.catalogImplementation", "hive")`?




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216792610
  
    **[Test build #57742 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/57742/consoleFull)** for PR 12890 at commit [`2681e79`](https://github.com/apache/spark/commit/2681e7957e8117829bec73a057e9594ca88505a4).




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216806314
  
    **[Test build #57733 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/57733/consoleFull)** for PR 12890 at commit [`49d653b`](https://github.com/apache/spark/commit/49d653ba672fe42c133fe0ea294033c7ed819ceb).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/12890




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216806834
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/57733/
    Test PASSed.




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216825152
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216764423
  
    retest this please




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216768295
  
    It seems like there is an underlying problem here. Can we fix that?





[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216950823
  
    It seems like we might need a little bit more of design here to figure out what the right behavior should be. Let's talk more offline.





[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216948092
  
    LGTM




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216764697
  
    **[Test build #57733 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/57733/consoleFull)** for PR 12890 at commit [`49d653b`](https://github.com/apache/spark/commit/49d653ba672fe42c133fe0ea294033c7ed819ceb).




[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216763636
  
    **[Test build #57732 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/57732/consoleFull)** for PR 12890 at commit [`49d653b`](https://github.com/apache/spark/commit/49d653ba672fe42c133fe0ea294033c7ed819ceb).
     * This patch **fails build dependency tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-15116] In REPL we should create SparkSe...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12890#discussion_r62082428
  
    --- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala ---
    +    val builder = SparkSession.builder.config(conf)
         if (SparkSession.hiveClassesArePresent) {
    -      sparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
    +      sparkSession = builder.enableHiveSupport().getOrCreate()
    --- End diff --
    
    That's what `enableHiveSupport` does?
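
    As a rough sketch (paraphrased, not the verbatim Spark source, and assuming Hive classes are on the classpath), the two spellings are equivalent:
    ```
    import org.apache.spark.sql.SparkSession

    // enableHiveSupport is essentially shorthand for recording the catalog
    // flag among the builder's options.
    val viaHelper = SparkSession.builder.enableHiveSupport()
    val viaConfig = SparkSession.builder.config("spark.sql.catalogImplementation", "hive")
    ```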





[GitHub] spark pull request: [SQL] revert 2 REPL changes in SPARK-15073

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the pull request:

    https://github.com/apache/spark/pull/12890#issuecomment-216765106
  
    Sorry, I still don't understand what's going on from your comment alone. One thing to note is that we are going to remove `withHiveSupport`, so adding it back isn't a "fix" per se.



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org