Posted to reviews@spark.apache.org by vanzin <gi...@git.apache.org> on 2018/05/18 22:52:33 UTC

[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

GitHub user vanzin opened a pull request:

    https://github.com/apache/spark/pull/21368

    [SPARK-16451][repl] Fail shell if SparkSession fails to start.

    Currently, in spark-shell, if the session fails to start, the
    user sees a bunch of unrelated errors which are caused by code
    in the shell initialization that references the "spark" variable,
    which does not exist in that case. Things like:
    
    ```
    <console>:14: error: not found: value spark
           import spark.sql
    ```
    
    The user is also left with a non-working shell (unless they want
    to just write non-Spark Scala or Python code, that is).
    
    This change fails the whole shell session at the point where the
    failure occurs, so that the last error message is the one with
    the actual information about the failure.
    
    Tested with spark-shell and pyspark (with Python 2.7 and 3.5) by forcing
    an error during SparkContext initialization.
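The fail-fast behavior described above can be sketched in plain Python. This is an illustration of the idea only, not the actual pyspark `shell.py` code; `create_session` is a made-up stand-in for the real session construction:

```python
import sys

def create_session():
    # Stand-in for SparkSession creation; raise to simulate an init failure.
    raise RuntimeError("master URL unreachable")

def init_shell():
    # Mirrors the PR's approach: fail the whole shell at the point of
    # failure, so the last message the user sees describes the real error
    # instead of later "name 'spark' is not defined" noise from startup
    # code that assumes 'spark' exists.
    try:
        return create_session()
    except Exception as e:
        print("Failed to initialize Spark session:", e, file=sys.stderr)
        sys.exit(1)
```

With this shape, a broken cluster config terminates the shell immediately with one clear message, rather than leaving a half-initialized REPL behind.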

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/vanzin/spark SPARK-16451

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21368.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21368
    
----
commit b748d346e8c46e31e52f5bee8fade63b2155ac83
Author: Marcelo Vanzin <va...@...>
Date:   2018-05-18T22:35:59Z

    [SPARK-16451][repl] Fail shell if SparkSession fails to start.
    
    Currently, in spark-shell, if the session fails to start, the
    user sees a bunch of unrelated errors which are caused by code
    in the shell initialization that references the "spark" variable,
    which does not exist in that case. Things like:
    
    ```
    <console>:14: error: not found: value spark
           import spark.sql
    ```
    
    The user is also left with a non-working shell (unless they want
    to just write non-Spark Scala or Python code, that is).
    
    This change fails the whole shell session at the point where the
    failure occurs, so that the last error message is the one with
    the actual information about the failure.
    
    Tested with spark-shell and pyspark (with Python 2.7 and 3.5) by forcing
    an error during SparkContext initialization.

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189709217
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    --- End diff --
    
    yeah, looks like GitHub is confused. It's definitely reverted in https://github.com/apache/spark/pull/21368/commits/a062e4c4c349e0035d50a6de9abc1e7eb04c1568


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189674768
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    +        }
    --- End diff --
    
    Thank you, @vanzin .


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/21368


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #90916 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90916/testReport)** for PR 21368 at commit [`7f87bff`](https://github.com/apache/spark/commit/7f87bff7e3167e44413cbb4eb29822b9ef40c73e).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90916/
    Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #90916 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90916/testReport)** for PR 21368 at commit [`7f87bff`](https://github.com/apache/spark/commit/7f87bff7e3167e44413cbb4eb29822b9ef40c73e).


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/3427/
    Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/3799/
    Test PASSed.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189760439
  
    --- Diff: repl/src/main/scala/org/apache/spark/repl/Main.scala ---
    @@ -79,44 +81,50 @@ object Main extends Logging {
       }
     
       def createSparkSession(): SparkSession = {
    -    val execUri = System.getenv("SPARK_EXECUTOR_URI")
    -    conf.setIfMissing("spark.app.name", "Spark shell")
    -    // SparkContext will detect this configuration and register it with the RpcEnv's
    -    // file server, setting spark.repl.class.uri to the actual URI for executors to
    -    // use. This is sort of ugly but since executors are started as part of SparkContext
    -    // initialization in certain cases, there's an initialization order issue that prevents
    -    // this from being set after SparkContext is instantiated.
    -    conf.set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
    -    if (execUri != null) {
    -      conf.set("spark.executor.uri", execUri)
    -    }
    -    if (System.getenv("SPARK_HOME") != null) {
    -      conf.setSparkHome(System.getenv("SPARK_HOME"))
    -    }
    +    try {
    +      val execUri = System.getenv("SPARK_EXECUTOR_URI")
    +      conf.setIfMissing("spark.app.name", "Spark shell")
    +      // SparkContext will detect this configuration and register it with the RpcEnv's
    +      // file server, setting spark.repl.class.uri to the actual URI for executors to
    +      // use. This is sort of ugly but since executors are started as part of SparkContext
    +      // initialization in certain cases, there's an initialization order issue that prevents
    +      // this from being set after SparkContext is instantiated.
    +      conf.set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
    +      if (execUri != null) {
    +        conf.set("spark.executor.uri", execUri)
    +      }
    +      if (System.getenv("SPARK_HOME") != null) {
    +        conf.setSparkHome(System.getenv("SPARK_HOME"))
    +      }
     
    -    val builder = SparkSession.builder.config(conf)
    -    if (conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) == "hive") {
    -      if (SparkSession.hiveClassesArePresent) {
    -        // In the case that the property is not set at all, builder's config
    -        // does not have this value set to 'hive' yet. The original default
    -        // behavior is that when there are hive classes, we use hive catalog.
    -        sparkSession = builder.enableHiveSupport().getOrCreate()
    -        logInfo("Created Spark session with Hive support")
    +      val builder = SparkSession.builder.config(conf)
    +      if (conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) == "hive") {
    +        if (SparkSession.hiveClassesArePresent) {
    +          // In the case that the property is not set at all, builder's config
    +          // does not have this value set to 'hive' yet. The original default
    +          // behavior is that when there are hive classes, we use hive catalog.
    +          sparkSession = builder.enableHiveSupport().getOrCreate()
    +          logInfo("Created Spark session with Hive support")
    +        } else {
    +          // Need to change it back to 'in-memory' if no hive classes are found
    +          // in the case that the property is set to hive in spark-defaults.conf
    +          builder.config(CATALOG_IMPLEMENTATION.key, "in-memory")
    +          sparkSession = builder.getOrCreate()
    +          logInfo("Created Spark session")
    +        }
           } else {
    -        // Need to change it back to 'in-memory' if no hive classes are found
    -        // in the case that the property is set to hive in spark-defaults.conf
    -        builder.config(CATALOG_IMPLEMENTATION.key, "in-memory")
    +        // In the case that the property is set but not to 'hive', the internal
    +        // default is 'in-memory'. So the sparkSession will use in-memory catalog.
             sparkSession = builder.getOrCreate()
             logInfo("Created Spark session")
           }
    -    } else {
    -      // In the case that the property is set but not to 'hive', the internal
    -      // default is 'in-memory'. So the sparkSession will use in-memory catalog.
    -      sparkSession = builder.getOrCreate()
    -      logInfo("Created Spark session")
    +      sparkContext = sparkSession.sparkContext
    +      sparkSession
    +    } catch {
    +      case e: Exception if isShellSession =>
    +        logError("Failed to initialize Spark session.", e)
    --- End diff --
    
    @vanzin, it seems `e.printStackTrace()` is missing?


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189681355
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    +        }
    --- End diff --
    
    SparkR already does something close to this:
    
    ```
      # 0 is success and +1 is reserved for heartbeats. Other negative values indicate errors.
      if (returnStatus < 0) {
        stop(readString(conn))
      }
    ```
    
    Except that "stop()" doesn't actually exit the shell. That's probably by design, so unless @felixcheung  says it's ok to exit the shell (and tells me how), I'll leave things as is.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90928/
    Test PASSed.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189781240
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    +        }
    --- End diff --
    
    there's a way, but overall we do not exit/terminate the R session, since
    SparkR could be running in an interactive session (e.g. RStudio)
    
    one possible approach is to exit only when running the sparkR shell, by
    checking here:
    https://github.com/apache/spark/blob/master/R/pkg/inst/profile/shell.R#L27
    
    I'm not sure if stop() vs. exit makes much of a difference though.



---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189421121
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    +        }
    --- End diff --
    
    +1, LGTM for `spark-shell` and `pyspark`. For consistency, we are going to change `sparkR` and `spark-sql` later in this way, aren't we? Could you mention `spark-shell` and `pyspark` in the PR title specifically?


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189762943
  
    --- Diff: repl/src/main/scala/org/apache/spark/repl/Main.scala ---
    @@ -79,44 +81,50 @@ object Main extends Logging {
       }
     
       def createSparkSession(): SparkSession = {
    -    val execUri = System.getenv("SPARK_EXECUTOR_URI")
    -    conf.setIfMissing("spark.app.name", "Spark shell")
    -    // SparkContext will detect this configuration and register it with the RpcEnv's
    -    // file server, setting spark.repl.class.uri to the actual URI for executors to
    -    // use. This is sort of ugly but since executors are started as part of SparkContext
    -    // initialization in certain cases, there's an initialization order issue that prevents
    -    // this from being set after SparkContext is instantiated.
    -    conf.set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
    -    if (execUri != null) {
    -      conf.set("spark.executor.uri", execUri)
    -    }
    -    if (System.getenv("SPARK_HOME") != null) {
    -      conf.setSparkHome(System.getenv("SPARK_HOME"))
    -    }
    +    try {
    +      val execUri = System.getenv("SPARK_EXECUTOR_URI")
    +      conf.setIfMissing("spark.app.name", "Spark shell")
    +      // SparkContext will detect this configuration and register it with the RpcEnv's
    +      // file server, setting spark.repl.class.uri to the actual URI for executors to
    +      // use. This is sort of ugly but since executors are started as part of SparkContext
    +      // initialization in certain cases, there's an initialization order issue that prevents
    +      // this from being set after SparkContext is instantiated.
    +      conf.set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
    +      if (execUri != null) {
    +        conf.set("spark.executor.uri", execUri)
    +      }
    +      if (System.getenv("SPARK_HOME") != null) {
    +        conf.setSparkHome(System.getenv("SPARK_HOME"))
    +      }
     
    -    val builder = SparkSession.builder.config(conf)
    -    if (conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) == "hive") {
    -      if (SparkSession.hiveClassesArePresent) {
    -        // In the case that the property is not set at all, builder's config
    -        // does not have this value set to 'hive' yet. The original default
    -        // behavior is that when there are hive classes, we use hive catalog.
    -        sparkSession = builder.enableHiveSupport().getOrCreate()
    -        logInfo("Created Spark session with Hive support")
    +      val builder = SparkSession.builder.config(conf)
    +      if (conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) == "hive") {
    +        if (SparkSession.hiveClassesArePresent) {
    +          // In the case that the property is not set at all, builder's config
    +          // does not have this value set to 'hive' yet. The original default
    +          // behavior is that when there are hive classes, we use hive catalog.
    +          sparkSession = builder.enableHiveSupport().getOrCreate()
    +          logInfo("Created Spark session with Hive support")
    +        } else {
    +          // Need to change it back to 'in-memory' if no hive classes are found
    +          // in the case that the property is set to hive in spark-defaults.conf
    +          builder.config(CATALOG_IMPLEMENTATION.key, "in-memory")
    +          sparkSession = builder.getOrCreate()
    +          logInfo("Created Spark session")
    +        }
           } else {
    -        // Need to change it back to 'in-memory' if no hive classes are found
    -        // in the case that the property is set to hive in spark-defaults.conf
    -        builder.config(CATALOG_IMPLEMENTATION.key, "in-memory")
    +        // In the case that the property is set but not to 'hive', the internal
    +        // default is 'in-memory'. So the sparkSession will use in-memory catalog.
             sparkSession = builder.getOrCreate()
             logInfo("Created Spark session")
           }
    -    } else {
    -      // In the case that the property is set but not to 'hive', the internal
    -      // default is 'in-memory'. So the sparkSession will use in-memory catalog.
    -      sparkSession = builder.getOrCreate()
    -      logInfo("Created Spark session")
    +      sparkContext = sparkSession.sparkContext
    +      sparkSession
    +    } catch {
    +      case e: Exception if isShellSession =>
    +        logError("Failed to initialize Spark session.", e)
    --- End diff --
    
    The exception is being printed as part of the `logError`.
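In Python terms, the same effect comes from logging with the exception attached; a minimal sketch using only the standard `logging` module (nothing Spark-specific, and the logger name is made up):

```python
import io
import logging

# Attaching the exception via exc_info makes the logger emit the full
# traceback together with the message, so a separate
# printStackTrace()-style call would only duplicate output.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
logger = logging.getLogger("repl-demo")
logger.addHandler(handler)
logger.setLevel(logging.ERROR)

try:
    raise RuntimeError("session init failed")
except RuntimeError:
    logger.error("Failed to initialize Spark session.", exc_info=True)

output = stream.getvalue()
```

The captured output contains both the log message and the traceback, which is the analogue of what `logError(msg, e)` does on the Scala side.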


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #91455 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/91455/testReport)** for PR 21368 at commit [`6d53ca0`](https://github.com/apache/spark/commit/6d53ca024a5f88d7d3dcd41257c3de72aadd40b6).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #90928 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90928/testReport)** for PR 21368 at commit [`6d53ca0`](https://github.com/apache/spark/commit/6d53ca024a5f88d7d3dcd41257c3de72aadd40b6).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189649450
  
    --- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -44,7 +44,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    --- End diff --
    
    My usual response is "this is not a public class" (it's not in the public API docs), but let me see if it's easy to restrict the `sys.exit` to spark-shell invocations.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189649541
  
    --- Diff: python/pyspark/shell.py ---
    @@ -38,25 +41,29 @@
     SparkContext._ensure_initialized()
     
     try:
    -    # Try to access HiveConf, it will raise exception if Hive is not added
    -    conf = SparkConf()
    -    if conf.get('spark.sql.catalogImplementation', 'hive').lower() == 'hive':
    -        SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf()
    -        spark = SparkSession.builder\
    -            .enableHiveSupport()\
    -            .getOrCreate()
    -    else:
    +    try:
    +        # Try to access HiveConf, it will raise exception if Hive is not added
    +        conf = SparkConf()
    +        if conf.get('spark.sql.catalogImplementation', 'hive').lower() == 'hive':
    +            SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf()
    +            spark = SparkSession.builder\
    +                .enableHiveSupport()\
    +                .getOrCreate()
    +        else:
    +            spark = SparkSession.builder.getOrCreate()
    +    except py4j.protocol.Py4JError:
    +        if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    +            warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    +                          "please make sure you build spark with hive")
    +        spark = SparkSession.builder.getOrCreate()
    +    except TypeError:
    +        if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    +            warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    +                          "please make sure you build spark with hive")
             spark = SparkSession.builder.getOrCreate()
    -except py4j.protocol.Py4JError:
    -    if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    -        warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    -                      "please make sure you build spark with hive")
    -    spark = SparkSession.builder.getOrCreate()
    -except TypeError:
    -    if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    -        warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    -                      "please make sure you build spark with hive")
    -    spark = SparkSession.builder.getOrCreate()
    +except Exception as e:
    +    print("Failed to initialize Spark session:", e, file=sys.stderr)
    --- End diff --
    
    Printing the exception shows its traceback.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test FAILed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/91455/
    Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    retest this please


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #90928 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90928/testReport)** for PR 21368 at commit [`6d53ca0`](https://github.com/apache/spark/commit/6d53ca024a5f88d7d3dcd41257c3de72aadd40b6).


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189964634
  
    --- Diff: python/pyspark/sql/session.py ---
    @@ -547,6 +547,40 @@ def _create_from_pandas_with_arrow(self, pdf, schema, timezone):
             df._schema = schema
             return df
     
    +    @staticmethod
    +    def _create_shell_session():
    +        """
    +        Initialize a SparkSession for a pyspark shell session. This is called from shell.py
    +        to make error handling simpler without needing to declare local variables in that
    +        script, which would expose those to users.
    +        """
    +        import py4j
    +        from pyspark.conf import SparkConf
    +        from pyspark.context import SparkContext
    +        try:
    +            # Try to access HiveConf, it will raise exception if Hive is not added
    +            conf = SparkConf()
    +            if conf.get('spark.sql.catalogImplementation', 'hive').lower() == 'hive':
    +                SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf()
    +                return SparkSession.builder\
    +                    .enableHiveSupport()\
    +                    .getOrCreate()
    +            else:
    +                return SparkSession.builder.getOrCreate()
    +        except py4j.protocol.Py4JError:
    +            if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    +                warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    +                              "please make sure you build spark with hive")
    +
    +        try:
    +            return SparkSession.builder.getOrCreate()
    --- End diff --
    
    This is intentional, to avoid the Python exception being unreadable (see the commit description).
    
    The actual flow logic is the same.
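    The fallback flow discussed here can be sketched in a simplified, standalone form. This is not the actual pyspark code: the builder calls are replaced with plain callables, and `create_shell_session`, `build_hive_session`, and `build_plain_session` are illustrative names; only the prefer-Hive-then-fall-back shape is the same.
    
    ```python
    import warnings
    
    def create_shell_session(build_hive_session, build_plain_session,
                             catalog="hive"):
        # Prefer a Hive-enabled session when the catalog implementation
        # is "hive"; if Hive support is unavailable (simulated here by
        # the builder raising), warn and fall back to a plain session.
        if catalog.lower() == "hive":
            try:
                return build_hive_session()
            except Exception:
                warnings.warn("Fall back to non-hive support because failing "
                              "to access HiveConf, please make sure you build "
                              "spark with hive")
        return build_plain_session()
    ```
    
    Note that a failure in the final `build_plain_session()` call is deliberately not caught here; it propagates to the caller, which matches the point above that the shell should surface that error rather than hide it.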


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #91455 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/91455/testReport)** for PR 21368 at commit [`6d53ca0`](https://github.com/apache/spark/commit/6d53ca024a5f88d7d3dcd41257c3de72aadd40b6).


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189781578
  
    --- Diff: python/pyspark/sql/session.py ---
    @@ -547,6 +547,40 @@ def _create_from_pandas_with_arrow(self, pdf, schema, timezone):
             df._schema = schema
             return df
     
    +    @staticmethod
    +    def _create_shell_session():
    +        """
    +        Initialize a SparkSession for a pyspark shell session. This is called from shell.py
    +        to make error handling simpler without needing to declare local variables in that
    +        script, which would expose those to users.
    +        """
    +        import py4j
    +        from pyspark.conf import SparkConf
    +        from pyspark.context import SparkContext
    +        try:
    +            # Try to access HiveConf, it will raise exception if Hive is not added
    +            conf = SparkConf()
    +            if conf.get('spark.sql.catalogImplementation', 'hive').lower() == 'hive':
    +                SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf()
    +                return SparkSession.builder\
    +                    .enableHiveSupport()\
    +                    .getOrCreate()
    +            else:
    +                return SparkSession.builder.getOrCreate()
    +        except py4j.protocol.Py4JError:
    +            if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    +                warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    +                              "please make sure you build spark with hive")
    +
    +        try:
    +            return SparkSession.builder.getOrCreate()
    --- End diff --
    
    The call flow seems to have changed here? I think this line is meant to be inside the `Py4JError` handler?


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189681808
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    +        }
    --- End diff --
    
    Also, spark-sql for me exits right away if the SparkContext does not come up, showing the underlying exception, so it looks good already.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90816/
    Test PASSed.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189754122
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    --- End diff --
    
    how about just squashing the commits if it's not hard?


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189480728
  
    --- Diff: python/pyspark/shell.py ---
    @@ -38,25 +41,29 @@
     SparkContext._ensure_initialized()
     
     try:
    -    # Try to access HiveConf, it will raise exception if Hive is not added
    -    conf = SparkConf()
    -    if conf.get('spark.sql.catalogImplementation', 'hive').lower() == 'hive':
    -        SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf()
    -        spark = SparkSession.builder\
    -            .enableHiveSupport()\
    -            .getOrCreate()
    -    else:
    +    try:
    +        # Try to access HiveConf, it will raise exception if Hive is not added
    +        conf = SparkConf()
    +        if conf.get('spark.sql.catalogImplementation', 'hive').lower() == 'hive':
    +            SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf()
    +            spark = SparkSession.builder\
    +                .enableHiveSupport()\
    +                .getOrCreate()
    +        else:
    +            spark = SparkSession.builder.getOrCreate()
    +    except py4j.protocol.Py4JError:
    +        if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    +            warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    +                          "please make sure you build spark with hive")
    +        spark = SparkSession.builder.getOrCreate()
    +    except TypeError:
    +        if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    +            warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    +                          "please make sure you build spark with hive")
             spark = SparkSession.builder.getOrCreate()
    -except py4j.protocol.Py4JError:
    -    if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    -        warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    -                      "please make sure you build spark with hive")
    -    spark = SparkSession.builder.getOrCreate()
    -except TypeError:
    -    if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
    -        warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
    -                      "please make sure you build spark with hive")
    -    spark = SparkSession.builder.getOrCreate()
    +except Exception as e:
    +    print("Failed to initialize Spark session:", e, file=sys.stderr)
    --- End diff --
    
    For consistency, it sounds better to print out the traceback here too, matching the Scala side?
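    A minimal sketch of what printing the full traceback on the Python side could look like. `init_or_die` and its `factory` argument are illustrative names, not the actual shell.py code; the shape mirrors the Scala `e.printStackTrace()` plus `sys.exit(1)` in the diff above.
    
    ```python
    import sys
    import traceback
    
    def init_or_die(factory):
        # Try to build the session; on failure, print the full Python
        # traceback to stderr (analogous to e.printStackTrace() on the
        # Scala side) and fail the shell with a non-zero exit code.
        try:
            return factory()
        except Exception:
            print("Failed to initialize Spark session:", file=sys.stderr)
            traceback.print_exc()
            sys.exit(1)
    ```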


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189427017
  
    --- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -44,7 +44,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    --- End diff --
    
    My concern is that SparkILoop is used in a bunch of settings outside of Spark and its shell/REPL; sys.exit might not be ideal in some of those cases.
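    One way to address this concern, sketched with illustrative names rather than the actual SparkILoop code: let the caller decide how a failure is handled, defaulting to re-raising the exception instead of exiting the process.
    
    ```python
    def create_session(factory, on_failure=None):
        # Embedders of the REPL can pass their own on_failure handler;
        # without one, the exception propagates to the caller rather
        # than a hard sys.exit(1) killing the embedding process.
        try:
            return factory()
        except Exception as e:
            if on_failure is not None:
                return on_failure(e)
            raise
    ```
    
    The standalone shell would then install a handler that prints the error and exits, while other embedders keep their own error handling.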


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189649746
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    +        }
    --- End diff --
    
    I'm not that familiar with those two shells but I'll give it a try.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189754953
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    --- End diff --
    
    Ah I know. I didn't update the scala 2.12 code. D'oh.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #90912 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90912/testReport)** for PR 21368 at commit [`53c29db`](https://github.com/apache/spark/commit/53c29dbbf494be203256ac7d25a916bd1b3c84b0).


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #90816 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90816/testReport)** for PR 21368 at commit [`b748d34`](https://github.com/apache/spark/commit/b748d346e8c46e31e52f5bee8fade63b2155ac83).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189754757
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    --- End diff --
    
    Let me try that...


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #90816 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90816/testReport)** for PR 21368 at commit [`b748d34`](https://github.com/apache/spark/commit/b748d346e8c46e31e52f5bee8fade63b2155ac83).


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189708824
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    --- End diff --
    
    Hm, I thought I had reverted this. Let me look again.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged to master


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/3434/
    Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/3435/
    Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/3351/
    Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/3424/
    Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90912/
    Test FAILed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    **[Test build #90912 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90912/testReport)** for PR 21368 at commit [`53c29db`](https://github.com/apache/spark/commit/53c29dbbf494be203256ac7d25a916bd1b3c84b0).
     * This patch **fails Python style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    also, ping.


---



[GitHub] spark issue #21368: [SPARK-16451][repl] Fail shell if SparkSession fails to ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21368
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #21368: [SPARK-16451][repl] Fail shell if SparkSession fa...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189755036
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    --- End diff --
    
    Ohh haha sure. I just noticed it too.


---
