Posted to commits@kyuubi.apache.org by ul...@apache.org on 2021/08/26 08:53:33 UTC

[incubator-kyuubi] branch master updated: [KYUUBI #991] [MINOR] improve the error message of SPARK package not found.

This is an automated email from the ASF dual-hosted git repository.

ulyssesyou pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-kyuubi.git


The following commit(s) were added to refs/heads/master by this push:
     new 9695f75  [KYUUBI #991] [MINOR] improve the error message of SPARK package not found.
9695f75 is described below

commit 9695f7588913d18d8e1fc40e490b66a84343f3f7
Author: Fu Chen <cf...@gmail.com>
AuthorDate: Thu Aug 26 16:53:24 2021 +0800

    [KYUUBI #991] [MINOR] improve the error message of SPARK package not found.
    
    
    ### _Why are the changes needed?_
    
    Because `File.listFiles` may return `null` when no Spark package is found under `${PROJECT_ROOT_DIR}/kyuubi-server/externals/kyuubi-download/target`, a `NullPointerException` is thrown. This change improves the error message.
    
    Before this PR:
    
    ```
    Caused by: java.lang.RuntimeException: org.apache.kyuubi.KyuubiSQLException:Error opening session SessionHandle [9749221f-5b30-457b-b8e5-25affba25061] for fchen due to null
    	at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:68)
    	at org.apache.kyuubi.session.KyuubiSessionManager.openSession(KyuubiSessionManager.scala:74)
    	at org.apache.kyuubi.service.AbstractBackendService.openSession(AbstractBackendService.scala:45)
    	at org.apache.kyuubi.service.ThriftFrontendService.getSessionHandle(ThriftFrontendService.scala:190)
    	at org.apache.kyuubi.service.ThriftFrontendService.OpenSession(ThriftFrontendService.scala:199)
    	at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377)
    	at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362)
    	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    	at org.apache.kyuubi.service.authentication.TSetIpAddressProcessor.process(TSetIpAddressProcessor.scala:36)
    	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at java.lang.Thread.run(Thread.java:748)
    Caused by: java.lang.NullPointerException: null
    	at scala.collection.mutable.ArrayOps$ofRef$.length$extension(ArrayOps.scala:204)
    	at scala.collection.mutable.ArrayOps$ofRef.length(ArrayOps.scala:204)
    	at scala.collection.IndexedSeqOptimized.isEmpty(IndexedSeqOptimized.scala:30)
    	at scala.collection.IndexedSeqOptimized.isEmpty$(IndexedSeqOptimized.scala:30)
    	at scala.collection.mutable.ArrayOps$ofRef.isEmpty(ArrayOps.scala:198)
    	at scala.collection.TraversableLike.headOption(TraversableLike.scala:608)
    	at scala.collection.TraversableLike.headOption$(TraversableLike.scala:608)
    	at scala.collection.mutable.ArrayOps$ofRef.headOption(ArrayOps.scala:198)
    	at org.apache.kyuubi.engine.spark.SparkProcessBuilder.$anonfun$executable$1(SparkProcessBuilder.scala:50)
    	at scala.Option.orElse(Option.scala:447)
    	at org.apache.kyuubi.engine.spark.SparkProcessBuilder.<init>(SparkProcessBuilder.scala:41)
    	at org.apache.kyuubi.engine.EngineRef.$anonfun$create$1(EngineRef.scala:140)
    	at org.apache.kyuubi.engine.EngineRef.tryWithLock(EngineRef.scala:116)
    	at org.apache.kyuubi.engine.EngineRef.create(EngineRef.scala:128)
    	at org.apache.kyuubi.engine.EngineRef.$anonfun$getOrCreate$1(EngineRef.scala:182)
    	at scala.Option.getOrElse(Option.scala:189)
    	at org.apache.kyuubi.engine.EngineRef.getOrCreate(EngineRef.scala:182)
    	at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$open$2(KyuubiSessionImpl.scala:63)
    	at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$open$2$adapted(KyuubiSessionImpl.scala:62)
    	at org.apache.kyuubi.ha.client.ZooKeeperClientProvider$.withZkClient(ZooKeeperClientProvider.scala:74)
    	at org.apache.kyuubi.session.KyuubiSessionImpl.open(KyuubiSessionImpl.scala:62)
    	at org.apache.kyuubi.session.KyuubiSessionManager.openSession(KyuubiSessionManager.scala:58)
    	... 12 more
    ```
    After this PR:
    
    ```
    Caused by: java.lang.RuntimeException: org.apache.kyuubi.KyuubiSQLException:Error opening session SessionHandle [5ea6a8b3-1727-4e0d-b542-992f84afcde8] for fchen due to SPARK_HOME is not set!
    	at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:68)
    	at org.apache.kyuubi.session.KyuubiSessionManager.openSession(KyuubiSessionManager.scala:74)
    	at org.apache.kyuubi.service.AbstractBackendService.openSession(AbstractBackendService.scala:45)
    	at org.apache.kyuubi.service.ThriftFrontendService.getSessionHandle(ThriftFrontendService.scala:190)
    	at org.apache.kyuubi.service.ThriftFrontendService.OpenSession(ThriftFrontendService.scala:199)
    	at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377)
    	at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362)
    	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    	at org.apache.kyuubi.service.authentication.TSetIpAddressProcessor.process(TSetIpAddressProcessor.scala:36)
    	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at java.lang.Thread.run(Thread.java:748)
    Caused by: java.lang.RuntimeException: org.apache.kyuubi.KyuubiSQLException:SPARK_HOME is not set!
    	at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:68)
    	at org.apache.kyuubi.engine.spark.SparkProcessBuilder.$anonfun$executable$5(SparkProcessBuilder.scala:61)
    	at scala.Option.getOrElse(Option.scala:189)
    	at org.apache.kyuubi.engine.spark.SparkProcessBuilder.<init>(SparkProcessBuilder.scala:61)
    	at org.apache.kyuubi.engine.EngineRef.$anonfun$create$1(EngineRef.scala:140)
    	at org.apache.kyuubi.engine.EngineRef.tryWithLock(EngineRef.scala:116)
    	at org.apache.kyuubi.engine.EngineRef.create(EngineRef.scala:128)
    	at org.apache.kyuubi.engine.EngineRef.$anonfun$getOrCreate$1(EngineRef.scala:182)
    	at scala.Option.getOrElse(Option.scala:189)
    	at org.apache.kyuubi.engine.EngineRef.getOrCreate(EngineRef.scala:182)
    	at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$open$2(KyuubiSessionImpl.scala:63)
    	at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$open$2$adapted(KyuubiSessionImpl.scala:62)
    	at org.apache.kyuubi.ha.client.ZooKeeperClientProvider$.withZkClient(ZooKeeperClientProvider.scala:74)
    	at org.apache.kyuubi.session.KyuubiSessionImpl.open(KyuubiSessionImpl.scala:62)
    	at org.apache.kyuubi.session.KyuubiSessionManager.openSession(KyuubiSessionManager.scala:58)
    	... 12 more
    ```
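    The fix wraps the possibly-null `listFiles` result in `Option`, so a missing package surfaces as a readable message instead of an NPE. As an illustration only (not the committed Scala code), the same null-safe pattern can be sketched in Java; the `firstSparkDir` helper below is hypothetical:
    
    ```java
    import java.io.File;
    import java.util.Optional;
    
    public class ListFilesDemo {
    
        // File.listFiles returns null (not an empty array) when the path
        // does not exist, is not a directory, or an I/O error occurs.
        // Wrapping the result in Optional avoids the NullPointerException.
        static Optional<String> firstSparkDir(File target) {
            File[] entries = target.listFiles((dir, name) -> name.startsWith("spark-"));
            return Optional.ofNullable(entries)       // null-safe wrap
                    .filter(fs -> fs.length > 0)      // empty dir -> Optional.empty
                    .map(fs -> fs[0].getAbsolutePath());
        }
    
        public static void main(String[] args) {
            // Missing directory: listFiles returns null, but the caller
            // sees Optional.empty instead of a crash.
            File missing = new File("/nonexistent/kyuubi-download/target");
            System.out.println(firstSparkDir(missing).isPresent()); // prints "false"
        }
    }
    ```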
    
    ### _How was this patch tested?_
    - [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible
    
    - [ ] Add screenshots for manual tests if appropriate
    
    - [x] [Run test](https://kyuubi.readthedocs.io/en/latest/develop_tools/testing.html#running-tests) locally before making a pull request
    
    Closes #991 from cfmcgrady/improve-error-msg.
    
    1e8da683 [Fu Chen] improve the error message
    
    Authored-by: Fu Chen <cf...@gmail.com>
    Signed-off-by: ulysses-you <ul...@gmail.com>
---
 .../kyuubi/engine/spark/SparkProcessBuilder.scala   | 21 +++++++++++----------
 1 file changed, 11 insertions(+), 10 deletions(-)

diff --git a/kyuubi-server/src/main/scala/org/apache/kyuubi/engine/spark/SparkProcessBuilder.scala b/kyuubi-server/src/main/scala/org/apache/kyuubi/engine/spark/SparkProcessBuilder.scala
index 47351d4..47077fb 100644
--- a/kyuubi-server/src/main/scala/org/apache/kyuubi/engine/spark/SparkProcessBuilder.scala
+++ b/kyuubi-server/src/main/scala/org/apache/kyuubi/engine/spark/SparkProcessBuilder.scala
@@ -42,16 +42,17 @@ class SparkProcessBuilder(
       val cwd = getClass.getProtectionDomain.getCodeSource.getLocation.getPath
         .split("kyuubi-server")
       assert(cwd.length > 1)
-      Paths.get(cwd.head)
-        .resolve("externals")
-        .resolve("kyuubi-download")
-        .resolve("target")
-        .toFile
-        .listFiles(new FilenameFilter {
-        override def accept(dir: File, name: String): Boolean = {
-          dir.isDirectory && name.startsWith("spark-")
-        }
-      }).headOption.map(_.getAbsolutePath)
+      Option(
+        Paths.get(cwd.head)
+          .resolve("externals")
+          .resolve("kyuubi-download")
+          .resolve("target")
+          .toFile
+          .listFiles(new FilenameFilter {
+            override def accept(dir: File, name: String): Boolean = {
+              dir.isDirectory && name.startsWith("spark-")}}))
+        .flatMap(_.headOption)
+        .map(_.getAbsolutePath)
     }
 
     sparkHomeOpt.map{ dir =>