Posted to issues@kylin.apache.org by GitBox <gi...@apache.org> on 2021/01/04 14:17:06 UTC

[GitHub] [kylin] hit-lacus edited a comment on pull request #1535: KYLIN-4858 Support Kylin4 deployment on CDH 6.X

hit-lacus edited a comment on pull request #1535:
URL: https://github.com/apache/kylin/pull/1535#issuecomment-753998578


   ### Fixed Exceptions
   
   - Replace `hive-exec-xxx.jar` with the CDH 6 build of the jar; otherwise `HiveConf` fails to initialize with `Unrecognized Hadoop major version number: 3.0.0-cdh6.2.0`:
   
   ```text
   Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
   21/01/04 08:31:26 INFO SparderContext: Current thread 36 create a SparkSession.
   21/01/04 08:31:26 INFO SparderContext: Init spark.
   21/01/04 08:31:26 INFO SparderContext: Initializing Spark thread starting.
   21/01/04 08:31:26 INFO SparderContext: Initializing Spark, waiting for done.
   21/01/04 08:31:27 ERROR SparderContext: Error for initializing spark
   java.lang.ExceptionInInitializerError
   	at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:105)
   	at java.lang.Class.forName0(Native Method)
   	at java.lang.Class.forName(Class.java:348)
   	at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
   	at org.apache.spark.sql.SparkSession$.hiveClassesArePresent(SparkSession.scala:1128)
   	at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:867)
   	at org.apache.spark.sql.SparderContext$$anonfun$initSpark$1$$anon$4.run(SparderContext.scala:150)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.0.0-cdh6.2.0
   	at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:174)
   	at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:139)
   	at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:100)
   	at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:368)
   	... 8 more
   21/01/04 08:31:27 INFO SparderContext: Setting initializing Spark thread to null.
   ```
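
   The jar swap above can be scripted. This is only a sketch, not part of the PR: the directory layout and jar names are assumptions (e.g. `$KYLIN_HOME/spark/jars` for Spark's bundled jars and a CDH-provided `hive-exec` jar from the parcel directory); adjust them to your installation.

   ```shell
   # Hedged sketch: replace Spark's bundled hive-exec jar with the cluster's
   # CDH build so HiveConf recognizes the Hadoop 3.0.0-cdh6.2.0 version string.
   # Both paths are assumptions -- pass the locations from your own deployment.
   replace_hive_exec() {
     local spark_jars="$1"  # e.g. $KYLIN_HOME/spark/jars (assumed layout)
     local cdh_jar="$2"     # e.g. a hive-exec jar from the CDH parcel dir
     rm -f "$spark_jars"/hive-exec-*.jar   # drop the jar Spark ships with
     cp "$cdh_jar" "$spark_jars"/          # install the CDH-built jar instead
   }
   ```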
   
   - Remove Spark's bundled Hadoop jars so the CDH cluster's Hadoop 3 classes are picked up instead: `rm -rf spark/jars/hadoop-hdfs* ; rm -rf spark/jars/hadoop-yarn* ; rm -rf spark/jars/hadoop-mapreduce*`. Otherwise the `ApplicationMaster` fails with the `VerifyError` below:
   
   ```text
   [2021-01-04 09:03:45.442]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
   Last 4096 bytes of prelaunch.err :
   Last 4096 bytes of stderr :
   03:44 INFO spark.SecurityManager: Changing modify acls groups to:
   21/01/04 09:03:44 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, root); groups with view permissions: Set(); users  with modify permissions: Set(yarn, root); groups with modify permissions: Set()
   21/01/04 09:03:45 INFO yarn.ApplicationMaster: Preparing Local resources
   Exception in thread "main" java.lang.VerifyError: Bad return type
   Exception Details:
     Location:
       org/apache/hadoop/hdfs/DFSClient.getQuotaUsage(Ljava/lang/String;)Lorg/apache/hadoop/fs/QuotaUsage; @157: areturn
     Reason:
       Type 'org/apache/hadoop/fs/ContentSummary' (current frame, stack[0]) is not assignable to 'org/apache/hadoop/fs/QuotaUsage' (from method signature)
     Current Frame:
       bci: @157
       flags: { }
       locals: { 'org/apache/hadoop/hdfs/DFSClient', 'java/lang/String', 'org/apache/hadoop/ipc/RemoteException', 'java/io/IOException' }
       stack: { 'org/apache/hadoop/fs/ContentSummary' }
     Bytecode:
       0x0000000: 2ab6 00b5 2a13 01f4 2bb6 00b7 4d01 4e2a
       0x0000010: b400 422b b901 f502 003a 042c c600 1d2d
       0x0000020: c600 152c b600 b9a7 0012 3a05 2d19 05b6
       0x0000030: 00bb a700 072c b600 b919 04b0 3a04 1904
       0x0000040: 4e19 04bf 3a06 2cc6 001d 2dc6 0015 2cb6
       0x0000050: 00b9 a700 123a 072d 1907 b600 bba7 0007
       0x0000060: 2cb6 00b9 1906 bf4d 2c07 bd00 d459 0312
       0x0000070: d653 5904 12e0 5359 0512 e153 5906 1301
       0x0000080: f653 b600 d74e 2dc1 01f6 9900 14b2 0023
       0x0000090: 1301 f7b9 002b 0200 2a2b b601 f8b0 2dbf
       0x00000a0:
     Exception Handler Table:
       bci [35, 39] => handler: 42
       bci [15, 27] => handler: 60
       bci [15, 27] => handler: 68
       bci [78, 82] => handler: 85
       bci [60, 70] => handler: 68
       bci [4, 57] => handler: 103
       bci [60, 103] => handler: 103
     Stackmap Table:
       full_frame(@42,{Object[#751],Object[#774],Object[#829],Object[#799],Object[#1221]},{Object[#799]})
       same_frame(@53)
       same_frame(@57)
       full_frame(@60,{Object[#751],Object[#774],Object[#829],Object[#799]},{Object[#799]})
       same_locals_1_stack_item_frame(@68,Object[#799])
       full_frame(@85,{Object[#751],Object[#774],Object[#829],Object[#799],Top,Top,Object[#799]},{Object[#799]})
       same_frame(@96)
       same_frame(@100)
       full_frame(@103,{Object[#751],Object[#774]},{Object[#854]})
       append_frame(@158,Object[#854],Object[#814])
   
   	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:167)
   	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
   	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
   	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
   	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
   	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$8$$anonfun$apply$3.apply(ApplicationMaster.scala:219)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$8$$anonfun$apply$3.apply(ApplicationMaster.scala:217)
   	at scala.Option.foreach(Option.scala:257)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$8.apply(ApplicationMaster.scala:217)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$8.apply(ApplicationMaster.scala:182)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:780)
   	at java.security.AccessController.doPrivileged(Native Method)
   	at javax.security.auth.Subject.doAs(Subject.java:422)
   	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
   	at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:779)
   	at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:182)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:803)
   	at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:834)
   	at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
   
   ```
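
   The `VerifyError` above happens because an old bundled `hadoop-hdfs` jar's `DFSClient` is verified against the cluster's Hadoop 3 `QuotaUsage` API. The `rm` one-liner from the bullet can be wrapped in a small helper; this is a sketch only, and the jars-directory path is an assumption:

   ```shell
   # Hedged sketch: delete Spark's bundled hadoop-hdfs/yarn/mapreduce jars so
   # the cluster's Hadoop 3.0.0-cdh6.2.0 classes are loaded at runtime.
   # The directory argument is an assumption -- pass your own spark/jars path.
   remove_bundled_hadoop_jars() {
     local spark_jars="$1"   # e.g. $KYLIN_HOME/spark/jars (assumed layout)
     rm -rf "$spark_jars"/hadoop-hdfs* \
            "$spark_jars"/hadoop-yarn* \
            "$spark_jars"/hadoop-mapreduce*
   }
   ```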

