Posted to issues@kylin.apache.org by GitBox <gi...@apache.org> on 2021/01/04 06:54:31 UTC

[GitHub] [kylin] hit-lacus opened a new pull request #1535: KYLIN-4858 Support Kylin4 deployment on CDH 6.X

hit-lacus opened a new pull request #1535:
URL: https://github.com/apache/kylin/pull/1535


   ## Proposed changes
   
   Describe the big picture of your changes here to communicate to the maintainers why we should accept this pull request. If it fixes a bug or resolves a feature request, be sure to link to that issue.
   
   ## Types of changes
   
   What types of changes does your code introduce to Kylin?
   _Put an `x` in the boxes that apply_
   
   - [ ] Bugfix (non-breaking change which fixes an issue)
   - [ ] New feature (non-breaking change which adds functionality)
   - [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
   - [ ] Documentation Update (if none of the other choices apply)
   
   ## Checklist
   
   _Put an `x` in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your code._
   
   - [ ] I have created an issue on [Kylin's JIRA](https://issues.apache.org/jira/browse/KYLIN) and described the bug/feature there in detail
   - [ ] Commit messages in my PR start with the related jira ID, like "KYLIN-0000 Make Kylin project open-source"
   - [ ] Compiling and unit tests pass locally with my changes
   - [ ] I have added tests that prove my fix is effective or that my feature works
   - [ ] If this change needs a documentation change, I will prepare another PR against the `document` branch
   - [ ] Any dependent changes have been merged
   
   ## Further comments
   
   If this is a relatively large or complex change, kick off the discussion at user@kylin or dev@kylin by explaining why you chose the solution you did and what alternatives you considered, etc...
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org




[GitHub] [kylin] hit-lacus edited a comment on pull request #1535: KYLIN-4858 Support Kylin4 deployment on CDH 6.X

Posted by GitBox <gi...@apache.org>.
hit-lacus edited a comment on pull request #1535:
URL: https://github.com/apache/kylin/pull/1535#issuecomment-753997480


   ## Test
   
   ### Hadoop
   
   ```sh
   [root@cdh-1 apache-kylin-4.0.0-SNAPSHOT-bin]# hadoop version
   Hadoop 3.0.0-cdh6.2.0
   Source code repository http://github.com/cloudera/hadoop -r d1dff3d3a126da44e3458bbf148c3bc16ff55bd8
   Compiled by jenkins on 2019-03-14T06:39Z
   Compiled with protoc 2.5.0
   From source with checksum 7fd065792597e9cd1f12e1a7c7a0
   This command was run using /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/hadoop-common-3.0.0-cdh6.2.0.jar
   [root@cdh-1 apache-kylin-4.0.0-SNAPSHOT-bin]# hive --version
   WARNING: Use "yarn jar" to launch YARN applications.
   SLF4J: Class path contains multiple SLF4J bindings.
   SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
   SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
   Hive 2.1.1-cdh6.2.0
   Subversion file:///container.redhat7/build/cdh/hive/2.1.1-cdh6.2.0/rpm/BUILD/hive-2.1.1-cdh6.2.0 -r fec2a053ef8eab794f509841a51199c2bb25bcfd
   Compiled by jenkins on Thu Mar 14 00:03:47 PDT 2019
   From source with checksum 54100195f9044f45ac33448c7b2542ad
   ```
   
   ### Replace jars
   
   ```sh
   [root@cdh-1 apache-kylin-4.0.0-SNAPSHOT-bin]#  find spark/jars/ -name '*.jar' -type f -mmin -30
   spark/jars/hadoop-annotations-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-hdfs-client.jar
   spark/jars/hadoop-common-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-hdfs-httpfs.jar
   spark/jars/hadoop-hdfs-native-client.jar
   spark/jars/hadoop-hdfs-client-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-hdfs-3.0.0-cdh6.2.0.jar
   spark/jars/re2j-1.1.jar
   spark/jars/woodstox-core-asl-4.4.1.jar
   spark/jars/hadoop-auth-3.0.0-cdh6.2.0.jar
   spark/jars/woodstox-core-5.1.0.jar
   spark/jars/hadoop-hdfs-httpfs-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-hdfs-native-client-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-mapreduce-client-shuffle-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-mapreduce-client-app-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-mapreduce-client-jobclient-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-mapreduce-client-common-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-mapreduce-client-core-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-yarn-server-web-proxy-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-yarn-common-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-yarn-server-common-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-yarn-client-3.0.0-cdh6.2.0.jar
   spark/jars/hadoop-yarn-api-3.0.0-cdh6.2.0.jar
   spark/jars/htrace-core4-4.2.0-incubating.jar
   spark/jars/htrace-core4-4.1.0-incubating.jar
   spark/jars/commons-configuration2-2.1.jar
   spark/jars/commons-configuration2-2.1.1.jar
   spark/jars/re2j-1.0.jar
   spark/jars/woodstox-core-5.0.3.jar
   spark/jars/hive-exec-1.21.2.3.1.0.0-78.jar
   spark/jars/stax2-api-3.1.4.jar
   
   
   [root@cdh-1 apache-kylin-4.0.0-SNAPSHOT-bin]# find spark/jars/ -name '*2.7.3*' -type f
   spark/jars/univocity-parsers-2.7.3.jar
   spark/jars/hadoop-annotations-2.7.3.jar
   spark/jars/hadoop-common-2.7.3.jar
   spark/jars/hadoop-auth-2.7.3.jar
   spark/jars/hadoop-client-2.7.3.jar
   ```
   
   <img width="705" alt="image" src="https://user-images.githubusercontent.com/14030549/103547116-efee2500-4ede-11eb-900a-5836d75a06be.png">
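   
   The jar swap shown by the `find` output above can be sketched as a script. The sketch below is an assumption for illustration (it simulates the layout in a temp directory so it is safe to run anywhere); on a real node `KYLIN_HOME` would be the Kylin install and `CDH_JARS` the Cloudera parcel jar directory, and the exact commands used in this PR may differ:
   
   ```sh
   # Hypothetical sketch of the jar replacement, simulated in a temp dir.
   KYLIN_HOME="$(mktemp -d)/apache-kylin-4.0.0-SNAPSHOT-bin"
   CDH_JARS="$(mktemp -d)"   # stands in for /opt/cloudera/parcels/CDH/jars
   mkdir -p "$KYLIN_HOME/spark/jars"
   touch "$KYLIN_HOME/spark/jars/hadoop-common-2.7.3.jar"   # jar Spark ships with
   touch "$CDH_JARS/hadoop-common-3.0.0-cdh6.2.0.jar"       # jar CDH provides
   # Drop the bundled Hadoop 2.7.3 jars, then copy in the CDH 6 equivalents
   rm -f "$KYLIN_HOME"/spark/jars/hadoop-*2.7.3*.jar
   cp "$CDH_JARS"/hadoop-*cdh6.2.0.jar "$KYLIN_HOME/spark/jars/"
   ls "$KYLIN_HOME/spark/jars"
   ```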
   
   






[GitHub] [kylin] hit-lacus merged pull request #1535: KYLIN-4858 Support Kylin4 deployment on CDH 6.X

Posted by GitBox <gi...@apache.org>.
hit-lacus merged pull request #1535:
URL: https://github.com/apache/kylin/pull/1535


   





[GitHub] [kylin] hit-lacus edited a comment on pull request #1535: KYLIN-4858 Support Kylin4 deployment on CDH 6.X

Posted by GitBox <gi...@apache.org>.
hit-lacus edited a comment on pull request #1535:
URL: https://github.com/apache/kylin/pull/1535#issuecomment-753998578


   ### Fixed Exceptions
   
   - Fixed by replacing `hive-exec-xxx.jar`:
   
   ```java
   Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
   21/01/04 08:31:26 INFO SparderContext: Current thread 36 create a SparkSession.
   21/01/04 08:31:26 INFO SparderContext: Init spark.
   21/01/04 08:31:26 INFO SparderContext: Initializing Spark thread starting.
   21/01/04 08:31:26 INFO SparderContext: Initializing Spark, waiting for done.
   21/01/04 08:31:27 ERROR SparderContext: Error for initializing spark
   java.lang.ExceptionInInitializerError
   	at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:105)
   	at java.lang.Class.forName0(Native Method)
   	at java.lang.Class.forName(Class.java:348)
   	at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
   	at org.apache.spark.sql.SparkSession$.hiveClassesArePresent(SparkSession.scala:1128)
   	at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:867)
   	at org.apache.spark.sql.SparderContext$$anonfun$initSpark$1$$anon$4.run(SparderContext.scala:150)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.0.0-cdh6.2.0
   	at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:174)
   	at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:139)
   	at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:100)
   	at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:368)
   	... 8 more
   21/01/04 08:31:27 INFO SparderContext: Setting initializing Spark thread to null.
   ```
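   
   The "Unrecognized Hadoop major version" above is thrown because the shim loader keys off the leading major number of the Hadoop version string, and the stale `hive-exec` jar does not know Hadoop 3. A minimal sketch of that extraction (illustrative shell only, not Hive's actual code):
   
   ```sh
   # Illustrative only: pull the major number out of a CDH-style version string,
   # the step Hive's ShimLoader performs before looking up a shim.
   version="3.0.0-cdh6.2.0"
   major="${version%%.*}"
   echo "$major"
   ```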
   
   - Fixed by removing the Spark-bundled Hadoop jars (`rm -rf spark/jars/hadoop-hdfs* ; rm -rf spark/jars/hadoop-yarn* ; rm -rf spark/jars/hadoop-mapreduce*`):
   
   ```java
   [2021-01-04 09:03:45.442]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
   Last 4096 bytes of prelaunch.err :
   Last 4096 bytes of stderr :
   03:44 INFO spark.SecurityManager: Changing modify acls groups to:
   21/01/04 09:03:44 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, root); groups with view permissions: Set(); users  with modify permissions: Set(yarn, root); groups with modify permissions: Set()
   21/01/04 09:03:45 INFO yarn.ApplicationMaster: Preparing Local resources
   Exception in thread "main" java.lang.VerifyError: Bad return type
   Exception Details:
     Location:
       org/apache/hadoop/hdfs/DFSClient.getQuotaUsage(Ljava/lang/String;)Lorg/apache/hadoop/fs/QuotaUsage; @157: areturn
     Reason:
       Type 'org/apache/hadoop/fs/ContentSummary' (current frame, stack[0]) is not assignable to 'org/apache/hadoop/fs/QuotaUsage' (from method signature)
     Current Frame:
       bci: @157
       flags: { }
       locals: { 'org/apache/hadoop/hdfs/DFSClient', 'java/lang/String', 'org/apache/hadoop/ipc/RemoteException', 'java/io/IOException' }
       stack: { 'org/apache/hadoop/fs/ContentSummary' }
     Bytecode:
       0x0000000: 2ab6 00b5 2a13 01f4 2bb6 00b7 4d01 4e2a
       0x0000010: b400 422b b901 f502 003a 042c c600 1d2d
       0x0000020: c600 152c b600 b9a7 0012 3a05 2d19 05b6
       0x0000030: 00bb a700 072c b600 b919 04b0 3a04 1904
       0x0000040: 4e19 04bf 3a06 2cc6 001d 2dc6 0015 2cb6
       0x0000050: 00b9 a700 123a 072d 1907 b600 bba7 0007
       0x0000060: 2cb6 00b9 1906 bf4d 2c07 bd00 d459 0312
       0x0000070: d653 5904 12e0 5359 0512 e153 5906 1301
       0x0000080: f653 b600 d74e 2dc1 01f6 9900 14b2 0023
       0x0000090: 1301 f7b9 002b 0200 2a2b b601 f8b0 2dbf
       0x00000a0:
     Exception Handler Table:
       bci [35, 39] => handler: 42
       bci [15, 27] => handler: 60
       bci [15, 27] => handler: 68
       bci [78, 82] => handler: 85
       bci [60, 70] => handler: 68
       bci [4, 57] => handler: 103
       bci [60, 103] => handler: 103
     Stackmap Table:
       full_frame(@42,{Object[#751],Object[#774],Object[#829],Object[#799],Object[#1221]},{Object[#799]})
       same_frame(@53)
       same_frame(@57)
       full_frame(@60,{Object[#751],Object[#774],Object[#829],Object[#799]},{Object[#799]})
       same_locals_1_stack_item_frame(@68,Object[#799])
       full_frame(@85,{Object[#751],Object[#774],Object[#829],Object[#799],Top,Top,Object[#799]},{Object[#799]})
       same_frame(@96)
       same_frame(@100)
       full_frame(@103,{Object[#751],Object[#774]},{Object[#854]})
       append_frame(@158,Object[#854],Object[#814])
   
   	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:167)
   	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
   	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
   	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
   	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
   	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$8$$anonfun$apply$3.apply(ApplicationMaster.scala:219)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$8$$anonfun$apply$3.apply(ApplicationMaster.scala:217)
   	at scala.Option.foreach(Option.scala:257)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$8.apply(ApplicationMaster.scala:217)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$8.apply(ApplicationMaster.scala:182)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:780)
   	at java.security.AccessController.doPrivileged(Native Method)
   	at javax.security.auth.Subject.doAs(Subject.java:422)
   	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
   	at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:779)
   	at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:182)
   	at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:803)
   	at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:834)
   	at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
   
   ```
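   
   A `VerifyError` like the one above is the usual symptom of two Hadoop versions on the same classpath. After cleanup, a quick sanity check is to confirm the jar names carry a single version token; the file names below are copied from the earlier listing as a stand-in for live `find spark/jars -name 'hadoop-*.jar'` output:
   
   ```sh
   # Count distinct Hadoop version tokens in a jar listing; a healthy
   # layout yields exactly one.
   versions=$(printf '%s\n' \
        spark/jars/hadoop-common-3.0.0-cdh6.2.0.jar \
        spark/jars/hadoop-hdfs-3.0.0-cdh6.2.0.jar \
        spark/jars/hadoop-yarn-api-3.0.0-cdh6.2.0.jar \
      | grep -oE '2\.7\.3|cdh6\.2\.0' | sort -u)
   echo "$versions"
   ```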

