Posted to dev@kylin.apache.org by "Yanwen Lin (JIRA)" <ji...@apache.org> on 2019/03/13 09:25:00 UTC

[jira] [Created] (KYLIN-3871) Kylin inside Cloudera CDH Quickstart Sandbox

Yanwen Lin created KYLIN-3871:
---------------------------------

             Summary: Kylin inside Cloudera CDH Quickstart Sandbox
                 Key: KYLIN-3871
                 URL: https://issues.apache.org/jira/browse/KYLIN-3871
             Project: Kylin
          Issue Type: Test
          Components: Job Engine
    Affects Versions: v2.6.0
         Environment: Cloudera Quickstart Docker image:
- OS: centos6
- memory: 13GB
- disk: 20GB
- java: 1.8
- maven: 3.5.3
            Reporter: Yanwen Lin


While running the integration tests I hit the error below. I know it is a Java version mismatch: Kylin is built with Java 1.8 while the Cloudera image defaults to Java 1.7. So I manually installed Java 1.8 and set JAVA_HOME so that Spark 2.x picks it up (I also ran spark-submit --version to verify this), but the error did not go away. My guess is that somewhere during the Spark job launch the Java version is switched back to 1.7 (not sure). Is there any way to force it not to fall back to Java 1.7, or any other workaround?
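
One thing I am considering (untested, and the JDK path below is just a placeholder for wherever the 1.8 JDK actually lives on the image) is pinning JAVA_HOME for the containers that YARN launches for Spark, via the Spark properties that Kylin forwards through its kylin.engine.spark-conf.* prefix:

# conf/kylin.properties -- placeholder JDK path, adjust to the real 1.8 install dir
kylin.engine.spark-conf.spark.yarn.appMasterEnv.JAVA_HOME=/usr/java/jdk1.8.0_181
kylin.engine.spark-conf.spark.executorEnv.JAVA_HOME=/usr/java/jdk1.8.0_181

Would that be the right place to override it for the integration tests, or does the sandbox profile use its own config?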

The Maven install and the unit tests all finished successfully.

*Branch:*

realtime-streaming

*Executed command with problem:*

mvn verify -fae -Dhdp.version=2.4.0.0-169 -P sandbox
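
For the Maven run itself I exported the 1.8 JDK explicitly before invoking the command above (the path is a placeholder for my local install):

export JAVA_HOME=/usr/java/jdk1.8.0_181   # placeholder; actual 1.8 install dir on the image
export PATH=$JAVA_HOME/bin:$PATH
mvn verify -fae -Dhdp.version=2.4.0.0-169 -P sandbox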

*Error stack:*

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/network/util/ByteUnit : Unsupported major.minor version 52.0
 at java.lang.ClassLoader.defineClass1(Native Method)
 at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
 at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
 at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
 at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
 at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
 at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
 at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
 at org.apache.spark.deploy.history.config$.<init>(config.scala:44)
 at org.apache.spark.deploy.history.config$.<clinit>(config.scala)
 at org.apache.spark.SparkConf$.<init>(SparkConf.scala:635)
 at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
 at org.apache.spark.SparkConf.set(SparkConf.scala:94)
 at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:76)
 at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:75)
 at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
 at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
 at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
 at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
 at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
 at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
 at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:75)
 at org.apache.spark.SparkConf.<init>(SparkConf.scala:70)
 at org.apache.spark.SparkConf.<init>(SparkConf.scala:57)
 at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:62)
 at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:838)
 at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:869)
 at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
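
Reading the trace, the UnsupportedClassVersionError is raised inside org.apache.spark.deploy.yarn.ExecutorLauncher, i.e. in a container started by the YARN NodeManager, so I suspect the container inherits the NodeManager's JAVA_HOME (Java 1.7 on this image) rather than the one I exported in my shell. A possible cluster-side workaround I have not tried yet would be to point the Hadoop daemons at the 1.8 JDK and restart YARN (the file location and JDK path are guesses for the CDH quickstart image):

# /etc/hadoop/conf/hadoop-env.sh -- path is a guess for the quickstart image
export JAVA_HOME=/usr/java/jdk1.8.0_181

Does anyone know whether that, or the spark.yarn.appMasterEnv.JAVA_HOME route above, is the recommended fix for the sandbox setup?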


