Posted to dev@kylin.apache.org by yuzhang <sh...@163.com> on 2019/03/23 01:08:53 UTC
spark task error occurs when running IT in sandbox
Hi team:
I am hitting java.lang.UnsupportedClassVersionError: org/apache/spark/network/util/ByteUnit : Unsupported major.minor version 52.0 when running the Kylin 2.6.x integration tests in sandbox 2.4.0.0.169.
Following http://kylin.apache.org/development/dev_env.html, I installed sandbox 2.4 and replaced JDK 1.7 with JDK 1.8 (set JAVA_HOME to the JDK 1.8 path and restarted the VM), then downloaded Spark 2.3.2 and untarred it. Then I ran mvn verify, and after a while the test process ended with the error above.
I have attached error.log and yarn_error.log. I am searching the internet for a solution, but perhaps someone here already knows the answer; I will use this thread to track the problem.
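For context, "major.minor version 52.0" is the class-file format emitted by Java 8 compilers; a Java 7 JVM accepts at most 51.0, so this error means a JDK 1.7 JVM somewhere in the chain is loading Java 8 bytecode. A small sketch of how to read that version number, using a synthetic class-file header so the commands are safe to run anywhere (on a real machine you would point od at any .class file from the failing jar):

```shell
# The class-file major version lives in bytes 6-7, big-endian.
# Java 8 writes major version 52 (0x0034); Java 7 accepts at most 51
# and throws UnsupportedClassVersionError for anything newer.
# Synthetic 8-byte header: magic CAFEBABE, minor 0, major 52.
printf '\xca\xfe\xba\xbe\x00\x00\x00\x34' > /tmp/header.bin
# Print bytes 6-7 as unsigned decimals; "0 52" means Java 8 bytecode.
od -An -j6 -N2 -tu1 /tmp/header.bin
```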
Best regards
yuzhang
Re: spark task error occurs when running IT in sandbox
Posted by yuzhang <sh...@163.com>.
Hi elkan:
Thank you for taking the time to reply.
As you said, the cause was a mismatched JDK version. I had only set root's JAVA_HOME to point at JDK 1.8, but each server in the sandbox runs under its own user, so I need to re-link the original JAVA_HOME path to the new JDK instead.
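The re-link idea, sketched with throwaway paths since the real JDK locations on an HDP sandbox will differ: replacing the JDK 1.7 directory with a symlink means every service user's unchanged JAVA_HOME now resolves to JDK 1.8.

```shell
# Demo with disposable paths; substitute the sandbox's real JDK dirs.
rm -rf /tmp/jdkdemo
mkdir -p /tmp/jdkdemo/jdk1.8.0_201 /tmp/jdkdemo/jdk1.7.0_67
# Move the old JDK aside and point its path at the new one:
mv /tmp/jdkdemo/jdk1.7.0_67 /tmp/jdkdemo/jdk1.7.0_67.bak
ln -s /tmp/jdkdemo/jdk1.8.0_201 /tmp/jdkdemo/jdk1.7.0_67
# Any JAVA_HOME still set to the 1.7 path now resolves to 1.8:
readlink /tmp/jdkdemo/jdk1.7.0_67
```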
Best regards
yuzhang
On 3/25/2019 13:32, elkan1788 <el...@gmail.com> wrote:
It seems your Java runtime environment is not clean. Please check the JAVA_HOME and PATH system variables; use the echo command to see what they contain.
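A minimal version of that check (note that java missing from PATH entirely is itself informative):

```shell
# Show which JDK the current shell would actually use.
echo "JAVA_HOME=$JAVA_HOME"
echo "PATH=$PATH"
command -v java || echo "no java on PATH"
```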
By the way, Kylin can also run on Hadoop clusters that use JDK 1.7 with a simple modification. The steps are:
1. Modify the HBase conf file hbase-env.sh to add: export JAVA_HOME=/path/of/jdk1.8
2. Append the configuration below to the kylin_job_conf.xml and kylin_job_conf_inmem.xml files:
<property>
  <name>mapred.child.env</name>
  <value>JAVA_HOME=/usr/lib/java/jdk1.8.0_201</value>
</property>
<property>
  <name>yarn.app.mapreduce.am.env</name>
  <value>JAVA_HOME=/usr/lib/java/jdk1.8.0_201</value>
</property>
Hope this helps!
--
Sent from: http://apache-kylin.74782.x6.nabble.com/