Posted to user@spark.apache.org by Sea <26...@qq.com> on 2016/01/06 03:16:27 UTC
How to use Java8
Hi, all
I want to support Java 8. I use JDK 1.8.0_65 in the production environment, but it doesn't work. Should I build Spark with JDK 1.8 and set <java.version>1.8</java.version> in pom.xml?
java.lang.UnsupportedClassVersionError: Unsupported major.minor version 52.
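The major version in that error maps directly to a Java release: a class-file major version is the Java release number plus 44, so 52 means the classes were compiled for Java 8 and then loaded by an older JVM. A minimal sketch of the arithmetic (not from the original thread):

```shell
# Class-file major version N corresponds to Java release (N - 44),
# e.g. 52 -> Java 8. The error above means Java 8 bytecode was
# loaded by a pre-Java-8 JVM.
major_to_release() {
  echo $(( $1 - 44 ))
}
major_to_release 52   # prints 8
```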
Re: How to use Java8
Posted by Sea <26...@qq.com>.
thanks
------------------ Original Message ------------------
From: "Andy Davidson" <An...@SantaCruzIntegration.com>
Sent: Wednesday, January 6, 2016, 11:04 AM
To: "Sea" <26...@qq.com>; "user" <us...@spark.apache.org>
Subject: Re: How to use Java8
Re: How to use Java8
Posted by Andy Davidson <An...@SantaCruzIntegration.com>.
Hi Sea
From: Sea <26...@qq.com>
Date: Tuesday, January 5, 2016 at 6:16 PM
To: "user @spark" <us...@spark.apache.org>
Subject: How to use Java8
> Hi, all
> I want to support Java 8. I use JDK 1.8.0_65 in the production environment,
> but it doesn't work. Should I build Spark with JDK 1.8 and set
> <java.version>1.8</java.version> in pom.xml?
>
> java.lang.UnsupportedClassVersionError: Unsupported major.minor version 52.
Here are some notes I wrote on configuring my data center to use Java 8.
You'll probably need to do something similar.
Your mileage may vary.
Andy
Setting JAVA_HOME
ref: configure env vars
<http://spark.apache.org/docs/latest/configuration.html#environment-variables>
Install Java 8 on all nodes (master and slaves)
Install Java 8 on the master
$ ssh -i $KEY_FILE root@$SPARK_MASTER
# ?? how was this package downloaded from Oracle? curl?
yum install jdk-8u65-linux-x64.rpm
Copy the rpm to the slaves and install Java 8 on them
for i in `cat /root/spark-ec2/slaves`; do scp /home/ec2-user/jdk-8u65-linux-x64.rpm $i:; done
pssh -i -h /root/spark-ec2/slaves ls -l
pssh -i -h /root/spark-ec2/slaves yum install -y jdk-8u65-linux-x64.rpm
Remove the rpm from the slaves; it is 153 MB
pssh -i -h /root/spark-ec2/slaves rm jdk-8u65-linux-x64.rpm
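After installing, it is worth confirming the version on every node (e.g. with the same pssh pattern, `pssh -i -h /root/spark-ec2/slaves java -version`). The legacy `1.x` version scheme can be confusing; here is a small illustrative helper (not part of the original notes) that pulls the feature release out of a version string like the 1.8.0_65 mentioned above, using purely local string parsing so no JVM is needed:

```shell
# Extract the feature release from a legacy "1.x.y_zz" Java version
# string: "1.8.0_65" -> 8.
feature_release() {
  echo "$1" | sed -E 's/^1\.([0-9]+).*/\1/'
}
feature_release "1.8.0_65"   # prints 8
```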
Configure Spark to use Java 8
ref: configure env vars
<http://spark.apache.org/docs/latest/configuration.html#environment-variables>
Make a backup of the config file
cp /root/spark/conf/spark-env.sh /root/spark/conf/spark-env.sh-`date +%Y-%m-%d:%H:%M:%S`
pssh -i -h /root/spark-ec2/slaves cp /root/spark/conf/spark-env.sh /root/spark/conf/spark-env.sh-`date +%Y-%m-%d:%H:%M:%S`
pssh -i -h /root/spark-ec2/slaves ls "/root/spark/conf/spark-env.sh*"
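The backup commands above use a date-suffix pattern; note that the backticks are expanded by the local shell before pssh runs, so every slave's backup gets the same timestamp. A minimal local sketch of the pattern, run on a throwaway file (the temp path is illustrative, not from the original notes):

```shell
# Create a throwaway "config" file and take a timestamped backup of it,
# mirroring the spark-env.sh backup step above.
f=$(mktemp)
echo 'export JAVA_HOME=/usr/java/latest' > "$f"
cp "$f" "$f-$(date +%Y-%m-%d:%H:%M:%S)"
ls "$f"-* | wc -l   # one timestamped backup now exists
```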
Edit /root/spark/conf/spark-env.sh, add
export JAVA_HOME=/usr/java/latest
Copy spark-env.sh to slaves
pssh -i -h /root/spark-ec2/slaves grep JAVA_HOME /root/spark/conf/spark-env.sh
for i in `cat /root/spark-ec2/slaves`; do scp /root/spark/conf/spark-env.sh $i:/root/spark/conf/spark-env.sh; done
pssh -i -h /root/spark-ec2/slaves grep JAVA_HOME /root/spark/conf/spark-env.sh
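The `/usr/java/latest` value works because the Oracle JDK rpm keeps `latest` as a symlink to the most recently installed JDK, so JAVA_HOME keeps resolving across upgrades. A local mock of that convention in a temp directory (the paths under `$d` are illustrative, assuming the rpm's layout):

```shell
# Mock of the /usr/java/latest convention: "latest" is a symlink
# pointing at the newest installed JDK directory.
d=$(mktemp -d)
mkdir "$d/jdk1.8.0_65"
ln -s "$d/jdk1.8.0_65" "$d/latest"
readlink "$d/latest"   # resolves to the jdk1.8.0_65 directory
```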