Posted to dev@spark.apache.org by Shixiong Zhu <zs...@gmail.com> on 2014/12/31 05:09:07 UTC

Why the major.minor version of the new hive-exec is 51.0?

The major.minor version of the new org.spark-project.hive.hive-exec is
51.0, so it requires people to use JDK 7. Is this intentional?

<dependency>
  <groupId>org.spark-project.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>0.12.0-protobuf-2.5</version>
</dependency>

You can use the following steps to reproduce it (they need to be run with JDK 6):

1. Create a Test.java file with the following content:

public class Test {

    public static void main(String[] args) throws Exception {
        // Loading HiveConf makes the JVM check the class file's version.
        Class.forName("org.apache.hadoop.hive.conf.HiveConf");
    }
}

2. javac Test.java
3. java -classpath ~/.m2/repository/org/spark-project/hive/hive-exec/0.12.0-protobuf-2.5/hive-exec-0.12.0-protobuf-2.5.jar:. Test

Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/apache/hadoop/hive/conf/HiveConf : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at Test.main(Test.java:5)
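
A quick way to confirm what a given JVM can load is to print the standard
java.class.version system property: JDK 6 reports 50.0 and JDK 7 reports
51.0, so any JVM reporting less than 51.0 will fail as above. A minimal
sketch (the class name ClassVersionProbe is only illustrative):

public class ClassVersionProbe {
    public static void main(String[] args) {
        // JDK 6 prints 50.0, JDK 7 prints 51.0; anything below 51.0 cannot
        // load the classes in hive-exec-0.12.0-protobuf-2.5.jar.
        System.out.println("java.version       = " + System.getProperty("java.version"));
        System.out.println("java.class.version = " + System.getProperty("java.class.version"));
    }
}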


Best Regards,
Shixiong Zhu

Re: Why the major.minor version of the new hive-exec is 51.0?

Posted by Ted Yu <yu...@gmail.com>.
I see.

I logged SPARK-5041, which references this thread.

Thanks


Re: Why the major.minor version of the new hive-exec is 51.0?

Posted by Michael Armbrust <mi...@databricks.com>.
We actually do publish our own version of this jar, because the version
that the Hive team publishes is an uber jar, and this breaks all kinds of
things. As a result, I'd file the JIRA against Spark.

On Wed, Dec 31, 2014 at 12:55 PM, Ted Yu <yu...@gmail.com> wrote:

> Michael:
> hive-exec-0.12.0-protobuf-2.5.jar is not generated from Spark source code,
> right?
>
> What would be done after the JIRA is opened?
>
> Cheers

Re: Why the major.minor version of the new hive-exec is 51.0?

Posted by Michael Armbrust <mi...@databricks.com>.
This was not intended; can you open a JIRA?


Re: Why the major.minor version of the new hive-exec is 51.0?

Posted by Ted Yu <yu...@gmail.com>.
I extracted org/apache/hadoop/hive/common/CompressionUtils.class from the
jar and used hexdump to view the class file. Bytes 6 and 7 are 00 and 33
(hex), i.e. major version 51.

According to http://en.wikipedia.org/wiki/Java_class_file, that means the
jar was produced using Java 7.

FYI
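
The same check can be scripted rather than read off a hexdump. Here is a
minimal sketch (the class name ClassVersionCheck and its arguments are
illustrative, not something shipped with Spark or Hive) that reads the
magic number and the minor/major version fields (bytes 0 through 7 of the
class file) straight from a jar entry, using only JDK 6 APIs:

import java.io.DataInputStream;
import java.io.InputStream;
import java.util.jar.JarFile;

public class ClassVersionCheck {
    public static void main(String[] args) throws Exception {
        // args[0]: path to the jar, e.g. hive-exec-0.12.0-protobuf-2.5.jar
        // args[1]: class entry, e.g. org/apache/hadoop/hive/common/CompressionUtils.class
        JarFile jar = new JarFile(args[0]);
        InputStream in = jar.getInputStream(jar.getEntry(args[1]));
        DataInputStream data = new DataInputStream(in);
        int magic = data.readInt();           // bytes 0-3, always 0xCAFEBABE
        int minor = data.readUnsignedShort(); // bytes 4-5
        int major = data.readUnsignedShort(); // bytes 6-7; 51 means Java 7
        System.out.println(Integer.toHexString(magic) + " " + major + "." + minor);
        data.close();
        jar.close();
    }
}

For this jar it should print cafebabe 51.0, consistent with the
UnsupportedClassVersionError above.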
