Posted to user@spark.apache.org by "Hande, Ranjit Dilip (Ranjit)" <ha...@avaya.com> on 2019/02/07 11:44:36 UTC

java.lang.IllegalArgumentException: Unsupported class file major version 55

Hi,

I am developing a Java process that will consume data from Kafka using Apache Spark Streaming.
For this I am using the following:

Java:
openjdk version "11.0.1" 2018-10-16 LTS
OpenJDK Runtime Environment Zulu11.2+3 (build 11.0.1+13-LTS)
OpenJDK 64-Bit Server VM Zulu11.2+3 (build 11.0.1+13-LTS, mixed mode)

Maven: (Spark Streaming)
<dependency>
	<groupId>org.apache.spark</groupId>
	<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
	<version>2.4.0</version>
</dependency>
<dependency>
	<groupId>org.apache.spark</groupId>
	<artifactId>spark-streaming_2.11</artifactId>
	<version>2.4.0</version>
</dependency>

I am able to compile the project successfully, but when I try to run it I get the following error:

{"@timestamp":"2019-02-07T11:54:30.624+05:30","@version":"1","message":"Application run failed","logger_name":"org.springframework.boot.SpringApplication","thread_name":"main","level":"ERROR","level_value":40000,"stack_trace":"java.lang.IllegalStateException: Failed to execute CommandLineRunner at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:816) at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:797) at org.springframework.boot.SpringApplication.run(SpringApplication.java:324) at com.avaya.measures.AgentMeasures.AgentMeasuresApplication.main(AgentMeasuresApplication.java:41) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) at 

org.springframework.boot.loader.Launcher.launch(Launcher.java:50) at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)\r\nCaused by: java.lang.IllegalArgumentException: Unsupported class file major version 55 at

 org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166) at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148) at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136) at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237) at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49) at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517) at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500) at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733) at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134) at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134) at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236) at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40) at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134) at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732) at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500) at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175) at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238) at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631) at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355) at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307) at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306) at scala.collection.immutable.List.foreach(List.scala:392) at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306) at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162) at org.apache.spark.SparkContext.clean(SparkContext.scala:2326) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100) at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1364) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:363) at org.apache.spark.rdd.RDD.take(RDD.scala:1337) at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735) at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at scala.util.Try$.apply(Try.scala:192) at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39) at 
org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:834)\r\n"}

So I have two questions:
1. Am I doing something wrong?
2. Is OpenJDK 11.0.1 supported with Apache Spark Streaming 2.4.0?

Thanks in anticipation.

Regards,
Ranjit




--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: java.lang.IllegalArgumentException: Unsupported class file major version 55

Posted by Felix Cheung <fe...@hotmail.com>.
And it might not work completely. Spark only officially supports JDK 8.

I’m not sure whether JDK 9+ support is complete.


Re: java.lang.IllegalArgumentException: Unsupported class file major version 55

Posted by Jungtaek Lim <ka...@gmail.com>.
ASM 6 doesn't support Java 11. The master branch (for Spark 3.0) upgrades the
dependency to ASM 7, along with some other efforts (if my understanding is
right) to support Java 11. So you may need to use a lower version of the JDK
(8 is safest) for Spark 2.4.0, and try out the master branch to prepare for Java 11.
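For reference, if you do move the build to JDK 8, the pom can make the bytecode
level explicit. This is a minimal sketch added here for illustration, not part of
the original reply; the plugin version is an assumption, and it only helps if the
process also runs on a JDK 8 runtime:

<!-- Illustrative sketch: compile to Java 8 bytecode (class file major version 52).
     The plugin version is an assumption, not something specified in this thread. -->
<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-compiler-plugin</artifactId>
	<version>3.8.0</version>
	<configuration>
		<source>1.8</source>
		<target>1.8</target>
	</configuration>
</plugin>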

Thanks,
Jungtaek Lim (HeartSaVioR)


Re: java.lang.IllegalArgumentException: Unsupported class file major version 55

Posted by Gabor Somogyi <ga...@gmail.com>.
Hi Hande,

"Unsupported class file major version 55" means java incompatibility.
This error means you're trying to load a Java "class" file that was
compiled with a newer version of Java than you have installed.
For example, your .class file could have been compiled for JDK 8, and
you're trying to run it with JDK 7.
Are you sure 11 is the only JDK which is the default?
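A quick way to verify which Java version produced a given .class file is to read
its header. This small standalone program is an illustration added here (not part
of the original reply); major version 52 means Java 8, 55 means Java 11:

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Prints the class file version of the .class file passed as the first argument.
// Major 52 = Java 8, 53 = Java 9, 54 = Java 10, 55 = Java 11.
public class ClassVersion {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            int magic = in.readInt();             // 0xCAFEBABE for a valid class file
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();
            System.out.println("major=" + major + " minor=" + minor);
        }
    }
}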

Only a small number of people are experimenting with JDK 11, so it is not heavily tested or used.
Spark may or may not work on it, but it is not recommended for production in general.

BR,
G



Re: java.lang.IllegalArgumentException: Unsupported class file major version 55

Posted by Sean Owen <sr...@gmail.com>.
This means you compiled with Java 11, but are running on Java < 11. It's
not related to Spark.
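A quick sanity check of the runtime side (an illustration added here, not part of
the original reply) is to print the version of the JVM actually executing your code
and compare it with the JDK used to compile:

// Illustrative check of the running JVM; compare against the compiling JDK.
public class JvmCheck {
    public static void main(String[] args) {
        // Prints e.g. "1.8.0_292" on Java 8 or "11.0.1" on Java 11.
        System.out.println(System.getProperty("java.version"));
    }
}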


Re: java.lang.IllegalArgumentException: Unsupported class file major version 55

Posted by chansonzhang <zh...@gmail.com>.
I just updated the spark-* versions in my pom.xml to match my Spark and Scala
environment, and this solved the problem.
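For example (illustrative coordinates only, an assumption rather than the poster's
actual values), the Scala suffix and the Spark version of each artifact should both
match the runtime environment:

<!-- Illustrative: for a Spark 3.1.1 environment built with Scala 2.12,
     the artifact suffix (_2.12) and the version (3.1.1) must both match. -->
<dependency>
	<groupId>org.apache.spark</groupId>
	<artifactId>spark-streaming_2.12</artifactId>
	<version>3.1.1</version>
</dependency>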




--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
