Posted to user@flink.apache.org by Flavio Pompermaier <po...@okkam.it> on 2016/02/03 14:38:44 UTC

Flink cluster and Java 8

Hi to all,

I was trying to make my Java 8 application run on a Flink 0.10.1 cluster.
I've compiled both the Flink sources and my app with the same Java version
(1.8.0_72) and I've set env.java.home to point to my Java 8 JVM in every
flink-conf.yaml of the cluster.

I always get the following Exception:

java.lang.UnsupportedClassVersionError: XXX: Unsupported major.minor
version 52.0

Is there any other setting I forgot to check? Do I also have to change the
source and target to 1.8 in the Maven compiler settings of the main pom?

Best,
Flavio
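
For readers who land on this thread: the "source and target" being discussed are the maven-compiler-plugin settings in the main pom. A typical Java 8 configuration looks like the sketch below; the plugin version shown is illustrative, not taken from the thread.

```xml
<!-- Illustrative maven-compiler-plugin settings for Java 8 (version number is an assumption) -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.3</version>
  <configuration>
    <source>1.8</source>
    <target>1.8</target>
  </configuration>
</plugin>
```

With `target` left at 1.7 (as advised later in the thread), the produced classes stay runnable on both Java 7 and Java 8 JVMs.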

Re: Flink cluster and Java 8

Posted by Flavio Pompermaier <po...@okkam.it>.
Yes, I did.

On Thu, Feb 4, 2016 at 5:44 PM, Maximilian Michels <mx...@apache.org> wrote:

> I see. Did you perform a full "mvn clean package -DskipTests" after
> you changed the source level to 1.8?

Re: Flink cluster and Java 8

Posted by Maximilian Michels <mx...@apache.org>.
I see. Did you perform a full "mvn clean package -DskipTests" after
you changed the source level to 1.8?

On Thu, Feb 4, 2016 at 4:33 PM, Flavio Pompermaier <po...@okkam.it> wrote:
> Flink compiles correctly with Java 8 as long as you leave the Java 1.7 source
> and target in the Maven compiler settings.
> If you change them to 1.8, flink-core no longer compiles.

Re: Flink cluster and Java 8

Posted by Flavio Pompermaier <po...@okkam.it>.
Flink compiles correctly with Java 8 as long as you leave the Java 1.7 source
and target in the Maven compiler settings.
If you change them to 1.8, flink-core no longer compiles.

On Thu, Feb 4, 2016 at 4:23 PM, Maximilian Michels <mx...@apache.org> wrote:

> Hi Flavio,
>
> To address your points:
>
> 1) It runs. That's fine.
> 2) It doesn't work to run a Java 8 compiled Flink job with Java 7
> Flink cluster if you use Java 8 non-backwards-compatible features in
> your job.
> 3) I compile Flink daily with Java 8. Also, we have Travis CI tests
> which use OpenJDK and OracleJDK 7/8 to compile.
>
> I think there is something wrong with the configuration of your build
> setup.
>
> Cheers,
> Max

Re: Flink cluster and Java 8

Posted by Maximilian Michels <mx...@apache.org>.
Hi Flavio,

To address your points:

1) It runs. That's fine.
2) A Flink job compiled for Java 8 won't run on a Java 7 Flink cluster
if the job uses Java 8 features that are not backwards compatible.
3) I compile Flink daily with Java 8. Also, we have Travis CI tests
which use OpenJDK and OracleJDK 7/8 to compile.

I think there is something wrong with the configuration of your build setup.

Cheers,
Max

On Thu, Feb 4, 2016 at 11:55 AM, Flavio Pompermaier
<po...@okkam.it> wrote:
> I've tested several configurations (I also tried changing my compilation to
> 1.7, but then Sesame 4 caused the error [1]):
>
> Flink compiled with Java 1.7 (default), run from Eclipse with Java 8:
> OK
> Flink compiled with Java 1.7 (default), cluster run with Java 8: not
> able to run my job compiled with Java 1.8, causing the reported exception
> (unsupported major.minor version)
> Flink compiled with Java 1.8: not able to compile without the reported
> modifications, but then the job ran fine
>
> I don't know if you ever tested all those configurations but I'm sure it
> wasn't working when deployed in the cluster.
>
> [1] http://rdf4j.org/doc/4/release-notes/4.0.0.docbook?view

Re: Flink cluster and Java 8

Posted by Flavio Pompermaier <po...@okkam.it>.
I've tested several configurations (I also tried changing my compilation to
1.7, but then Sesame 4 caused the error [1]):

   1. Flink compiled with Java 1.7 (default), run from Eclipse with
   Java 8: OK
   2. Flink compiled with Java 1.7 (default), cluster run with Java
   8: not able to run my job compiled with Java 1.8, causing the reported
   exception (unsupported major.minor version)
   3. Flink compiled with Java 1.8: not able to compile without the
   reported modifications, but then the job ran fine

I don't know if you ever tested all those configurations, but I'm sure it
wasn't working when deployed on the cluster.

[1] http://rdf4j.org/doc/4/release-notes/4.0.0.docbook?view

On Thu, Feb 4, 2016 at 11:40 AM, Stephan Ewen <se...@apache.org> wrote:

> Hi!
>
> I am running Java 8 for a year without an issue. The code is compiled for
> target Java 7, but can be run with Java 8.
> User code that is targeted for Java 8 can be run if Flink is run with Java
> 8.
>
> The initial error you got was because you probably compiled with Java 8 as
> the target, and ran it with Java 7.
>
> I would just leave the target to be 1.7 and run it in a Java 8 JVM. User
> code can also be Java 8, that mixes seamlessly.
>
> Stephan

Re: Flink cluster and Java 8

Posted by Stephan Ewen <se...@apache.org>.
Hi!

I have been running Java 8 for a year without issue. The code is compiled for
target Java 7, but can be run with Java 8.
User code that is targeted for Java 8 can be run if Flink is run with Java
8.

The initial error you got was because you probably compiled with Java 8 as
the target, and ran it with Java 7.

I would just leave the target to be 1.7 and run it in a Java 8 JVM. User
code can also be Java 8, that mixes seamlessly.

Stephan


On Thu, Feb 4, 2016 at 11:34 AM, Flavio Pompermaier <po...@okkam.it>
wrote:

> Anyone looking into this? Java 7 reached its end of life in April 2015
> with its last public update (number 80) and the ability to run Java 8 jobs
> would be more and more important in the future. IMHO, the default target of
> the maven compiler plugin should be set to 1.8 in the 1.0 release. In most
> of the cases this would be backward compatible and if it's not you can
> always recompile it with 1.7 (but as an exception this time).
> Obviously this is not urgent, I just wanted to point this out and
> hopefully help someone else facing the same problem
>
> Best,
> Flavio

Re: Flink cluster and Java 8

Posted by Flavio Pompermaier <po...@okkam.it>.
Anyone looking into this? Java 7 reached its end of life in April 2015 with
its last public update (number 80), and the ability to run Java 8 jobs will
become more and more important in the future. IMHO, the default target of the
Maven compiler plugin should be set to 1.8 in the 1.0 release. In most cases
this would be backwards compatible, and if it's not you can always recompile
with 1.7 (but as the exception this time).
Obviously this is not urgent; I just wanted to point this out and hopefully
help someone else facing the same problem.

Best,
Flavio

On Wed, Feb 3, 2016 at 3:40 PM, Flavio Pompermaier <po...@okkam.it>
wrote:

> I've fixed it by changing the copy method in the *TupleSerializer* as
> follows:
>
> @Override
> public T copy(T from, T reuse) {
>     for (int i = 0; i < arity; i++) {
>         Object copy = fieldSerializers[i].copy(from.getField(i));
>         reuse.setField(copy, i);
>     }
>     return reuse;
> }
>
> And commenting out line 50 in *CollectionExecutionAccumulatorsTest*:
>
> assertEquals(NUM_ELEMENTS,
>     result.getAccumulatorResult(ACCUMULATOR_NAME));
>
> I hope it helps.

Re: Flink cluster and Java 8

Posted by Flavio Pompermaier <po...@okkam.it>.
I've fixed it by changing the copy method in the *TupleSerializer* as follows:

@Override
public T copy(T from, T reuse) {
    for (int i = 0; i < arity; i++) {
        Object copy = fieldSerializers[i].copy(from.getField(i));
        reuse.setField(copy, i);
    }
    return reuse;
}

And commenting out line 50 in *CollectionExecutionAccumulatorsTest*:

assertEquals(NUM_ELEMENTS, result.getAccumulatorResult(ACCUMULATOR_NAME));

I hope it helps.
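
For context, here is a minimal self-contained sketch of the two-step pattern used in the fix above (copy into a local `Object`, then call the void setter). The `FakeTuple` class is a hypothetical stand-in for illustration only, not Flink's actual `Tuple`, and the sketch does not reproduce the exact Java 8 type-inference change that triggered the original compile error.

```java
// Sketch of the copy-then-set pattern from the thread, using a hypothetical
// Tuple-like class (NOT Flink's real code): generic getter, void generic setter.
public class CopySketch {

    // Simplified stand-in for a tuple: an Object[] with generic accessors.
    static class FakeTuple {
        private final Object[] fields;

        FakeTuple(int arity) {
            fields = new Object[arity];
        }

        @SuppressWarnings("unchecked")
        <F> F getField(int i) {
            return (F) fields[i];
        }

        <F> void setField(F value, int i) {
            fields[i] = value;
        }
    }

    public static void main(String[] args) {
        FakeTuple from = new FakeTuple(2);
        from.setField("hello", 0);
        from.setField(42, 1);

        FakeTuple reuse = new FakeTuple(2);
        // The pattern from the fix: assign the copied field to a local Object
        // first, then pass that Object to the void setter.
        for (int i = 0; i < 2; i++) {
            Object copy = from.getField(i); // stands in for fieldSerializers[i].copy(...)
            reuse.setField(copy, i);
        }

        System.out.println(reuse.<String>getField(0) + " " + reuse.<Integer>getField(1));
    }
}
```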

On Wed, Feb 3, 2016 at 3:12 PM, Flavio Pompermaier <po...@okkam.it>
wrote:

> I've checked the compiled classes with javap -verbose and indeed they had
> a major.version=51 (Java 7).
> So I've changed the source and target to 1.8 in the main pom.xml, and now
> the generated .class files have major.version=52.
> Unfortunately now I get this error:
>
> [ERROR]
> /opt/flink-src/flink-java/src/main/java/org/apache/flink/api/java/typeutils/runtime/TupleSerializer.java:[104,63]
> incompatible types: void cannot be converted to java.lang.Object
>
> How can I fix it? I also tried to upgrade the maven compiler to 3.5 but it
> didn't help :(
>
> Best,
> Flavio

Re: Flink cluster and Java 8

Posted by Flavio Pompermaier <po...@okkam.it>.
I've checked the compiled classes with javap -verbose, and indeed they had a
major.version=51 (Java 7).
So I've changed the source and target to 1.8 in the main pom.xml, and now the
generated .class files have major.version=52.
Unfortunately, now I get this error:

[ERROR]
/opt/flink-src/flink-java/src/main/java/org/apache/flink/api/java/typeutils/runtime/TupleSerializer.java:[104,63]
incompatible types: void cannot be converted to java.lang.Object

How can I fix it? I also tried upgrading the Maven compiler plugin to 3.5, but
it didn't help :(

Best,
Flavio
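
The major.version that javap -verbose prints (and that UnsupportedClassVersionError reports as "major.minor version 52.0") comes from the class-file header: every .class file begins with the magic number 0xCAFEBABE, followed by a two-byte minor version and a two-byte major version, big-endian (51 = Java 7, 52 = Java 8). A small sketch of reading that header, using synthetic header bytes rather than an actual compiled class:

```java
import java.nio.ByteBuffer;

// Sketch: parse the class-file header that javap -verbose reports.
// Layout: u4 magic (0xCAFEBABE), u2 minor_version, u2 major_version, big-endian.
public class ClassVersion {

    /** Returns the major class-file version from the first 8 bytes of a .class file. */
    static int majorVersion(byte[] classBytes) {
        ByteBuffer buf = ByteBuffer.wrap(classBytes); // ByteBuffer is big-endian by default
        int magic = buf.getInt();
        if (magic != 0xCAFEBABE) {
            throw new IllegalArgumentException("not a Java class file");
        }
        int minor = buf.getShort() & 0xFFFF; // unsigned u2
        int major = buf.getShort() & 0xFFFF;
        return major;
    }

    public static void main(String[] args) {
        // Synthetic header for a Java 8 class: magic, minor 0, major 52.
        byte[] header = ByteBuffer.allocate(8)
                .putInt(0xCAFEBABE)
                .putShort((short) 0)
                .putShort((short) 52)
                .array();
        System.out.println(majorVersion(header)); // 52
    }
}
```

This is also a quick way to see why a cluster JVM rejects a class: a Java 7 JVM refuses anything with a major version above 51.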

On Wed, Feb 3, 2016 at 2:38 PM, Flavio Pompermaier <po...@okkam.it>
wrote:

> Hi to all,
>
> I was trying to make my Java 8 application run on a Flink 0.10.1
> cluster.
> I've compiled both the Flink sources and my app with the same Java
> version (1.8.0_72) and I've set env.java.home to point to my Java 8 JVM
> in every flink-conf.yaml of the cluster.
>
> I always get the following Exception:
>
> java.lang.UnsupportedClassVersionError: XXX: Unsupported major.minor
> version 52.0
>
> Is there any other setting I forgot to check? Do I also have to change the
> source and target to 1.8 in the Maven compiler settings of the main pom?
>
> Best,
> Flavio