Posted to user@flink.apache.org by Cory Monty <co...@getbraintree.com> on 2015/12/07 16:15:26 UTC

Using Flink with Scala 2.11 and Java 8

Is it possible to use Scala 2.11 and Java 8?

I'm able to get our project to compile correctly; however, there are runtime
errors with the Reflectasm library (I'm guessing due to Kryo). I looked
into the error and it seems Spark had the same issue (
https://issues.apache.org/jira/browse/SPARK-6152,
https://github.com/EsotericSoftware/reflectasm/issues/35) because of an
outdated version of Kryo.

I'm also unsure whether we have to build Flink with Scala 2.11 (
https://ci.apache.org/projects/flink/flink-docs-release-0.10/setup/building.html)
in order to run Flink correctly with Java 8.

Cheers,

Cory
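
For reference, a minimal sbt sketch of the kind of setup being asked about, using prebuilt Scala 2.11 Flink artifacts rather than building Flink from source per the linked docs. The _2.11 artifact names and the 0.10.1 version are assumptions based on the 0.10 release line; verify the exact coordinates on Maven Central or in the linked build documentation.

// build.sbt (hedged sketch, not a verified configuration)
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-scala_2.11"           % "0.10.1",
  "org.apache.flink" % "flink-streaming-scala_2.11" % "0.10.1"
)

Even with such a setup on Java 8, the runtime failure discussed below persisted until FLINK-3143 was fixed, since it comes from closure cleaning rather than from the build itself.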

Re: Using Flink with Scala 2.11 and Java 8

Posted by Cory Monty <co...@getbraintree.com>.
Thanks!

On Thu, Dec 10, 2015 at 12:32 PM, Maximilian Michels <mx...@apache.org> wrote:

> Hi Cory,
>
> The issue has been fixed in the master and the latest Maven snapshot.
> https://issues.apache.org/jira/browse/FLINK-3143
>
> Cheers,
> Max

Re: Using Flink with Scala 2.11 and Java 8

Posted by Maximilian Michels <mx...@apache.org>.
Hi Cory,

The issue has been fixed in the master and the latest Maven snapshot.
https://issues.apache.org/jira/browse/FLINK-3143

Cheers,
Max
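
For anyone wanting to pick up the fix ahead of a release, a hedged sbt sketch of depending on the snapshot Max mentions. The 1.0-SNAPSHOT version string and the Apache snapshots repository URL are assumptions; check the Flink documentation for the coordinates of the branch that carries FLINK-3143.

// build.sbt fragment (sketch under the assumptions above)
resolvers += "Apache Snapshots" at "https://repository.apache.org/content/repositories/snapshots/"

libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-scala_2.11"           % "1.0-SNAPSHOT",
  "org.apache.flink" % "flink-streaming-scala_2.11" % "1.0-SNAPSHOT"
)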

On Tue, Dec 8, 2015 at 12:35 PM, Maximilian Michels <mx...@apache.org> wrote:
> Thanks for the stack trace, Cory. Looks like you were on the right
> path with the Spark issue. We will file an issue and correct it soon.
>
> Thanks,
> Max

Re: Using Flink with Scala 2.11 and Java 8

Posted by Maximilian Michels <mx...@apache.org>.
Thanks for the stack trace, Cory. Looks like you were on the right
path with the Spark issue. We will file an issue and correct it soon.

Thanks,
Max

On Mon, Dec 7, 2015 at 8:20 PM, Stephan Ewen <se...@apache.org> wrote:
> Sorry, correcting myself:
>
> The ClosureCleaner uses Kryo's bundled ASM 4 without any reason - simply
> adjusting the imports to use the common ASM (which is 5.0) should do it ;-)

Re: Using Flink with Scala 2.11 and Java 8

Posted by Stephan Ewen <se...@apache.org>.
Sorry, correcting myself:

The ClosureCleaner uses Kryo's bundled ASM 4 without any reason - simply
adjusting the imports to use the common ASM (which is 5.0) should do it ;-)
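
To make the suggested fix concrete, here is a small standalone sketch (not Flink's actual ClosureCleaner code; the object and method names are made up) of reading class bytes with the plain org.objectweb.asm ClassReader at API level ASM5, which understands Java 8 class files, instead of the ASM 4 copy shaded inside reflectasm, whose ClassReader constructor rejects them:

import org.objectweb.asm.{ClassReader, ClassVisitor, MethodVisitor, Opcodes}

object Asm5Probe {

  // Load the raw bytecode of `cls` through the unshaded ASM 5 ClassReader
  // and print the method declarations it finds.
  def listMethods(cls: Class[_]): Unit = {
    val in = cls.getResourceAsStream("/" + cls.getName.replace('.', '/') + ".class")
    val reader = try new ClassReader(in) finally in.close()
    reader.accept(new ClassVisitor(Opcodes.ASM5) { // ASM5 API level, not ASM4
      override def visitMethod(access: Int, name: String, desc: String,
                               signature: String, exceptions: Array[String]): MethodVisitor = {
        println(name + desc)
        null // declarations only, skip the method bodies
      }
    }, ClassReader.SKIP_CODE)
  }

  def main(args: Array[String]): Unit = {
    // On Scala 2.11 a closure compiles to an ordinary anonymous class with a
    // .class resource on the classpath, so this is the interesting case here.
    listMethods(((x: Int) => x + 1).getClass)
  }
}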

On Mon, Dec 7, 2015 at 8:18 PM, Stephan Ewen <se...@apache.org> wrote:

> Flink's own asm is 5.0, but the Kryo version used in Flink bundles
> reflectasm with a dedicated asm version 4 (no lambdas supported).
>
> Might be as simple as bumping the kryo version...

Re: Using Flink with Scala 2.11 and Java 8

Posted by Stephan Ewen <se...@apache.org>.
Flink's own asm is 5.0, but the Kryo version used in Flink bundles
reflectasm with a dedicated asm version 4 (no lambdas supported).

Might be as simple as bumping the kryo version...



On Mon, Dec 7, 2015 at 7:59 PM, Cory Monty <co...@getbraintree.com>
wrote:

> Thanks, Max.
>
> Here is the stack trace I receive:
>
> java.lang.IllegalArgumentException:
>     at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>     at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>     at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>     at org.apache.flink.api.scala.ClosureCleaner$.org$apache$flink$api$scala$ClosureCleaner$$getClassReader(ClosureCleaner.scala:47)
>     at org.apache.flink.api.scala.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:90)
>     at org.apache.flink.api.scala.ClosureCleaner$.clean(ClosureCleaner.scala:113)
>     at org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.scalaClean(StreamExecutionEnvironment.scala:555)
>     at org.apache.flink.streaming.api.scala.DataStream.clean(DataStream.scala:764)
>     at org.apache.flink.streaming.api.scala.DataStream.flatMap(DataStream.scala:473)

Re: Using Flink with Scala 2.11 and Java 8

Posted by Cory Monty <co...@getbraintree.com>.
Thanks, Max.

Here is the stack trace I receive:

java.lang.IllegalArgumentException:
    at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
    at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
    at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
    at org.apache.flink.api.scala.ClosureCleaner$.org$apache$flink$api$scala$ClosureCleaner$$getClassReader(ClosureCleaner.scala:47)
    at org.apache.flink.api.scala.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:90)
    at org.apache.flink.api.scala.ClosureCleaner$.clean(ClosureCleaner.scala:113)
    at org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.scalaClean(StreamExecutionEnvironment.scala:555)
    at org.apache.flink.streaming.api.scala.DataStream.clean(DataStream.scala:764)
    at org.apache.flink.streaming.api.scala.DataStream.flatMap(DataStream.scala:473)


On Mon, Dec 7, 2015 at 11:58 AM, Maximilian Michels <mx...@apache.org> wrote:

> For completeness, could you provide a stack trace of the error message?
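
For context, a minimal sketch of the kind of job that reaches the code path in the stack trace above: any transformation on the Scala DataStream API, such as the flatMap in the last frame, passes its user function through ClosureCleaner, which is where the shaded ASM 4 reader fails under Java 8. The program below is illustrative only; the job name and input are made up.

import org.apache.flink.streaming.api.scala._

object FlatMapRepro {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    env.fromElements("to be", "or not to be")
      .flatMap(line => line.toLowerCase.split("\\W+")) // DataStream.flatMap -> clean -> ClosureCleaner.clean
      .print()

    env.execute("flatMap closure-cleaning repro")
  }
}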

Re: Using Flink with Scala 2.11 and Java 8

Posted by Maximilian Michels <mx...@apache.org>.
For completeness, could you provide a stack trace of the error message?

On Mon, Dec 7, 2015 at 6:56 PM, Maximilian Michels <mx...@apache.org> wrote:
> Hi Cory,
>
> Thanks for reporting the issue. Scala should run independently of the
> Java version. We are already using ASM version 5.0.4. However, some
> code uses the ASM4 op codes which don't seem to work with Java 8.
> This needs to be fixed. I'm filing a JIRA.
>
> Cheers,
> Max

Re: Using Flink with Scala 2.11 and Java 8

Posted by Maximilian Michels <mx...@apache.org>.
Hi Cory,

Thanks for reporting the issue. Scala should run independently of the
Java version. We are already using ASM version 5.0.4. However, some
code uses the ASM4 op codes which don't seem to work with Java 8.
This needs to be fixed. I'm filing a JIRA.

Cheers,
Max
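
As background on why the Java version matters here: compilers targeting Java 8 emit class files with major version 52, which the ASM 4 shaded into reflectasm does not accept, while ASM 5 does; that mismatch is what surfaces as the IllegalArgumentException from ClassReader.<init>. A small illustrative sketch (not from this thread) that prints a class's major version:

import java.io.DataInputStream

object ClassFileVersion {

  // Read the class-file header of `cls`: magic number, then minor and major
  // version. Java 8 class files carry major version 52 (Java 7 uses 51).
  def majorVersionOf(cls: Class[_]): Int = {
    val in = new DataInputStream(
      cls.getResourceAsStream("/" + cls.getName.replace('.', '/') + ".class"))
    try {
      in.readInt()           // magic number 0xCAFEBABE
      in.readUnsignedShort() // minor version
      in.readUnsignedShort() // major version
    } finally in.close()
  }

  def main(args: Array[String]): Unit =
    // Prints 52 when this class is compiled to Java 8 bytecode.
    println(majorVersionOf(ClassFileVersion.getClass))
}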

On Mon, Dec 7, 2015 at 4:15 PM, Cory Monty <co...@getbraintree.com> wrote:
> Is it possible to use Scala 2.11 and Java 8?
>
> I'm able to get our project to compile correctly; however, there are runtime
> errors with the Reflectasm library (I'm guessing due to Kryo). I looked into
> the error and it seems Spark had the same issue
> (https://issues.apache.org/jira/browse/SPARK-6152,
> https://github.com/EsotericSoftware/reflectasm/issues/35) because of an
> outdated version of Kryo.
>
> I'm also unsure whether we have to build Flink with Scala 2.11
> (https://ci.apache.org/projects/flink/flink-docs-release-0.10/setup/building.html)
> in order to run Flink correctly with Java 8.
>
> Cheers,
>
> Cory