Posted to dev@flink.apache.org by Niels Basjes <Ni...@basjes.nl> on 2020/07/01 08:50:01 UTC

Re: Flink JDK compatibility problem.

Hi,

> This is difficult to do since you cannot have multiple jdk activations
for a single profile in maven

Well actually you can have multiple versions of JDK activate the java11
profile.
http://maven.apache.org/guides/introduction/introduction-to-profiles.html

quote:

   1. <profile>
   2. <activation>
   3. <jdk>[1.3,1.6)</jdk>
   4. </activation>



So I actually had something really simple in mind:
Have the Java11 profile activate on anything greater or equal to java 11
(i.e. also on java 12,13,14, ...)
And then force it to build java11 code.

Something like this (untested)

   <profile>
        <id>java11</id>
        <activation>
            <jdk>[11,)</jdk>
        </activation>

...
<configuration>
    <source>11</source>
    <target>11</target>
    <release>11</release>
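For completeness, a fuller, well-formed version of the sketch above (still untested; the plugin placement under build/plugins is an assumption, and with `release` set the `source`/`target` settings become redundant):

```xml
<!-- Hypothetical complete java11 profile: activates on JDK 11 and anything
     newer, and pins the compiler to the Java 11 language level and API. -->
<profile>
    <id>java11</id>
    <activation>
        <!-- Maven version range: any JDK >= 11, no upper bound. -->
        <jdk>[11,)</jdk>
    </activation>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <!-- 'release' implies source/target and additionally
                         restricts the visible JDK API to Java 11. -->
                    <release>11</release>
                </configuration>
            </plugin>
        </plugins>
    </build>
</profile>
```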

Niels Basjes




On Tue, Jun 30, 2020 at 7:53 PM Chesnay Schepler <ch...@apache.org> wrote:

> This is difficult to do since you cannot have multiple jdk activations for
> a single profile in maven, and duplicating the entire profile for all jdk
> versions isn't an option.
>
> We _maybe_ could invert the behavior such that the Java 11 behavior is the
> default, with a JDK 8 profile, but there may be some areas where this
> wouldn't work (e.g., adding dependencies in JDK 11), and it may catch
> developers off-guard.
>
> On 30/06/2020 17:57, Niels Basjes wrote:
>
> Hi Chesnay,
>
> Ok, so if someone uses a non-LTS version of Java (like 14) then how about
> simply "pinning" it to the Java 11 compatibility?
> I'm assuming no one uses Java 9 and/or 10 anymore so I'm ignoring those.
> Then building with Java 8 will result in Java 8 code.
> Building with Java 11, 12, 13, 14, ... will result in Java 11 code.
>
> That way the code generated using Java 14 will fail immediately on Java 8
> because of a completely incompatible binary format.
> To me this error would make a lot more sense (I would immediately know
> what I was doing should not work) than the strange error about the
> non-existent method.
>
> Do you agree?
>
> Niels
>
>
>
>
> On Tue, Jun 30, 2020 at 1:15 PM Chesnay Schepler <ch...@apache.org>
> wrote:
>
>> > What is the Java version Apache Flink is supposed to work with?
>>
>> 8 and 11. Non-LTS Java11+ releases _should work_, but we don't put in
>> effort to make it as convenient as for LTS releases. As such you have to
>> manually enable the java11 profile when compiling Flink.
>>
>> I set the target version to 11 since IIRC we ran into more errors that
>> way, ensuring a smoother transition once Java 11 is the default.
>>
>> On 30/06/2020 13:01, Niels Basjes wrote:
>> > Hi,
>> >
>> > I have both JDK 8 and 14 on my system and yesterday I ran into this
>> > exception (I put the info I have in this ticket
>> > https://issues.apache.org/jira/browse/FLINK-18455 ) :
>> >
>> >         java.lang.NoSuchMethodError:
>> > java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
>> >
>> > From digging around (
>> >
>> https://stackoverflow.com/questions/61267495/exception-in-thread-main-java-lang-nosuchmethoderror-java-nio-bytebuffer-flip
>> > )
>> > it seems this is caused when using JDK 9+ without setting '-release 8'
>> and
>> > then running with JDK 8.
>> >
>> > Essentially there are two solutions I see a lot:
>> > 1) Add the -release 8 flag to the compiler
>> > 2) Use the flaky workaround to cast all problem cases to the superclass
>> > implementing the method (i.e. cast to java.nio.Buffer)
>> >
>> > Looking at the actual Flink code I found that
>> >
>> > In a JDK 8 build both source and target are set to Java 8
>> >
>> https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L109
>> >       <java.version>1.8</java.version>
>> >
>> https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L115
>> >      <maven.compiler.source>${java.version}</maven.compiler.source>
>> >      <maven.compiler.target>${java.version}</maven.compiler.target>
>> >
>> > In a JDK 11 build a profile is activated that overrides it to Java 11
>> >
>> https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L938
>> >    <profile>
>> >       <id>java11</id>
>> >       <activation>
>> >           <jdk>11</jdk>
>> >
>> https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L1004
>> >        <artifactId>maven-compiler-plugin</artifactId>
>> >        <configuration>
>> >            <source>11</source>
>> >            <target>11</target>
>> >
>> > So when building with Java 11 the output classes are Java 11 compatible
>> > binaries.
>> >
>> > However I have Java 14 (and the 'java11' profile is only activated at
>> the
>> > EXACT version of java 11) so it stays at source and target 1.8 but does
>> not
>> > specify the "release 8" setting. ... which causes the problems I see.
>> >
>> > Looking at the current build settings I was puzzled a bit and I have
>> this
>> > main question:
>> >
>> >      What is the Java version Apache Flink is supposed to work with?
>> >
>> > Currently I would expect Java 8.
>> >
>> > So what I propose for this problem as a fix is to set source, target and
>> > release to Java 8 for all compiler versions (i.e. also Java 9, 11, 14,
>> ...).
>> > That way you can use any compiler and get the correct results.
>> >
>> > I also am curious if that would fix the tests that seem to fail under
>> Java
>> > 11.
>> >
>> > What do you think is the correct approach for this?
>> >
>>
>>
>
> --
> Best regards / Met vriendelijke groeten,
>
> Niels Basjes
>
>
>

-- 
Best regards / Met vriendelijke groeten,

Niels Basjes

Re: Flink JDK compatibility problem.

Posted by Chesnay Schepler <ch...@apache.org>.
My understanding is that you can only remove the exports by either a) 
removing all usages or b) migrating the project to modules and declare 
the JDK dependencies that way.

a) is not viable AFAIK and in at least one case would require dropping 
features (the JMX reporter), while b) appears to be a massive task; 
modularization for larger projects is not trivial, it does not play 
nicely with the shade-plugin (i.e., our entire packaging paradigm) and 
can bring additional headaches when also trying to be compatible with 
Java 8.

I don't believe this to be a big problem though to be honest; it seems 
unlikely that this "temporary workaround" is really that temporary.

Sooo....sure you can create a JIRA ticket, but I do not see anything we 
can change at the moment.
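To make option b) concrete, a hedged sketch (module name hypothetical). Note that `requires` only makes a JDK module readable; packages like sun.management are not exported by their modules, so the internals would still need --add-exports even after modularization, which is part of what makes b) such a massive task:

```java
// module-info.java -- hypothetical sketch for a modularized flink-core.
// Declares the JDK modules whose *supported* APIs are used; the sun.*
// internals referenced today live in non-exported packages, so a plain
// 'requires' does not grant access to them.
module org.apache.flink.core {
    requires java.management;    // javax.management.* (JMX reporter)
    requires java.rmi;           // RMI registry usage
    requires java.security.jgss; // Kerberos (krb5) related code
}
```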

On 01/07/2020 17:31, Niels Basjes wrote:
> Hi,
>
> I've been fiddling around to try to see if this really works when 
> compiling with Java 14.
> I have the simplest form of this fix working and I've put up a pull 
> request for just that. https://github.com/apache/flink/pull/12799
> I also found that there are two different values for java.version that 
> make the build needlessly hard to debug (for which I created 
> https://issues.apache.org/jira/browse/FLINK-18458 )
>
> What I found is that building under Java 9+ uses a construct that is 
> not 'clean'.
> The build is not set up the way the Java developers intended.
>
> What I found today:
>
> When I simply change this (add the release):
>                  <plugin>
>                    <groupId>org.apache.maven.plugins</groupId>
>                    <artifactId>maven-compiler-plugin</artifactId>
>                    <configuration>
> <!--                     <source>11</source>-->
> <!--                     <target>11</target>-->
>                      <release>11</release>
>                       <compilerArgs combine.children="append">
>                          <arg>--add-exports=java.base/sun.net.util=ALL-UNNAMED</arg>
>                          <arg>--add-exports=java.management/sun.management=ALL-UNNAMED</arg>
>                          <arg>--add-exports=java.rmi/sun.rmi.registry=ALL-UNNAMED</arg>
>                          <arg>--add-exports=java.security.jgss/sun.security.krb5=ALL-UNNAMED</arg>
>                       </compilerArgs>
>                    </configuration>
>                 </plugin>
>
> I get
>
>     *[ERROR] exporting a package from system module java.base is not
>     allowed with --release*
>     *[ERROR] exporting a package from system module java.management is
>     not allowed with --release*
>     *[ERROR] exporting a package from system module java.rmi is not
>     allowed with --release*
>     *[ERROR] exporting a package from system module java.security.jgss
>     is not allowed with --release*
>
>
> When I remove the --add-exports statements I get
>
>     *[ERROR]
>     /home/nbasjes/workspace/Apache/flink/flink-core/src/main/java/org/apache/flink/util/NetUtils.java:[26,20]
>     package sun.net.util does not exist*
>
>
> According to the migration guides the --add-exports is seen as a 
> temporary workaround
>
>   * https://docs.oracle.com/javase/9/migrate/toc.htm#JSMIG-GUID-77874D97-46F3-4DB5-85E4-2ACB5F8D760B
>   * Quote:
>       o Critical internal JDK APIs such as sun.misc.Unsafe are still
>         accessible in JDK 9, but most of the JDK’s internal APIs are
>         not accessible at compile time. You may get compilation errors
>         that indicate that your application or its libraries are
>         dependent on internal APIs.
>       o ...
>       o You may use the --add-exports option as a temporary workaround
>         to compile source code with references to JDK internal classes.
>
> and thus is now no longer allowed in combination with the release flag.
>
>   * https://stackoverflow.com/questions/45370178/exporting-a-package-from-system-module-is-not-allowed-with-release
>   * https://bugs.openjdk.java.net/browse/JDK-8178152
>
>
> Q: Is there an existing ticket for making the build 'clean' for java 
> 9+ ? Or shall I create one?
>
> Niels Basjes
>
>
>
> On Wed, Jul 1, 2020 at 10:57 AM Chesnay Schepler <chesnay@apache.org 
> <ma...@apache.org>> wrote:
>
>     oh cool, yes that should work nicely then.
>
>     On 01/07/2020 10:50, Niels Basjes wrote:
>>     Hi,
>>
>>     > This is difficult to do since you cannot have multiple jdk
>>     activations for a single profile in maven
>>
>>     Well actually you can have multiple versions of JDK activate the
>>     java11 profile.
>>     http://maven.apache.org/guides/introduction/introduction-to-profiles.html
>>
>>     quote:
>>
>>      1. <profile>
>>      2. <activation>
>>      3. <jdk>[1.3,1.6)</jdk>
>>      4. </activation>
>>
>>
>>
>>     So I actually had something really simple in mind:
>>     Have the Java11 profile activate on anything greater or equal to
>>     java 11 (i.e. also on java 12,13,14, ...)
>>     And then force it to build java11 code.
>>
>>     Something like this (untested)
>>
>>        <profile>
>>             <id>java11</id>
>>             <activation>
>>                 <jdk>[11,)</jdk>
>>             </activation>
>>
>>     ...
>>     <configuration>
>>         <source>11</source>
>>         <target>11</target>
>>         <release>11</release>
>>
>>     Niels Basjes
>>
>>
>>
>>
>>     On Tue, Jun 30, 2020 at 7:53 PM Chesnay Schepler
>>     <chesnay@apache.org <ma...@apache.org>> wrote:
>>
>>         This is difficult to do since you cannot have multiple jdk
>>         activations for a single profile in maven, and duplicating
>>         the entire profile for all jdk versions isn't an option.
>>
>>         We _maybe_ could invert the behavior such that the Java 11
>>         behavior is the default, with a JDK 8 profile, but there may
>>         be some areas where this wouldn't work (e.g., adding
>>         dependencies in JDK 11), and it may catch developers off-guard.
>>
>>         On 30/06/2020 17:57, Niels Basjes wrote:
>>>         Hi Chesnay,
>>>
>>>         Ok, so if someone uses a non-LTS version of Java (like 14)
>>>         then how about simply "pinning" it to the Java 11 compatibility?
>>>         I'm assuming no one uses Java 9 and/or 10 anymore so I'm
>>>         ignoring those.
>>>         Then building with Java 8 will result in Java 8 code.
>>>         Building with Java 11, 12, 13, 14, ... will result in Java
>>>         11 code.
>>>
>>>         That way the code generated using Java 14 will fail
>>>         immediately on Java 8 because of a completely
>>>         incompatible binary format.
>>>         To me this error would make a lot more sense (I would
>>>         immediately know what I was doing should not work) than the
>>>         strange error about the non-existent method.
>>>
>>>         Do you agree?
>>>
>>>         Niels
>>>
>>>
>>>
>>>
>>>         On Tue, Jun 30, 2020 at 1:15 PM Chesnay Schepler
>>>         <chesnay@apache.org <ma...@apache.org>> wrote:
>>>
>>>             > What is the Java version Apache Flink is supposed to
>>>             work with?
>>>
>>>             8 and 11. Non-LTS Java11+ releases _should work_, but we
>>>             don't put in
>>>             effort to make it as convenient as for LTS releases. As
>>>             such you have to
>>>             manually enable the java11 profile when compiling Flink.
>>>
>>>             I set the target version to 11 since IIRC we ran into
>>>             more errors that
>>>             way, ensuring a smoother transition once Java 11 is the
>>>             default.
>>>
>>>             On 30/06/2020 13:01, Niels Basjes wrote:
>>>             > Hi,
>>>             >
>>>             > I have both JDK 8 and 14 on my system and yesterday I
>>>             ran into this
>>>             > exception (I put the info I have in this ticket
>>>             > https://issues.apache.org/jira/browse/FLINK-18455 ) :
>>>             >
>>>             >         java.lang.NoSuchMethodError:
>>>             > java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
>>>             >
>>>             > From digging around (
>>>             >
>>>             https://stackoverflow.com/questions/61267495/exception-in-thread-main-java-lang-nosuchmethoderror-java-nio-bytebuffer-flip
>>>             > )
>>>             > it seems this is caused when using JDK 9+ without
>>>             setting '-release 8' and
>>>             > then running with JDK 8.
>>>             >
>>>             > Essentially there are two solutions I see a lot:
>>>             > 1) Add the -release 8 flag to the compiler
>>>             > 2) Use the flaky workaround to cast all problem cases
>>>             to the superclass
>>>             > implementing the method (i.e. cast to java.nio.Buffer)
>>>             >
>>>             > Looking at the actual Flink code I found that
>>>             >
>>>             > In a JDK 8 build both source and target are set to Java 8
>>>             >
>>>             https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L109
>>>             >  <java.version>1.8</java.version>
>>>             >
>>>             https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L115
>>>             >
>>>             <maven.compiler.source>${java.version}</maven.compiler.source>
>>>             >
>>>             <maven.compiler.target>${java.version}</maven.compiler.target>
>>>             >
>>>             > In a JDK 11 build a profile is activated that
>>>             overrides it to Java 11
>>>             >
>>>             https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L938
>>>             >    <profile>
>>>             >       <id>java11</id>
>>>             >       <activation>
>>>             >           <jdk>11</jdk>
>>>             >
>>>             https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L1004
>>>             > <artifactId>maven-compiler-plugin</artifactId>
>>>             >        <configuration>
>>>             > <source>11</source>
>>>             > <target>11</target>
>>>             >
>>>             > So when building with Java 11 the output classes are
>>>             Java 11 compatible
>>>             > binaries.
>>>             >
>>>             > However I have Java 14 (and the 'java11' profile is
>>>             only activated at the
>>>             > EXACT version of java 11) so it stays at source and
>>>             target 1.8 but does not
>>>             > specify the "release 8" setting. ... which causes the
>>>             problems I see.
>>>             >
>>>             > Looking at the current build settings I was puzzled a
>>>             bit and I have this
>>>             > main question:
>>>             >
>>>             >      What is the Java version Apache Flink is supposed
>>>             to work with?
>>>             >
>>>             > Currently I would expect Java 8.
>>>             >
>>>             > So what I propose for this problem as a fix is to set
>>>             source, target and
>>>             > release to Java 8 for all compiler versions (i.e. also
>>>             Java 9, 11, 14, ...).
>>>             > That way you can use any compiler and get the correct
>>>             results.
>>>             >
>>>             > I also am curious if that would fix the tests that
>>>             seem to fail under Java
>>>             > 11.
>>>             >
>>>             > What do you think is the correct approach for this?
>>>             >
>>>
>>>
>>>
>>>         -- 
>>>         Best regards / Met vriendelijke groeten,
>>>
>>>         Niels Basjes
>>
>>
>>
>>
>>     -- 
>>     Best regards / Met vriendelijke groeten,
>>
>>     Niels Basjes
>
>
>
>
> -- 
> Best regards / Met vriendelijke groeten,
>
> Niels Basjes



Re: Flink JDK compatibility problem.

Posted by Niels Basjes <Ni...@basjes.nl>.
Hi,

I've been fiddling around to try to see if this really works when compiling
with Java 14.
I have the simplest form of this fix working and I've put up a pull request
for just that. https://github.com/apache/flink/pull/12799
I also found that there are two different values for java.version that make
the build needlessly hard to debug (for which I created
https://issues.apache.org/jira/browse/FLINK-18458 )

What I found is that building under Java 9+ uses a construct that is not
'clean'.
The build is not set up the way the Java developers intended.

What I found today:

When I simply change this (add the release):

                <plugin>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-compiler-plugin</artifactId>
                  <configuration>
<!--                     <source>11</source>-->
<!--                     <target>11</target>-->
                     <release>11</release>
                     <compilerArgs combine.children="append">
                        <arg>--add-exports=java.base/sun.net.util=ALL-UNNAMED</arg>
                        <arg>--add-exports=java.management/sun.management=ALL-UNNAMED</arg>
                        <arg>--add-exports=java.rmi/sun.rmi.registry=ALL-UNNAMED</arg>
                        <arg>--add-exports=java.security.jgss/sun.security.krb5=ALL-UNNAMED</arg>
                     </compilerArgs>
                  </configuration>
               </plugin>


I get

*[ERROR] exporting a package from system module java.base is not allowed
with --release*
*[ERROR] exporting a package from system module java.management is not
allowed with --release*
*[ERROR] exporting a package from system module java.rmi is not allowed
with --release*
*[ERROR] exporting a package from system module java.security.jgss is not
allowed with --release*


When I remove the --add-exports statements I get

*[ERROR]
/home/nbasjes/workspace/Apache/flink/flink-core/src/main/java/org/apache/flink/util/NetUtils.java:[26,20]
package sun.net.util does not exist*


According to the migration guides the --add-exports is seen as a temporary
workaround

   - https://docs.oracle.com/javase/9/migrate/toc.htm#JSMIG-GUID-77874D97-46F3-4DB5-85E4-2ACB5F8D760B
   - Quote:
      - Critical internal JDK APIs such as sun.misc.Unsafe are still
      accessible in JDK 9, but most of the JDK’s internal APIs are not
      accessible at compile time. You may get compilation errors that
      indicate that your application or its libraries are dependent on
      internal APIs.
      - ...
      - You may use the --add-exports option as a temporary workaround to
      compile source code with references to JDK internal classes.

and thus is now no longer allowed in combination with the release flag.

   - https://stackoverflow.com/questions/45370178/exporting-a-package-from-system-module-is-not-allowed-with-release
   - https://bugs.openjdk.java.net/browse/JDK-8178152


Q: Is there an existing ticket for making the build 'clean' for Java 9+?
Or shall I create one?

Niels Basjes



On Wed, Jul 1, 2020 at 10:57 AM Chesnay Schepler <ch...@apache.org> wrote:

> oh cool, yes that should work nicely then.
>
> On 01/07/2020 10:50, Niels Basjes wrote:
>
> Hi,
>
> > This is difficult to do since you cannot have multiple jdk activations
> for a single profile in maven
>
> Well actually you can have multiple versions of JDK activate the java11
> profile.
> http://maven.apache.org/guides/introduction/introduction-to-profiles.html
>
> quote:
>
>    1. <profile>
>    2. <activation>
>    3. <jdk>[1.3,1.6)</jdk>
>    4. </activation>
>
>
>
> So I actually had something really simple in mind:
> Have the Java11 profile activate on anything greater or equal to java 11
> (i.e. also on java 12,13,14, ...)
> And then force it to build java11 code.
>
> Something like this (untested)
>
>    <profile>
>         <id>java11</id>
>         <activation>
>             <jdk>[11,)</jdk>
>         </activation>
>
> ...
> <configuration>
>     <source>11</source>
>     <target>11</target>
>     <release>11</release>
>
> Niels Basjes
>
>
>
>
> On Tue, Jun 30, 2020 at 7:53 PM Chesnay Schepler <ch...@apache.org>
> wrote:
>
>> This is difficult to do since you cannot have multiple jdk activations
>> for a single profile in maven, and duplicating the entire profile for all
>> jdk versions isn't an option.
>>
>> We _maybe_ could invert the behavior such that the Java 11 behavior is
>> the default, with a JDK 8 profile, but there may be some areas where this
>> wouldn't work (e.g., adding dependencies in JDK 11), and it may catch
>> developers off-guard.
>>
>> On 30/06/2020 17:57, Niels Basjes wrote:
>>
>> Hi Chesnay,
>>
>> Ok, so if someone uses a non-LTS version of Java (like 14) then how about
>> simply "pinning" it to the Java 11 compatibility?
>> I'm assuming no one uses Java 9 and/or 10 anymore so I'm ignoring those.
>> Then building with Java 8 will result in Java 8 code.
>> Building with Java 11, 12, 13, 14, ... will result in Java 11 code.
>>
>> That way the code generated using Java 14 will fail immediately on Java 8
>> because of a completely incompatible binary format.
>> To me this error would make a lot more sense (I would immediately know
>> what I was doing should not work) than the strange error about the
>> non-existent method.
>>
>> Do you agree?
>>
>> Niels
>>
>>
>>
>>
>> On Tue, Jun 30, 2020 at 1:15 PM Chesnay Schepler <ch...@apache.org>
>> wrote:
>>
>>> > What is the Java version Apache Flink is supposed to work with?
>>>
>>> 8 and 11. Non-LTS Java11+ releases _should work_, but we don't put in
>>> effort to make it as convenient as for LTS releases. As such you have to
>>> manually enable the java11 profile when compiling Flink.
>>>
>>> I set the target version to 11 since IIRC we ran into more errors that
>>> way, ensuring a smoother transition once Java 11 is the default.
>>>
>>> On 30/06/2020 13:01, Niels Basjes wrote:
>>> > Hi,
>>> >
>>> > I have both JDK 8 and 14 on my system and yesterday I ran into this
>>> > exception (I put the info I have in this ticket
>>> > https://issues.apache.org/jira/browse/FLINK-18455 ) :
>>> >
>>> >         java.lang.NoSuchMethodError:
>>> > java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
>>> >
>>> > From digging around (
>>> >
>>> https://stackoverflow.com/questions/61267495/exception-in-thread-main-java-lang-nosuchmethoderror-java-nio-bytebuffer-flip
>>> > )
>>> > it seems this is caused when using JDK 9+ without setting '-release 8'
>>> and
>>> > then running with JDK 8.
>>> >
>>> > Essentially there are two solutions I see a lot:
>>> > 1) Add the -release 8 flag to the compiler
>>> > 2) Use the flaky workaround to cast all problem cases to the superclass
>>> > implementing the method (i.e. cast to java.nio.Buffer)
>>> >
>>> > Looking at the actual Flink code I found that
>>> >
>>> > In a JDK 8 build both source and target are set to Java 8
>>> >
>>> https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L109
>>> >       <java.version>1.8</java.version>
>>> >
>>> https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L115
>>> >      <maven.compiler.source>${java.version}</maven.compiler.source>
>>> >      <maven.compiler.target>${java.version}</maven.compiler.target>
>>> >
>>> > In a JDK 11 build a profile is activated that overrides it to Java 11
>>> >
>>> https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L938
>>> >    <profile>
>>> >       <id>java11</id>
>>> >       <activation>
>>> >           <jdk>11</jdk>
>>> >
>>> https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L1004
>>> >        <artifactId>maven-compiler-plugin</artifactId>
>>> >        <configuration>
>>> >            <source>11</source>
>>> >            <target>11</target>
>>> >
>>> > So when building with Java 11 the output classes are Java 11 compatible
>>> > binaries.
>>> >
>>> > However I have Java 14 (and the 'java11' profile is only activated at
>>> the
>>> > EXACT version of java 11) so it stays at source and target 1.8 but
>>> does not
>>> > specify the "release 8" setting. ... which causes the problems I see.
>>> >
>>> > Looking at the current build settings I was puzzled a bit and I have
>>> this
>>> > main question:
>>> >
>>> >      What is the Java version Apache Flink is supposed to work with?
>>> >
>>> > Currently I would expect Java 8.
>>> >
>>> > So what I propose for this problem as a fix is to set source, target
>>> and
>>> > release to Java 8 for all compiler versions (i.e. also Java 9, 11, 14,
>>> ...).
>>> > That way you can use any compiler and get the correct results.
>>> >
>>> > I also am curious if that would fix the tests that seem to fail under
>>> Java
>>> > 11.
>>> >
>>> > What do you think is the correct approach for this?
>>> >
>>>
>>>
>>
>> --
>> Best regards / Met vriendelijke groeten,
>>
>> Niels Basjes
>>
>>
>>
>
> --
> Best regards / Met vriendelijke groeten,
>
> Niels Basjes
>
>
>

-- 
Best regards / Met vriendelijke groeten,

Niels Basjes

Re: Flink JDK compatibility problem.

Posted by Chesnay Schepler <ch...@apache.org>.
oh cool, yes that should work nicely then.

On 01/07/2020 10:50, Niels Basjes wrote:
> Hi,
>
> > This is difficult to do since you cannot have multiple jdk 
> activations for a single profile in maven
>
> Well actually you can have multiple versions of JDK activate the 
> java11 profile.
> http://maven.apache.org/guides/introduction/introduction-to-profiles.html
>
> quote:
>
>  1. <profile>
>  2. <activation>
>  3. <jdk>[1.3,1.6)</jdk>
>  4. </activation>
>
>
>
> So I actually had something really simple in mind:
> Have the Java11 profile activate on anything greater or equal to java 
> 11 (i.e. also on java 12,13,14, ...)
> And then force it to build java11 code.
>
> Something like this (untested)
>
>    <profile>
>         <id>java11</id>
>         <activation>
>             <jdk>[11,)</jdk>
>         </activation>
>
> ...
> <configuration>
>     <source>11</source>
>     <target>11</target>
>     <release>11</release>
>
> Niels Basjes
>
>
>
>
> On Tue, Jun 30, 2020 at 7:53 PM Chesnay Schepler <chesnay@apache.org 
> <ma...@apache.org>> wrote:
>
>     This is difficult to do since you cannot have multiple jdk
>     activations for a single profile in maven, and duplicating the
>     entire profile for all jdk versions isn't an option.
>
>     We _maybe_ could invert the behavior such that the Java 11
>     behavior is the default, with a JDK 8 profile, but there may be
>     some areas where this wouldn't work (e.g., adding dependencies in
>     JDK 11), and it may catch developers off-guard.
>
>     On 30/06/2020 17:57, Niels Basjes wrote:
>>     Hi Chesnay,
>>
>>     Ok, so if someone uses a non-LTS version of Java (like 14) then
>>     how about simply "pinning" it to the Java 11 compatibility?
>>     I'm assuming no one uses Java 9 and/or 10 anymore so I'm ignoring
>>     those.
>>     Then building with Java 8 will result in Java 8 code.
>>     Building with Java 11, 12, 13, 14, ... will result in Java 11 code.
>>
>>     That way the code generated using Java 14 will fail immediately
>>     on Java 8 because of a completely incompatible binary format.
>>     To me this error would make a lot more sense (I would immediately
>>     know what I was doing should not work) than the strange error
>>     about the non-existent method.
>>
>>     Do you agree?
>>
>>     Niels
>>
>>
>>
>>
>>     On Tue, Jun 30, 2020 at 1:15 PM Chesnay Schepler
>>     <chesnay@apache.org <ma...@apache.org>> wrote:
>>
>>         > What is the Java version Apache Flink is supposed to work with?
>>
>>         8 and 11. Non-LTS Java11+ releases _should work_, but we
>>         don't put in
>>         effort to make it as convenient as for LTS releases. As such
>>         you have to
>>         manually enable the java11 profile when compiling Flink.
>>
>>         I set the target version to 11 since IIRC we ran into more
>>         errors that
>>         way, ensuring a smoother transition once Java 11 is the default.
>>
>>         On 30/06/2020 13:01, Niels Basjes wrote:
>>         > Hi,
>>         >
>>         > I have both JDK 8 and 14 on my system and yesterday I ran
>>         into this
>>         > exception (I put the info I have in this ticket
>>         > https://issues.apache.org/jira/browse/FLINK-18455 ) :
>>         >
>>         >         java.lang.NoSuchMethodError:
>>         > java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
>>         >
>>         > From digging around (
>>         >
>>         https://stackoverflow.com/questions/61267495/exception-in-thread-main-java-lang-nosuchmethoderror-java-nio-bytebuffer-flip
>>         > )
>>         > it seems this is caused when using JDK 9+ without setting
>>         '-release 8' and
>>         > then running with JDK 8.
>>         >
>>         > Essentially there are two solutions I see a lot:
>>         > 1) Add the -release 8 flag to the compiler
>>         > 2) Use the flaky workaround to cast all problem cases to
>>         the superclass
>>         > implementing the method (i.e. cast to java.nio.Buffer)
>>         >
>>         > Looking at the actual Flink code I found that
>>         >
>>         > In a JDK 8 build both source and target are set to Java 8
>>         >
>>         https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L109
>>         >  <java.version>1.8</java.version>
>>         >
>>         https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L115
>>         > <maven.compiler.source>${java.version}</maven.compiler.source>
>>         > <maven.compiler.target>${java.version}</maven.compiler.target>
>>         >
>>         > In a JDK 11 build a profile is activated that overrides it
>>         to Java 11
>>         >
>>         https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L938
>>         >    <profile>
>>         >       <id>java11</id>
>>         >       <activation>
>>         >           <jdk>11</jdk>
>>         >
>>         https://github.com/apache/flink/blob/d735d8cd8e5d9fae5322001099097581822453ae/pom.xml#L1004
>>         > <artifactId>maven-compiler-plugin</artifactId>
>>         >        <configuration>
>>         >            <source>11</source>
>>         >            <target>11</target>
>>         >
>>         > So when building with Java 11 the output classes are Java
>>         11 compatible
>>         > binaries.
>>         >
>>         > However I have Java 14 (and the 'java11' profile is only
>>         activated at the
>>         > EXACT version of java 11) so it stays at source and target
>>         1.8 but does not
>>         > specify the "release 8" setting. ... which causes the
>>         problems I see.
>>         >
>>         > Looking at the current build settings I was puzzled a bit
>>         and I have this
>>         > main question:
>>         >
>>         >      What is the Java version Apache Flink is supposed to
>>         work with?
>>         >
>>         > Currently I would expect Java 8.
>>         >
>>         > So what I propose for this problem as a fix is to set
>>         source, target and
>>         > release to Java 8 for all compiler versions (i.e. also Java
>>         9, 11, 14, ...).
>>         > That way you can use any compiler and get the correct results.
>>         >
>>         > I also am curious if that would fix the tests that seem to
>>         fail under Java
>>         > 11.
>>         >
>>         > What do you think is the correct approach for this?
>>         >
>>
>>
>>
>>     -- 
>>     Best regards / Met vriendelijke groeten,
>>
>>     Niels Basjes
>
>
>
>
> -- 
> Best regards / Met vriendelijke groeten,
>
> Niels Basjes