Posted to issues@flink.apache.org by DieBauer <gi...@git.apache.org> on 2017/04/08 10:50:15 UTC

[GitHub] flink pull request #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

GitHub user DieBauer opened a pull request:

    https://github.com/apache/flink/pull/3703

    [FLINK-5005] WIP: publish scala 2.12 artifacts

    - [ ] General
      - The pull request references the related JIRA issue ("[FLINK-XXX] Jira title text")
      - The pull request addresses only one issue
      - Each commit in the PR has a meaningful commit message (including the JIRA id)
    
    - [ ] Documentation
      - Documentation has been added for new functionality
      - Old documentation affected by the pull request has been updated
      - JavaDoc for public methods has been added
    
    - [ ] Tests & Build
      - Functionality added by the pull request is covered by tests
      - `mvn clean verify` has been executed successfully locally or a Travis build has passed
    
    
    This is an initial approach to making Flink Scala 2.12-ready.
    
    I've introduced profiles to switch between 2.12, 2.11 and 2.10. All three profiles now compile.
    
    `mvn clean install -D$version` where $version is `scala-2.12`, `scala-2.11` or `scala-2.10`.
    
    To work around the `flakka` artifacts (a custom Akka 2.3 build) not being available for Scala 2.12, I've replaced them with the latest Typesafe Akka artifacts when using the 2.12 profile.
    
    Travis CI profiles have been added and I've changed the initial release script to accommodate 2.12, but this is by no means finished.
    
    I encountered a lot of compilation errors because types could not be inferred, so I've added explicit types to the problematic expressions.
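
    As a purely hypothetical sketch of the shape of those fixes (the actual failing expressions are Flink-specific and not shown here), the change amounts to annotating the lambda parameter type where 2.12 no longer infers it:
    
    ```scala
    // Hypothetical sketch only; the real fixes touch Flink-specific expressions
    // (see e.g. the "Add types to foreach" commit) where 2.12 could not infer the type.
    val words: Seq[String] = Seq("flink", "scala")
    
    // Where 2.11 inferred the lambda's parameter type but 2.12 did not,
    // the parameter is now annotated explicitly:
    words.foreach((w: String) => println(w.toUpperCase))
    ```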
    
    The Kafka 0.10 dependency is bumped to 0.10.1.1 since that's the first version released for 2.12. 
    There is some trickery in the connector parent pom because only kafka-0.10 is released for 2.12; kafka-0.9 and kafka-0.8 aren't compiled for 2.12. I still have to look into that a little more. 
    
    More updated dependencies: 
    javassist was bumped because of `java.lang.IllegalStateException: Failed to transform class with name scala.concurrent.duration.Duration. Reason: javassist.bytecode.InterfaceMethodrefInfo cannot be cast to javassist.bytecode.MethodrefInfo`, which led me to http://stackoverflow.com/questions/31189086/powermock-and-java-8-issue-interfacemethodrefinfo-cannot-be-cast-to-methodrefin#37217871
    
    twitter-chill was bumped to version 0.7.7 for the cross-compiled Scala versions. 
    
    grizzled-slf4j was bumped to version 1.3.0 for Scala 2.12. 
    
    scalatest was bumped to version 3.0.1 for Scala 2.12.
    
    Right now I'm trying to make the Travis build succeed.
    
    Any other suggestions are welcome!
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/DieBauer/flink feature/scala-2.12

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/3703.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #3703
    
----
commit 5ac12faddda03456b5c70fabb0bae30a82104a2e
Author: Jens Kat <je...@gmail.com>
Date:   2017-01-03T21:32:58Z

    initial commit.

commit 1b345a1cb511660e39004ac44685573234a6dca0
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T12:59:17Z

    Merge branch 'master' of https://github.com/apache/flink into feature/scala-2.12
    
    # Conflicts:
    #	pom.xml

commit 7b8c8872f3597986df63c097053abebcf276f861
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T13:05:55Z

    Change shell scripts

commit 61d38b8ee3573df7d09734b6d086c1501363d339
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T16:22:36Z

    part1

commit 7f32cad571fb86618f11ba3cc9f5dd06dbd63f52
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T18:34:06Z

    flink-tests

commit e858db282d6546e7e4afc016b18a28aad231ab04
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T18:51:13Z

    fix compile error
    
    [ERROR] /Users/jens/Development/flink/flink-tests/src/test/java/org/apache/flink/test/javaApiOperators/GroupCombineITCase.java:54: error: not found: type TestExecutionMode
    [INFO]  public GroupCombineITCase(TestExecutionMode mode) {
    
    maybe related to https://github.com/scala/bug/issues/10207 ?

commit 260a8356c180c23484f73b5ab13e5f7cbd5a997f
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T19:02:05Z

    since scala 2.12 use java.util.concurrent.forkjoinpool directly

commit 23f513dbf92121e451ff6929269ee76623b3a931
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T19:42:48Z

    fix ambiguous methods

commit b7d48ecc3c2dd14dd916ee95749d9aab286a88b8
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T20:14:49Z

    take into account that only kafka 0.10 is for scala 2.12

commit 4ac7203add03fcebb185eb1ae105c93e86ba7e28
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T20:31:29Z

    gelly

commit cdeb7dea46c6f2075befa9c964b642a38e6c4723
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T20:35:35Z

    extract version breeze

commit 5c7ea523d2cd2c0443717ea39fc4b4506916508a
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T20:58:48Z

    fix flink ml

commit f96c46a18160ba63f86bc6c6a768099c45a0b070
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-07T21:11:50Z

    add iloopcompat for scala-2.12

commit 0203d7d9723690640c1fcb797fe766c561ce02a3
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T07:35:32Z

    make it compile with 2.12

commit 8187b9ba900c4075233267f4ee65e7cf20bbb98b
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T07:36:22Z

    build all with travis

commit 26daa5fbba8d0c99f92d6cf0677d2c248d287428
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T08:07:58Z

    profiles

commit 20be009b5c935a5c352ae90f4f482c38167c4d38
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T08:22:04Z

    update pom profile

commit 565d82b1fced77f70f2aa409b9eb52e5030d0cf5
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T08:24:24Z

    updated pom profile

commit 1f626508d10966d55b0af5ae2a64ade5f8fcfad9
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T08:28:19Z

    update javassist
    
    java.lang.IllegalStateException: Failed to transform class with name scala.concurrent.duration.Duration. Reason: javassist.bytecode.InterfaceMethodrefInfo cannot be cast to javassist.bytecode.MethodrefInfo
    
    http://stackoverflow.com/questions/31189086/powermock-and-java-8-issue-interfacemethodrefinfo-cannot-be-cast-to-methodrefin#37217871

commit 7c52776ac1187203e5da740b1333cde1275369d3
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T08:53:51Z

    bump scalatest

commit bee792ae7c813109577af964f735a19ea2a0ffcc
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T08:54:10Z

    Add types to foreach

commit 5362eb077dd32c1b1cff53678f79d819765ebd2c
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T08:58:56Z

    make profiles better

commit 6d413df78ef445b0ae966bfef4ffa4fc480ab81b
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T09:07:08Z

    update chill to 2.12 version compat

commit 3bcf3d59ade5eb92fa78a6ec122c7e5c7316b778
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T09:50:01Z

    use jdk8 profile for travis and scala 2.12

commit 6f47d85e76fdf46988a8ad22ef346f3b19ccf6b0
Author: Jens Kat <je...@gmail.com>
Date:   2017-04-08T10:14:28Z

    clean up

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    One thing you can try is to run `TypeExtractionUtils.checkAndExtractLambda` to see whether the closure is a generated serializable lambda.
    In the case of a lambda, you could switch to a different code path (possibly not cleaning anything at all in a first version).
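
    A rough Scala sketch of that idea (assumptions: `checkAndExtractLambda` accepts the user function and returns a non-null result only for generated serializable lambdas; the surrounding cleaning logic is simplified and not the actual implementation):
    
    ```scala
    import org.apache.flink.api.common.functions.Function
    import org.apache.flink.api.java.typeutils.TypeExtractionUtils
    
    // Sketch: detect a Java-8-style serializable lambda before attempting bytecode cleaning.
    def isGeneratedLambda(f: AnyRef): Boolean = f match {
      case func: Function => TypeExtractionUtils.checkAndExtractLambda(func) != null
      case _              => false
    }
    
    def cleanClosure(closure: AnyRef): Unit = {
      if (isGeneratedLambda(closure)) {
        // Generated lambda: there is no anonymous class file to read with ASM,
        // so a first version could simply skip the cleaning step here.
      } else {
        // Fall back to the existing ClassReader/ASM-based cleaning path.
      }
    }
    ```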
    
    @twalthr may have some thoughts on that as well...


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by joan38 <gi...@git.apache.org>.
Github user joan38 commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    So I guess this PR is abandoned?


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by DieBauer <gi...@git.apache.org>.
Github user DieBauer commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    I'm running into an issue with ASM in, for example, the flink-scala module when compiling with 2.12.
    
    ```
    java.io.IOException: Class not found
    	at org.objectweb.asm.ClassReader.a(Unknown Source)
    	at org.objectweb.asm.ClassReader.<init>(Unknown Source)
    	at org.apache.flink.api.scala.ClosureCleaner$.getClassReader(ClosureCleaner.scala:44)
    	at org.apache.flink.api.scala.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:92)
    	at org.apache.flink.api.scala.ClosureCleaner$.clean(ClosureCleaner.scala:115)
    	at org.apache.flink.api.scala.DataSet.clean(DataSet.scala:125)
    	at org.apache.flink.api.scala.DataSet$$anon$12.<init>(DataSet.scala:910)
    ```
    Since the ClosureCleaner was initially copied from Spark, I looked there and found an issue (https://github.com/apache/spark/pull/9512) regarding ASM 5 and Java 8. 
    However, Flink is already using ASM 5 in the ClosureCleaner. Their dependency is
    
    ```
    <groupId>org.apache.xbean</groupId>
    <artifactId>xbean-asm5-shaded</artifactId>
    ```
    and ours comes from `org.ow2.asm:asm`. There are things going on in the shade plugin in the parent pom with regard to relocating the ASM dependencies, but I'm not sure how that all works out. 
    
    So for now, I'm a bit puzzled why we get this error. 
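
    For reference, a minimal sketch (simplified, not the actual Flink code) of the lookup that appears to be failing: the cleaner derives a `.class` resource name from the closure's class and hands the stream to ASM's `ClassReader`, whose constructor throws `java.io.IOException: Class not found` when that stream is null:
    
    ```scala
    import org.objectweb.asm.ClassReader
    
    // Simplified sketch of a getClassReader-style resource lookup.
    def getClassReader(cls: Class[_]): ClassReader = {
      val className = cls.getName.replaceFirst("^.*\\.", "") + ".class"
      val in = cls.getResourceAsStream(className)
      // For a class generated at runtime (e.g. a Java-8-style lambda) there is no class
      // file on the classpath, `in` is null, and the ClassReader constructor fails with
      // "java.io.IOException: Class not found".
      new ClassReader(in)
    }
    ```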
    
    @greghogan you're right, the jdk8 profile only enables the module with the Java 8 examples. But since they are also compiled in the scala-2.11 case, I thought we would want to have them? We can drop it, of course. 



---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by joan38 <gi...@git.apache.org>.
Github user joan38 commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    @StephanEwen Nice! That looks promising.


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by greghogan <gi...@git.apache.org>.
Github user greghogan commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    @DieBauer thanks for taking this on! I haven't been using Flink with Scala but I think this will be important to have for the May release.
    
    The required changes for type inference are interesting. I'm puzzled why this would regress. Also, if developers are writing against 2.10, then these issues will not manifest until integration tests are run (the same problem you are experiencing).
    
    One other thought: since Scala 2.12 requires Java 8, is it still necessary to specify `jdk8` when executing the `scala-2.12` profile?
    
    Flink Forward starts Monday so developer activity will be low this week. @StephanEwen thoughts when you have the chance?


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by joan38 <gi...@git.apache.org>.
Github user joan38 commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    Going in the direction of dropping flakka would be a good win for users of Flink like us who deal every day with SBT/Maven hacking to be able to use vanilla Akka and Flink.


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    We cannot add any other dependencies to the pom files. Adding "akka" back will create a conflict with the "flakka" files.
    
    What we can do is either of the following two options:
      - Release "flakka" for Scala 2.12, in which case we need to change nothing in Flink. The flakka code is at https://github.com/mxm/flakka - we can do the release; you could help us by checking out what needs to be done to use flakka with Scala 2.12 (if it is possible at all).

      - See if we can pull out the dependency as a property and use "flakka" in the Scala 2.10 and 2.11 case, and vanilla Akka 2.4 in the Java 8 / Scala 2.12 case. That would be a lot of Maven hacking, though - if possible, I would prefer the first variant (less complexity in Flink).
    
    We also cannot add more Travis build profiles (builds take too long already). We need to keep that number as it is and simply have one of these profiles use Scala 2.12 rather than, for example, 2.10.


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by joan38 <gi...@git.apache.org>.
Github user joan38 commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    Any news on this?
    2.13.0-M1 is out https://github.com/scala/scala/releases/tag/v2.13.0-M1
    I'm wondering if we will still be on 2.11 when 2.13.0 is out.


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    About 10% of Flink users are still on Java 7 (we did a poll recently).
    Big clusters change slowly...


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by joan38 <gi...@git.apache.org>.
Github user joan38 commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    > Java 7 Reaches End of Life. Oracle ceased public availability of security fixes and upgrades for Java 7 as of April 2015
    We are in 2017.
    
    @StephanEwen but I understand if there are people still using Java 7 👍


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by DieBauer <gi...@git.apache.org>.
Github user DieBauer commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    Hi, I'm sorry for the late reaction. I haven't found the time to work on this anymore (priorities have also shifted...).
    
    Therefore this pull request is stale (it could still be used as a reference).
    
    I think the main challenge is in serialising the Java 8 lambdas. And dropping support for Scala 2.10 and Java 7 certainly helps in taming the pom.xml profiles. 
    
    I will close this pull request so as not to keep hopes up.


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by DieBauer <gi...@git.apache.org>.
Github user DieBauer commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    OK, I started looking into the issue a bit more. 
    It seems like it has to do with the new lambda generation in Scala 2.12 and not with the ASM library. 
    
    From the Scala 2.12.0-M3 release notes (https://github.com/scala/scala/releases/tag/v2.12.0-M3): 
    
    > Java 8 style closure classes
    > 
    > Scala 2.12 emits closures in the same style as Java 8.
    > 
    > For each lambda the compiler generates a method containing the lambda body.
    > At runtime, this method is passed as an argument to the LambdaMetaFactory provided by the JDK, which creates a closure object.
    > 
    > Compared to Scala 2.11, the new scheme has the advantage that the compiler does not generate an anonymous class for each lambda anymore.
    > This leads to significantly smaller JAR files.
    
    
    Our ClosureCleaner uses the class name for instantiating the ClassReader, which is used later on. 
    
    However, since Scala 2.12 doesn't generate anonymous classes, the class file isn't found (null), and therefore we get the class-not-found exception, which makes sense now. 
    
    We have to look into how to circumvent/support this new way of generating lambdas. 
    
    A small technical example: the test class that threw the exception is `AcceptPFTestBase`,
    specifically the line containing `protected val groupedTuples = tuples.groupBy(_._1)`.
    Since `tuples` is a `DataSet`, the function that we have to check is `_._1` (an anonymous function). 
    
    When compiling/executing with Scala 2.11, 
    we get `class org.apache.flink.api.scala.extensions.base.AcceptPFTestBase$$anonfun$1` as `cls` in the `ClosureCleaner.getClassReader` method. 
    This is indeed a class file generated by the Scala compiler, and it can be resolved by `cls.getResourceAsStream(className)`.
    
    However, when using Scala 2.12 
    we get `class org.apache.flink.api.scala.extensions.base.AcceptPFTestBase$$Lambda$11/1489743810`, which does not correspond to an existing class file and cannot be resolved by `cls.getResourceAsStream(className)`. 
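
    A minimal, hypothetical illustration of that difference, independent of Flink:
    
    ```scala
    // Hypothetical snippet: how a function literal's class resolves on 2.11 vs 2.12.
    val f: String => Int = _.length
    val cls = f.getClass
    val resource = cls.getName.replaceFirst("^.*\\.", "") + ".class"
    
    // Scala 2.11: cls is a compiler-generated anonymous class (...$$anonfun$...),
    //             and cls.getResourceAsStream(resource) finds the class file.
    // Scala 2.12: cls is a LambdaMetafactory-generated class (...$$Lambda$N/...),
    //             and cls.getResourceAsStream(resource) returns null.
    println(cls.getName)
    println(cls.getResourceAsStream(resource))
    ```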
    
    
    
    To conclude: in my opinion, the current ClosureCleaner doesn't suffice for the new Scala 2.12 style of lambdas. 
    There is also a Spark issue (https://issues.apache.org/jira/browse/SPARK-14540) regarding closures in Scala 2.12 / Java 8. 
    
    Any thoughts on how to proceed?



---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by frankbohman <gi...@git.apache.org>.
Github user frankbohman commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    Where can I watch the status page that tells us when we can get off of the old 2.11?


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by aljoscha <gi...@git.apache.org>.
Github user aljoscha commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    @DieBauer Do you still want to work on this? I also started trying to make Flink ready for 2.12 before I noticed this older branch. I'd be very happy to stop, though, if you're interested in bringing this to an end. It should be easier now that we dropped Java 7 support and also agreed to drop Scala 2.10 support.


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by ariskk <gi...@git.apache.org>.
Github user ariskk commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    We are really looking forward to this 👍 


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    @joan - I actually agree with you. We needed to use "flakka" to be able to support Java 7 and bind to a wildcard address (across interfaces).
    
    It would be great to be able to do that differently and not have a custom Akka build (at least for Java 8 / Scala 2.11 / 2.12).


---

[GitHub] flink pull request #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by DieBauer <gi...@git.apache.org>.
Github user DieBauer closed the pull request at:

    https://github.com/apache/flink/pull/3703


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by joan38 <gi...@git.apache.org>.
Github user joan38 commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    @greghogan That's pretty good news.
    Thanks for your answer.


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by joan38 <gi...@git.apache.org>.
Github user joan38 commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    Is there any news on this?


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by greghogan <gi...@git.apache.org>.
Github user greghogan commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    @joan38 there has been a discussion on the mailing list about dropping Java 7 support (no one has objected) which will make it simpler to support Scala 2.12 in the upcoming release.


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by fhueske <gi...@git.apache.org>.
Github user fhueske commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    @frankbohman just watch the JIRA issue: https://issues.apache.org/jira/browse/FLINK-5005


---

[GitHub] flink issue #3703: [FLINK-5005] WIP: publish scala 2.12 artifacts

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on the issue:

    https://github.com/apache/flink/pull/3703
  
    @joan38 I have some WIP for a flag that allows you to use vanilla Akka when running on Java 8 / Scala 2.11.
    
    Here is the branch: https://github.com/StephanEwen/incubator-flink/commits/vanilla_akka
    
    You can try to build it via: `mvn clean package -Dscala-2.11 -Pjdk8,vanilla-akka`


---