Posted to dev@daffodil.apache.org by "Interrante, John A (GE Research, US)" <in...@research.ge.com> on 2020/09/30 18:20:42 UTC

Can Daffodil drop support for Scala 2.11?

I'd like to ask the Apache Daffodil developers to weigh in on this question:

                Does Daffodil need to run on Scala 2.11 anymore?

I've been told the only reason we're still publishing Scala 2.11 builds is Apache Spark, which for a long time worked only on Scala 2.11 even after Scala 2.12 came out.  However, the Spark 2.4 releases have run on both 2.11 and 2.12 since November 2018, and Scala 2.12 has become the default language for the Spark 3.0 releases.  In fact, the Spark 3.0 releases have removed support for 2.11, although they have not yet completed all of the changes needed to support Scala 2.13.

Does anyone know of any reasons why Daffodil needs to continue building on Scala 2.11?  My motivation for asking is that my pull request uses an open source library called os-lib in the runtime2 backend, but os-lib has not published any new Scala 2.11 builds since March 2019.
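
For concreteness, here is a minimal sketch of what the change might look like in an sbt build.  The setting values and the os-lib coordinates below are illustrative; the details in Daffodil's actual build.sbt may differ.

    // build.sbt (illustrative sketch, not Daffodil's actual build file).
    // Dropping 2.11 just means removing it from the cross-build list:
    //   before: crossScalaVersions := Seq("2.12.11", "2.11.12")
    crossScalaVersions := Seq("2.12.11")
    scalaVersion := crossScalaVersions.value.head

    // %% appends the Scala binary version to the artifact name, so this
    // resolves os-lib_2.12 (still published) rather than os-lib_2.11
    // (no new builds since March 2019).  The version is an example.
    libraryDependencies += "com.lihaoyi" %% "os-lib" % "0.7.1"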

John

Re: Can Daffodil drop support for Scala 2.11?

Posted by Steve Lawrence <sl...@apache.org>.
+1 from me.

Sounds like Spark isn't an issue anymore.

Also, note that the last Scala 2.11.x release was 2.11.12, at the end
of 2017 [1]. So there's been plenty of time for tools to upgrade.  And
it sounds like there are no plans to create a 2.11.13 release [2].

- Steve

[1] https://github.com/scala/scala/releases/tag/v2.11.12
[2] https://github.com/scala/scala-dev/issues/451

Re: Can Daffodil drop support for Scala 2.11?

Posted by "Beckerle, Mike" <mb...@owlcyberdefense.com>.
There are actually some code conditionalizations for 2.11 vs. 2.12 that could go away if we get rid of 2.11, which would be an added benefit.
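
(For anyone unfamiliar with how that conditionalization is typically done: the usual sbt pattern is a per-Scala-version source directory, sketched below.  This is the conventional layout, not necessarily Daffodil's exact one.)

    // Conventional sbt pattern for version-specific code (a sketch, not
    // necessarily Daffodil's exact setup): compile an extra
    // src/main/scala-2.11 or src/main/scala-2.12 directory depending on
    // the Scala binary version, each holding its own copy of the few
    // incompatible classes.
    Compile / unmanagedSourceDirectories += {
      val sourceDir = (Compile / scalaSource).value.getParentFile
      CrossVersion.partialVersion(scalaVersion.value) match {
        case Some((2, 11)) => sourceDir / "scala-2.11"
        case _             => sourceDir / "scala-2.12"
      }
    }
    // Dropping 2.11 lets the scala-2.11 copies, and this special case,
    // be deleted.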

I would support getting rid of 2.11.

I did use an Apache Zeppelin notebook with Daffodil that needed 2.11. It required 2.11 because it was integrated with an older revision of Apache Spark that was still restricted to 2.11. So it's not enough that Apache Spark itself supports 2.12; things like Zeppelin that use Spark must also have moved from the 2.11 to the 2.12 version of Spark.

I just checked, however, and the Apache Zeppelin source tree's spark folder has 2.10, 2.11, and 2.12 subdirectories, so I think this is not an issue.
