Posted to dev@spark.apache.org by DB Tsai <d_...@apple.com> on 2018/11/06 19:12:58 UTC

Make Scala 2.12 as default Scala version in Spark 3.0

We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next Spark version will be 3.0, so it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0.

Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is unlikely to be supported on Scala 2.11 unless we're willing to sponsor the needed work, per the discussion in the Scala community: https://github.com/scala/scala-dev/issues/559#issuecomment-436160166

We have initial support for Scala 2.12 in Spark 2.4. If we decide to make Scala 2.12 the default for Spark 3.0 now, we will have ample time to work on the bugs and issues that we may run into.
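
For concreteness, here is a minimal sbt sketch of what the default actually touches for users (the version numbers are illustrative only): since sbt's %% operator appends the Scala binary suffix to the artifact name, a user's scalaVersion picks spark-sql_2.11 vs spark-sql_2.12, and the project default mostly decides which suffix the convenience binaries, docs, and CI assume.

    // build.sbt sketch -- version numbers are illustrative only
    scalaVersion := "2.12.7"              // or "2.11.12" for a Scala 2.11 build

    // %% resolves to spark-sql_2.12 (or spark-sql_2.11) based on scalaVersion
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0" % "provided"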

What do you think?

Thanks, 

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Dongjoon Hyun <do...@gmail.com>.
+1 for making Scala 2.12 the default for Spark 3.0.

Bests,
Dongjoon.


On Tue, Nov 6, 2018 at 11:13 AM DB Tsai <d_...@apple.com> wrote:

> We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next
> Spark version will be 3.0, so it's a great time to discuss whether we should
> make Scala 2.12 the default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is
> unlikely to be supported on Scala 2.11 unless we're willing to sponsor the
> needed work, per the discussion in the Scala community:
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support for Scala 2.12 in Spark 2.4. If we decide to make
> Scala 2.12 the default for Spark 3.0 now, we will have ample time to work on
> the bugs and issues that we may run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by DB Tsai <db...@dbtsai.com.INVALID>.
Most of the time in the PR build is spent running tests. How about we
also add a Scala 2.11 compilation pass, for both main and test sources
but without running the tests, to the PR build?
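
As a rough sketch of what I mean (hypothetical; not the actual PRB configuration), in sbt terms the extra pass is just a compile of main and test sources with the Scala version switched:

    // build.sbt sketch -- assumes both versions stay in crossScalaVersions
    crossScalaVersions := Seq("2.12.7", "2.11.12")

    // the PR build would then run something along the lines of
    //   build/sbt "++2.11.12" compile test:compile
    // i.e. compile main and test classes for 2.11 without executing any tests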

Sincerely,

DB Tsai
----------------------------------------------------------
Web: https://www.dbtsai.com
PGP Key ID: 0x5CED8B896A6BDFA0

On Fri, Nov 16, 2018 at 10:09 PM Marcelo Vanzin
<va...@cloudera.com.invalid> wrote:
>
> Now that the switch to 2.12 by default has been made, it might be good
> to have a serious discussion about dropping 2.11 altogether. Many of
> the main arguments have already been talked about. But I don't
> remember anyone mentioning how easy it would be to break the 2.11
> build now.
>
> For example, the following works fine in 2.12 but breaks in 2.11:
>
> java.util.Arrays.asList("hi").stream().forEach(println)
>
> We had a similar issue when we supported java 1.6 but the builds were
> all on 1.7 by default. Every once in a while something would silently
> break, because PR builds only check the default. And the jenkins
> builds, which are less monitored, would stay broken for a while.
>
> On Tue, Nov 6, 2018 at 11:13 AM DB Tsai <d_...@apple.com> wrote:
> >
> > We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next Spark version will be 3.0, so it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0.
> >
> > Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is unlikely to be supported on Scala 2.11 unless we're willing to sponsor the needed work, per the discussion in the Scala community: https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
> >
> > We have initial support for Scala 2.12 in Spark 2.4. If we decide to make Scala 2.12 the default for Spark 3.0 now, we will have ample time to work on the bugs and issues that we may run into.
> >
> > What do you think?
> >
> > Thanks,
> >
> > DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
> >
> >
> > ---------------------------------------------------------------------
> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >
>
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Justin Miller <ju...@gospotcheck.com>.
I'd add that if folks rely on Twitter libraries in their stack, they might be
stuck on older versions of those libraries for a while, which might require
them to stay on 2.11 for longer than they might otherwise like.

On Friday, November 16, 2018, Marcelo Vanzin <va...@cloudera.com.invalid>
wrote:

> Now that the switch to 2.12 by default has been made, it might be good
> to have a serious discussion about dropping 2.11 altogether. Many of
> the main arguments have already been talked about. But I don't
> remember anyone mentioning how easy it would be to break the 2.11
> build now.
>
> For example, the following works fine in 2.12 but breaks in 2.11:
>
> java.util.Arrays.asList("hi").stream().forEach(println)
>
> We had a similar issue when we supported java 1.6 but the builds were
> all on 1.7 by default. Every once in a while something would silently
> break, because PR builds only check the default. And the jenkins
> builds, which are less monitored, would stay broken for a while.
>
> On Tue, Nov 6, 2018 at 11:13 AM DB Tsai <d_...@apple.com> wrote:
> >
> > We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next
> Spark version will be 3.0, so it's a great time to discuss whether we should
> make Scala 2.12 the default Scala version in Spark 3.0.
> >
> > Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is
> unlikely to be supported on Scala 2.11 unless we're willing to sponsor the
> needed work, per the discussion in the Scala community:
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
> >
> > We have initial support for Scala 2.12 in Spark 2.4. If we decide to make
> Scala 2.12 the default for Spark 3.0 now, we will have ample time to work on
> the bugs and issues that we may run into.
> >
> > What do you think?
> >
> > Thanks,
> >
> > DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
> >
> >
> > ---------------------------------------------------------------------
> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >
>
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

-- 

Justin Miller
Senior Data Engineer
*GoSpotCheck*
Direct: 720-517-3979 <+17205173979>
Email: justin@gospotcheck.com


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by shane knapp <sk...@berkeley.edu>.
>
>
> Maintaining a separate PR builder for 2.11 isn't so bad
>

i actually beg to differ...  it's more of a PITA than you might realize to
manage more than one PRB (we have two already).

a much better solution would be for the test launching code, either in the
PRB config or in scripts in the repo, to manage this.
-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by DB Tsai <db...@dbtsai.com.INVALID>.
+1 on removing Scala 2.11 support for 3.0, given that Scala 2.11 is already EOL.

On Tue, Nov 20, 2018 at 2:53 PM Sean Owen <sr...@apache.org> wrote:

> PS: pull request at https://github.com/apache/spark/pull/23098
> Not going to merge it until there's clear agreement.
>
> On Tue, Nov 20, 2018 at 10:16 AM Ryan Blue <rb...@netflix.com> wrote:
> >
> > +1 to removing 2.11 support for 3.0 and a PR.
> >
> > It sounds like having multiple Scala builds is just not feasible and I
> don't think this will be too disruptive for users since it is already a
> breaking change.
> >
> > On Tue, Nov 20, 2018 at 7:05 AM Sean Owen <sr...@apache.org> wrote:
> >>
> >> One more data point -- from looking at the SBT build yesterday, it
> >> seems like most plugin updates require SBT 1.x. And both they and SBT
> >> 1.x seem to need Scala 2.12. And the new zinc also does.
> >> Now, the current SBT and zinc and plugins all appear to work OK with
> >> 2.12 now, but updating will pretty much have to wait until 2.11
> >> support goes. (I don't think it's feasible to have two SBT builds.)
> >>
> >> I actually haven't heard an argument for keeping 2.11, compared to the
> >> overhead of maintaining it. Any substantive objections? Would it be
> >> too forward to put out a WIP PR that removes it?
> >>
> >> On Sat, Nov 17, 2018 at 7:28 PM Sean Owen <sr...@apache.org> wrote:
> >> >
> >> > I support dropping 2.11 support. My general logic is:
> >> >
> >> > - 2.11 is EOL, and is all the more EOL in the middle of next year when
> >> > Spark 3 arrives
> >> > - I haven't heard of a critical dependency that has no 2.12
> counterpart
> >> > - 2.11 users can stay on 2.4.x, which will be notionally supported
> >> > through, say, end of 2019
> >> > - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
> >> > experience resolving these differences across these two versions; it's
> >> > a hassle as you need two git clones with different scala versions in
> >> > the project tags
> >> > - The project is already short on resources to support things as it is
> >> > - Dropping things is generally necessary to add new things, to keep
> >> > complexity reasonable -- like Scala 2.13 support
> >> >
> >> > Maintaining a separate PR builder for 2.11 isn't so bad
> >> >
> >> > On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
> >> > <va...@cloudera.com.invalid> wrote:
> >> > >
> >> > > Now that the switch to 2.12 by default has been made, it might be
> good
> >> > > to have a serious discussion about dropping 2.11 altogether. Many of
> >> > > the main arguments have already been talked about. But I don't
> >> > > remember anyone mentioning how easy it would be to break the 2.11
> >> > > build now.
> >> > >
> >> > > For example, the following works fine in 2.12 but breaks in 2.11:
> >> > >
> >> > > java.util.Arrays.asList("hi").stream().forEach(println)
> >> > >
> >> > > We had a similar issue when we supported java 1.6 but the builds
> were
> >> > > all on 1.7 by default. Every once in a while something would
> silently
> >> > > break, because PR builds only check the default. And the jenkins
> >> > > builds, which are less monitored, would stay broken for a while.
> >> > >
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
> >
> >
> > --
> > Ryan Blue
> > Software Engineer
> > Netflix
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
> --
- DB
Sent from my iPhone

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Sean Owen <sr...@apache.org>.
PS: pull request at https://github.com/apache/spark/pull/23098
Not going to merge it until there's clear agreement.

On Tue, Nov 20, 2018 at 10:16 AM Ryan Blue <rb...@netflix.com> wrote:
>
> +1 to removing 2.11 support for 3.0 and a PR.
>
> It sounds like having multiple Scala builds is just not feasible and I don't think this will be too disruptive for users since it is already a breaking change.
>
> On Tue, Nov 20, 2018 at 7:05 AM Sean Owen <sr...@apache.org> wrote:
>>
>> One more data point -- from looking at the SBT build yesterday, it
>> seems like most plugin updates require SBT 1.x. And both they and SBT
>> 1.x seem to need Scala 2.12. And the new zinc also does.
>> Now, the current SBT and zinc and plugins all appear to work OK with
>> 2.12 now, but updating will pretty much have to wait until 2.11
>> support goes. (I don't think it's feasible to have two SBT builds.)
>>
>> I actually haven't heard an argument for keeping 2.11, compared to the
>> overhead of maintaining it. Any substantive objections? Would it be
>> too forward to put out a WIP PR that removes it?
>>
>> On Sat, Nov 17, 2018 at 7:28 PM Sean Owen <sr...@apache.org> wrote:
>> >
>> > I support dropping 2.11 support. My general logic is:
>> >
>> > - 2.11 is EOL, and is all the more EOL in the middle of next year when
>> > Spark 3 arrives
>> > - I haven't heard of a critical dependency that has no 2.12 counterpart
>> > - 2.11 users can stay on 2.4.x, which will be notionally supported
>> > through, say, end of 2019
>> > - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
>> > experience resolving these differences across these two versions; it's
>> > a hassle as you need two git clones with different scala versions in
>> > the project tags
>> > - The project is already short on resources to support things as it is
>> > - Dropping things is generally necessary to add new things, to keep
>> > complexity reasonable -- like Scala 2.13 support
>> >
>> > Maintaining a separate PR builder for 2.11 isn't so bad
>> >
>> > On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
>> > <va...@cloudera.com.invalid> wrote:
>> > >
>> > > Now that the switch to 2.12 by default has been made, it might be good
>> > > to have a serious discussion about dropping 2.11 altogether. Many of
>> > > the main arguments have already been talked about. But I don't
>> > > remember anyone mentioning how easy it would be to break the 2.11
>> > > build now.
>> > >
>> > > For example, the following works fine in 2.12 but breaks in 2.11:
>> > >
>> > > java.util.Arrays.asList("hi").stream().forEach(println)
>> > >
>> > > We had a similar issue when we supported java 1.6 but the builds were
>> > > all on 1.7 by default. Every once in a while something would silently
>> > > break, because PR builds only check the default. And the jenkins
>> > > builds, which are less monitored, would stay broken for a while.
>> > >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>
>
> --
> Ryan Blue
> Software Engineer
> Netflix

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by shane knapp <sk...@berkeley.edu>.
ok, i think the "how do we, and how many builds for different versions of
scala" thing is getting folks confused:

1)  we can easily have more than one non-pull-request builder to test
against N versions of scala
2)  we have one pull request builder which will test against the root pom,
which is now 2.12

On Tue, Nov 20, 2018 at 8:16 AM Ryan Blue <rb...@netflix.com.invalid> wrote:

> +1 to removing 2.11 support for 3.0 and a PR.
>
> It sounds like having multiple Scala builds is just not feasible and I
> don't think this will be too disruptive for users since it is already a
> breaking change.
>
> On Tue, Nov 20, 2018 at 7:05 AM Sean Owen <sr...@apache.org> wrote:
>
>> One more data point -- from looking at the SBT build yesterday, it
>> seems like most plugin updates require SBT 1.x. And both they and SBT
>> 1.x seem to need Scala 2.12. And the new zinc also does.
>> Now, the current SBT and zinc and plugins all appear to work OK with
>> 2.12 now, but updating will pretty much have to wait until 2.11
>> support goes. (I don't think it's feasible to have two SBT builds.)
>>
>> I actually haven't heard an argument for keeping 2.11, compared to the
>> overhead of maintaining it. Any substantive objections? Would it be
>> too forward to put out a WIP PR that removes it?
>>
>> On Sat, Nov 17, 2018 at 7:28 PM Sean Owen <sr...@apache.org> wrote:
>> >
>> > I support dropping 2.11 support. My general logic is:
>> >
>> > - 2.11 is EOL, and is all the more EOL in the middle of next year when
>> > Spark 3 arrives
>> > - I haven't heard of a critical dependency that has no 2.12 counterpart
>> > - 2.11 users can stay on 2.4.x, which will be notionally supported
>> > through, say, end of 2019
>> > - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
>> > experience resolving these differences across these two versions; it's
>> > a hassle as you need two git clones with different scala versions in
>> > the project tags
>> > - The project is already short on resources to support things as it is
>> > - Dropping things is generally necessary to add new things, to keep
>> > complexity reasonable -- like Scala 2.13 support
>> >
>> > Maintaining a separate PR builder for 2.11 isn't so bad
>> >
>> > On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
>> > <va...@cloudera.com.invalid> wrote:
>> > >
>> > > Now that the switch to 2.12 by default has been made, it might be good
>> > > to have a serious discussion about dropping 2.11 altogether. Many of
>> > > the main arguments have already been talked about. But I don't
>> > > remember anyone mentioning how easy it would be to break the 2.11
>> > > build now.
>> > >
>> > > For example, the following works fine in 2.12 but breaks in 2.11:
>> > >
>> > > java.util.Arrays.asList("hi").stream().forEach(println)
>> > >
>> > > We had a similar issue when we supported java 1.6 but the builds were
>> > > all on 1.7 by default. Every once in a while something would silently
>> > > break, because PR builds only check the default. And the jenkins
>> > > builds, which are less monitored, would stay broken for a while.
>> > >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>>
>
> --
> Ryan Blue
> Software Engineer
> Netflix
>


-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Ryan Blue <rb...@netflix.com.INVALID>.
+1 to removing 2.11 support for 3.0 and a PR.

It sounds like having multiple Scala builds is just not feasible and I
don't think this will be too disruptive for users since it is already a
breaking change.

On Tue, Nov 20, 2018 at 7:05 AM Sean Owen <sr...@apache.org> wrote:

> One more data point -- from looking at the SBT build yesterday, it
> seems like most plugin updates require SBT 1.x. And both they and SBT
> 1.x seem to need Scala 2.12. And the new zinc also does.
> Now, the current SBT and zinc and plugins all appear to work OK with
> 2.12 now, but updating will pretty much have to wait until 2.11
> support goes. (I don't think it's feasible to have two SBT builds.)
>
> I actually haven't heard an argument for keeping 2.11, compared to the
> overhead of maintaining it. Any substantive objections? Would it be
> too forward to put out a WIP PR that removes it?
>
> On Sat, Nov 17, 2018 at 7:28 PM Sean Owen <sr...@apache.org> wrote:
> >
> > I support dropping 2.11 support. My general logic is:
> >
> > - 2.11 is EOL, and is all the more EOL in the middle of next year when
> > Spark 3 arrives
> > - I haven't heard of a critical dependency that has no 2.12 counterpart
> > - 2.11 users can stay on 2.4.x, which will be notionally supported
> > through, say, end of 2019
> > - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
> > experience resolving these differences across these two versions; it's
> > a hassle as you need two git clones with different scala versions in
> > the project tags
> > - The project is already short on resources to support things as it is
> > - Dropping things is generally necessary to add new things, to keep
> > complexity reasonable -- like Scala 2.13 support
> >
> > Maintaining a separate PR builder for 2.11 isn't so bad
> >
> > On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
> > <va...@cloudera.com.invalid> wrote:
> > >
> > > Now that the switch to 2.12 by default has been made, it might be good
> > > to have a serious discussion about dropping 2.11 altogether. Many of
> > > the main arguments have already been talked about. But I don't
> > > remember anyone mentioning how easy it would be to break the 2.11
> > > build now.
> > >
> > > For example, the following works fine in 2.12 but breaks in 2.11:
> > >
> > > java.util.Arrays.asList("hi").stream().forEach(println)
> > >
> > > We had a similar issue when we supported java 1.6 but the builds were
> > > all on 1.7 by default. Every once in a while something would silently
> > > break, because PR builds only check the default. And the jenkins
> > > builds, which are less monitored, would stay broken for a while.
> > >
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

-- 
Ryan Blue
Software Engineer
Netflix

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Sean Owen <sr...@apache.org>.
One more data point -- from looking at the SBT build yesterday, it
seems like most plugin updates require SBT 1.x. And both they and SBT
1.x seem to need Scala 2.12. And the new zinc also does.
Now, the current SBT and zinc and plugins all appear to work OK with
2.12 now, but updating will pretty much have to wait until 2.11
support goes. (I don't think it's feasible to have two SBT builds.)

I actually haven't heard an argument for keeping 2.11, compared to the
overhead of maintaining it. Any substantive objections? Would it be
too forward to put out a WIP PR that removes it?

On Sat, Nov 17, 2018 at 7:28 PM Sean Owen <sr...@apache.org> wrote:
>
> I support dropping 2.11 support. My general logic is:
>
> - 2.11 is EOL, and is all the more EOL in the middle of next year when
> Spark 3 arrives
> - I haven't heard of a critical dependency that has no 2.12 counterpart
> - 2.11 users can stay on 2.4.x, which will be notionally supported
> through, say, end of 2019
> - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
> experience resolving these differences across these two versions; it's
> a hassle as you need two git clones with different scala versions in
> the project tags
> - The project is already short on resources to support things as it is
> - Dropping things is generally necessary to add new things, to keep
> complexity reasonable -- like Scala 2.13 support
>
> Maintaining a separate PR builder for 2.11 isn't so bad
>
> On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
> <va...@cloudera.com.invalid> wrote:
> >
> > Now that the switch to 2.12 by default has been made, it might be good
> > to have a serious discussion about dropping 2.11 altogether. Many of
> > the main arguments have already been talked about. But I don't
> > remember anyone mentioning how easy it would be to break the 2.11
> > build now.
> >
> > For example, the following works fine in 2.12 but breaks in 2.11:
> >
> > java.util.Arrays.asList("hi").stream().forEach(println)
> >
> > We had a similar issue when we supported java 1.6 but the builds were
> > all on 1.7 by default. Every once in a while something would silently
> > break, because PR builds only check the default. And the jenkins
> > builds, which are less monitored, would stay broken for a while.
> >

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Sean Owen <sr...@apache.org>.
I support dropping 2.11 support. My general logic is:

- 2.11 is EOL, and is all the more EOL in the middle of next year when
Spark 3 arrives
- I haven't heard of a critical dependency that has no 2.12 counterpart
- 2.11 users can stay on 2.4.x, which will be notionally supported
through, say, end of 2019
- Maintaining 2.11 vs 2.12 support is modestly difficult; in my
experience, resolving the differences across these two versions is
a hassle, as you need two git clones with different scala versions in
the project tags
- The project is already short on resources to support things as it is
- Dropping things is generally necessary to add new things, to keep
complexity reasonable -- like Scala 2.13 support

Maintaining a separate PR builder for 2.11 isn't so bad

On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
<va...@cloudera.com.invalid> wrote:
>
> Now that the switch to 2.12 by default has been made, it might be good
> to have a serious discussion about dropping 2.11 altogether. Many of
> the main arguments have already been talked about. But I don't
> remember anyone mentioning how easy it would be to break the 2.11
> build now.
>
> For example, the following works fine in 2.12 but breaks in 2.11:
>
> java.util.Arrays.asList("hi").stream().forEach(println)
>
> We had a similar issue when we supported java 1.6 but the builds were
> all on 1.7 by default. Every once in a while something would silently
> break, because PR builds only check the default. And the jenkins
> builds, which are less monitored, would stay broken for a while.
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Marcelo Vanzin <va...@cloudera.com.INVALID>.
Now that the switch to 2.12 by default has been made, it might be good
to have a serious discussion about dropping 2.11 altogether. Many of
the main arguments have already been talked about. But I don't
remember anyone mentioning how easy it would be to break the 2.11
build now.

For example, the following works fine in 2.12 but breaks in 2.11:

java.util.Arrays.asList("hi").stream().forEach(println)

We had a similar issue when we supported java 1.6 but the builds were
all on 1.7 by default. Every once in a while something would silently
break, because PR builds only check the default. And the jenkins
builds, which are less monitored, would stay broken for a while.
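
(In case it helps, here is my understanding of why that one-liner differs between versions; the explicit 2.11 form below is only an illustration, not code from Spark itself.)

    // Scala 2.12+: a Scala function can be passed where a Java SAM interface
    // (java.util.function.Consumer here) is expected, so this compiles:
    java.util.Arrays.asList("hi").stream().forEach(println)

    // Scala 2.11: no SAM conversion for Java functional interfaces by default,
    // so the Consumer has to be written out explicitly:
    java.util.Arrays.asList("hi").stream().forEach(
      new java.util.function.Consumer[String] {
        override def accept(s: String): Unit = println(s)
      })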

On Tue, Nov 6, 2018 at 11:13 AM DB Tsai <d_...@apple.com> wrote:
>
> We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next Spark version will be 3.0, so it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is unlikely to be supported on Scala 2.11 unless we're willing to sponsor the needed work, per the discussion in the Scala community: https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support for Scala 2.12 in Spark 2.4. If we decide to make Scala 2.12 the default for Spark 3.0 now, we will have ample time to work on the bugs and issues that we may run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>


-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Sean Owen <sr...@gmail.com>.
This seems fine to me. At least we should be primarily testing against
2.12 now.
Shane will need to alter the current 2.12 master build to actually
test 2.11, but that should be a trivial change.

On Thu, Nov 8, 2018 at 12:11 AM DB Tsai <db...@dbtsai.com> wrote:
>
> Based on the discussions, I created a PR that makes Spark's default
> Scala version 2.12, with Scala 2.11 as the alternative
> version. This implies that Scala 2.12 will be used by our CI builds
> including pull request builds.
>
> https://github.com/apache/spark/pull/22967
>
> We can decide later if we want to change the alternative Scala version
> to 2.13 and drop 2.11 if we just want to support two Scala versions at
> one time.
>
> Thanks.
>
> Sincerely,
>
> DB Tsai
> ----------------------------------------------------------
> Web: https://www.dbtsai.com
> PGP Key ID: 0x5CED8B896A6BDFA0
> On Wed, Nov 7, 2018 at 11:18 AM Sean Owen <sr...@gmail.com> wrote:
> >
> > It's not making 2.12 the default, but not dropping 2.11. Supporting
> > 2.13 could mean supporting 3 Scala versions at once, which I claim is
> > just too much. I think the options are likely:
> >
> > - Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
> > default. Add 2.13 support in 3.x and drop 2.11 in the same release
> > - Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon.
> > Drop 2.11 support in Spark 3.0, and support only 2.12.
> > - (same as above, but add Scala 2.13 support if possible for Spark 3.0)
> >
> >
> > On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra <ma...@clearstorydata.com> wrote:
> > >
> > > I'm not following "exclude Scala 2.13". Is there something inherent in making 2.12 the default Scala version in Spark 3.0 that would prevent us from supporting the option of building with 2.13?
> > >
> > > On Tue, Nov 6, 2018 at 5:48 PM Sean Owen <sr...@gmail.com> wrote:
> > >>
> > >> That's possible here, sure. The issue is: would you exclude Scala 2.13
> > >> support in 3.0 for this, if it were otherwise ready to go?
> > >> I think it's not a hard rule that something has to be deprecated
> > >> previously to be removed in a major release. The notice is helpful,
> > >> sure, but there are lots of ways to provide that notice to end users.
> > >> Lots of things are breaking changes in a major release. Or: deprecate
> > >> in Spark 2.4.1, if desired?
> > >>
> > >> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cl...@gmail.com> wrote:
> > >> >
> > >> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark 3.x?
> > >> >
> > >> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rx...@databricks.com> wrote:
> > >> >>
> > >> >> Have we deprecated Scala 2.11 already in an existing release?
> > >>
> > >> ---------------------------------------------------------------------
> > >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> > >>
> >
> > ---------------------------------------------------------------------
> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by DB Tsai <db...@dbtsai.com.INVALID>.
Based on the discussions, I created a PR that makes Spark's default
Scala version 2.12, with Scala 2.11 as the alternative
version. This implies that Scala 2.12 will be used by our CI builds
including pull request builds.

https://github.com/apache/spark/pull/22967

We can decide later if we want to change the alternative Scala version
to 2.13 and drop 2.11 if we just want to support two Scala versions at
one time.

Thanks.

Sincerely,

DB Tsai
----------------------------------------------------------
Web: https://www.dbtsai.com
PGP Key ID: 0x5CED8B896A6BDFA0
On Wed, Nov 7, 2018 at 11:18 AM Sean Owen <sr...@gmail.com> wrote:
>
> It's not making 2.12 the default, but not dropping 2.11. Supporting
> 2.13 could mean supporting 3 Scala versions at once, which I claim is
> just too much. I think the options are likely:
>
> - Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
> default. Add 2.13 support in 3.x and drop 2.11 in the same release
> - Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon.
> Drop 2.11 support in Spark 3.0, and support only 2.12.
> - (same as above, but add Scala 2.13 support if possible for Spark 3.0)
>
>
> On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra <ma...@clearstorydata.com> wrote:
> >
> > I'm not following "exclude Scala 2.13". Is there something inherent in making 2.12 the default Scala version in Spark 3.0 that would prevent us from supporting the option of building with 2.13?
> >
> > On Tue, Nov 6, 2018 at 5:48 PM Sean Owen <sr...@gmail.com> wrote:
> >>
> >> That's possible here, sure. The issue is: would you exclude Scala 2.13
> >> support in 3.0 for this, if it were otherwise ready to go?
> >> I think it's not a hard rule that something has to be deprecated
> >> previously to be removed in a major release. The notice is helpful,
> >> sure, but there are lots of ways to provide that notice to end users.
> >> Lots of things are breaking changes in a major release. Or: deprecate
> >> in Spark 2.4.1, if desired?
> >>
> >> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cl...@gmail.com> wrote:
> >> >
> >> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark 3.x?
> >> >
> >> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rx...@databricks.com> wrote:
> >> >>
> >> >> Have we deprecated Scala 2.11 already in an existing release?
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Mark Hamstra <ma...@clearstorydata.com>.
Ok, got it -- it's really just an argument for not all of 2.11, 2.12 and
2.13 at the same time; always 2.12; now figure out when we stop 2.11
support and start 2.13 support.

On Wed, Nov 7, 2018 at 11:10 AM Sean Owen <sr...@gmail.com> wrote:

> It's not making 2.12 the default, but not dropping 2.11. Supporting
> 2.13 could mean supporting 3 Scala versions at once, which I claim is
> just too much. I think the options are likely:
>
> - Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
> default. Add 2.13 support in 3.x and drop 2.11 in the same release
> - Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon.
> Drop 2.11 support in Spark 3.0, and support only 2.12.
> - (same as above, but add Scala 2.13 support if possible for Spark 3.0)
>
>
> On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra <ma...@clearstorydata.com>
> wrote:
> >
> > I'm not following "exclude Scala 2.13". Is there something inherent in
> making 2.12 the default Scala version in Spark 3.0 that would prevent us
> from supporting the option of building with 2.13?
> >
> > On Tue, Nov 6, 2018 at 5:48 PM Sean Owen <sr...@gmail.com> wrote:
> >>
> >> That's possible here, sure. The issue is: would you exclude Scala 2.13
> >> support in 3.0 for this, if it were otherwise ready to go?
> >> I think it's not a hard rule that something has to be deprecated
> >> previously to be removed in a major release. The notice is helpful,
> >> sure, but there are lots of ways to provide that notice to end users.
> >> Lots of things are breaking changes in a major release. Or: deprecate
> >> in Spark 2.4.1, if desired?
> >>
> >> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cl...@gmail.com> wrote:
> >> >
> >> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10
> in Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark
> 3.x?
> >> >
> >> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rx...@databricks.com>
> wrote:
> >> >>
> >> >> Have we deprecated Scala 2.11 already in an existing release?
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
>

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Sean Owen <sr...@gmail.com>.
It's not making 2.12 the default that's the issue, but not dropping 2.11. Supporting
2.13 could mean supporting 3 Scala versions at once, which I claim is
just too much. I think the options are likely:

- Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
default. Add 2.13 support in 3.x and drop 2.11 in the same release
- Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon.
Drop 2.11 support in Spark 3.0, and support only 2.12.
- (same as above, but add Scala 2.13 support if possible for Spark 3.0)


On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra <ma...@clearstorydata.com> wrote:
>
> I'm not following "exclude Scala 2.13". Is there something inherent in making 2.12 the default Scala version in Spark 3.0 that would prevent us from supporting the option of building with 2.13?
>
> On Tue, Nov 6, 2018 at 5:48 PM Sean Owen <sr...@gmail.com> wrote:
>>
>> That's possible here, sure. The issue is: would you exclude Scala 2.13
>> support in 3.0 for this, if it were otherwise ready to go?
>> I think it's not a hard rule that something has to be deprecated
>> previously to be removed in a major release. The notice is helpful,
>> sure, but there are lots of ways to provide that notice to end users.
>> Lots of things are breaking changes in a major release. Or: deprecate
>> in Spark 2.4.1, if desired?
>>
>> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cl...@gmail.com> wrote:
>> >
>> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark 3.x?
>> >
>> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rx...@databricks.com> wrote:
>> >>
>> >> Have we deprecated Scala 2.11 already in an existing release?
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Mark Hamstra <ma...@clearstorydata.com>.
I'm not following "exclude Scala 2.13". Is there something inherent in
making 2.12 the default Scala version in Spark 3.0 that would prevent us
from supporting the option of building with 2.13?

On Tue, Nov 6, 2018 at 5:48 PM Sean Owen <sr...@gmail.com> wrote:

> That's possible here, sure. The issue is: would you exclude Scala 2.13
> support in 3.0 for this, if it were otherwise ready to go?
> I think it's not a hard rule that something has to be deprecated
> previously to be removed in a major release. The notice is helpful,
> sure, but there are lots of ways to provide that notice to end users.
> Lots of things are breaking changes in a major release. Or: deprecate
> in Spark 2.4.1, if desired?
>
> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cl...@gmail.com> wrote:
> >
> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in
> Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark
> 3.x?
> >
> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rx...@databricks.com> wrote:
> >>
> >> Have we deprecated Scala 2.11 already in an existing release?
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Dean Wampler <de...@gmail.com>.
I spoke with the Scala team at Lightbend. They plan to do a 2.13-RC1
release in January and GA a few months later. Of course, nothing is ever
certain. What's the thinking for the Spark 3.0 timeline? If it's likely to
be late Q1 or in Q2, then it might make sense to add Scala 2.13 as an
alternative Scala version.

dean


*Dean Wampler, Ph.D.*

*VP, Fast Data Engineering at Lightbend*
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do>, Fast Data Architectures
for Streaming Applications
<http://www.oreilly.com/data/free/fast-data-architectures-for-streaming-applications.csp>,
and other content from O'Reilly
@deanwampler <http://twitter.com/deanwampler>
https://www.linkedin.com/in/deanwampler/
http://polyglotprogramming.com
https://github.com/deanwampler
https://www.flickr.com/photos/deanwampler/


On Tue, Nov 6, 2018 at 7:48 PM Sean Owen <sr...@gmail.com> wrote:

> That's possible here, sure. The issue is: would you exclude Scala 2.13
> support in 3.0 for this, if it were otherwise ready to go?
> I think it's not a hard rule that something has to be deprecated
> previously to be removed in a major release. The notice is helpful,
> sure, but there are lots of ways to provide that notice to end users.
> Lots of things are breaking changes in a major release. Or: deprecate
> in Spark 2.4.1, if desired?
>
> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cl...@gmail.com> wrote:
> >
> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in
> Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark
> 3.x?
> >
> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rx...@databricks.com> wrote:
> >>
> >> Have we deprecated Scala 2.11 already in an existing release?
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Sean Owen <sr...@gmail.com>.
That's possible here, sure. The issue is: would you exclude Scala 2.13
support in 3.0 for this, if it were otherwise ready to go?
I think it's not a hard rule that something has to be deprecated
previously to be removed in a major release. The notice is helpful,
sure, but there are lots of ways to provide that notice to end users.
Lots of things are breaking changes in a major release. Or: deprecate
in Spark 2.4.1, if desired?

On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cl...@gmail.com> wrote:
>
> We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark 3.x?
>
> On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rx...@databricks.com> wrote:
>>
>> Have we deprecated Scala 2.11 already in an existing release?

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Wenchen Fan <cl...@gmail.com>.
We made Scala 2.11 the default in Spark 2.0, then dropped Scala 2.10 in
Spark 2.3. Shall we follow that and drop Scala 2.11 at some point in Spark
3.x?

On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rx...@databricks.com> wrote:

> Have we deprecated Scala 2.11 already in an existing release?
>
> On Tue, Nov 6, 2018 at 4:43 PM DB Tsai <d_...@apple.com> wrote:
>
>> Ideally, supporting only Scala 2.12 in Spark 3 will be ideal.
>>
>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>>
>> > On Nov 6, 2018, at 2:55 PM, Felix Cheung <fe...@hotmail.com>
>> wrote:
>> >
>> > So to clarify, only scala 2.12 is supported in Spark 3?
>> >
>> >
>> > From: Ryan Blue <rb...@netflix.com.invalid>
>> > Sent: Tuesday, November 6, 2018 1:24 PM
>> > To: d_tsai@apple.com
>> > Cc: Sean Owen; Spark Dev List; cdelgado@apple.com
>> > Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0
>> >
>> > +1 to Scala 2.12 as the default in Spark 3.0.
>> >
>> > On Tue, Nov 6, 2018 at 11:50 AM DB Tsai <d_...@apple.com> wrote:
>> > +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.
>> >
>> > As Scala 2.11 will not support Java 11 unless we make a significant
>> investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can
> do is have only the Scala 2.12 build support Java 11 while Scala 2.11 supports
> Java 8. But I agree with Sean that this can make the dependencies really
> complicated; hence I support dropping Scala 2.11 in Spark 3.0 directly.
>> >
>> > DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>> >
>> >> On Nov 6, 2018, at 11:38 AM, Sean Owen <sr...@gmail.com> wrote:
>> >>
>> >> I think we should make Scala 2.12 the default in Spark 3.0. I would
>> >> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
>> >> 2.11 support means we'd support Scala 2.11 for years, the lifetime
>> >> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
>> >> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
>> >>
>> >> Java (9-)11 support also complicates this. I think getting it to work
>> >> will need some significant dependency updates, and I worry not all
>> >> will be available for 2.11 or will present some knotty problems. We'll
>> >> find out soon if that forces the issue.
>> >>
>> >> Also note that Scala 2.13 is pretty close to release, and we'll want
>> >> to support it soon after release, perhaps sooner than the long delay
>> >> before 2.12 was supported (because it was hard!). It will probably be
>> >> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
>> >> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
>> >> and 2.13, or something. But if 2.13 support is otherwise attainable at
>> >> the release of Spark 3.0, I wonder if that too argues for dropping
>> >> 2.11 support.
>> >>
>> >> Finally I'll say that Spark itself isn't dropping 2.11 support for a
>> >> while, no matter what; it still exists in the 2.4.x branch of course.
>> >> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
>> >>
>> >> Sean
>> >>
>> >>
>> >> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_...@apple.com> wrote:
>> >>>
>> >>> We made Scala 2.11 the default Scala version in Spark 2.0. Now, the
>> next Spark version will be 3.0, so it's a great time to discuss whether we
>> should make Scala 2.12 the default Scala version in Spark 3.0.
>> >>>
>> >>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11
>> is unlikely to be supported on Scala 2.11 unless we're willing to sponsor
>> the needed work, per the discussion in the Scala community:
>> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>> >>>
>> >>> We have initial support for Scala 2.12 in Spark 2.4. If we decide to
>> make Scala 2.12 the default for Spark 3.0 now, we will have ample time to
>> work on the bugs and issues that we may run into.
>> >>>
>> >>> What do you think?
>> >>>
>> >>> Thanks,
>> >>>
>> >>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>> >>>
>> >>>
>> >>> ---------------------------------------------------------------------
>> >>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>> >>>
>> >
>> >
>> >
>> > --
>> > Ryan Blue
>> > Software Engineer
>> > Netflix
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>>

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Reynold Xin <rx...@databricks.com>.
Have we deprecated Scala 2.11 already in an existing release?

On Tue, Nov 6, 2018 at 4:43 PM DB Tsai <d_...@apple.com> wrote:

> Ideally, supporting only Scala 2.12 in Spark 3 will be ideal.
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
> > On Nov 6, 2018, at 2:55 PM, Felix Cheung <fe...@hotmail.com>
> wrote:
> >
> > So to clarify, only scala 2.12 is supported in Spark 3?
> >
> >
> > From: Ryan Blue <rb...@netflix.com.invalid>
> > Sent: Tuesday, November 6, 2018 1:24 PM
> > To: d_tsai@apple.com
> > Cc: Sean Owen; Spark Dev List; cdelgado@apple.com
> > Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0
> >
> > +1 to Scala 2.12 as the default in Spark 3.0.
> >
> > On Tue, Nov 6, 2018 at 11:50 AM DB Tsai <d_...@apple.com> wrote:
> > +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.
> >
> > As Scala 2.11 will not support Java 11 unless we make a significant
> investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can
> do is have only the Scala 2.12 build support Java 11 while Scala 2.11 supports
> Java 8. But I agree with Sean that this can make the dependencies really
> complicated; hence I support dropping Scala 2.11 in Spark 3.0 directly.
> >
> > DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
> >
> >> On Nov 6, 2018, at 11:38 AM, Sean Owen <sr...@gmail.com> wrote:
> >>
> >> I think we should make Scala 2.12 the default in Spark 3.0. I would
> >> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
> >> 2.11 support means we'd support Scala 2.11 for years, the lifetime
> >> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
> >> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
> >>
> >> Java (9-)11 support also complicates this. I think getting it to work
> >> will need some significant dependency updates, and I worry not all
> >> will be available for 2.11 or will present some knotty problems. We'll
> >> find out soon if that forces the issue.
> >>
> >> Also note that Scala 2.13 is pretty close to release, and we'll want
> >> to support it soon after release, perhaps sooner than the long delay
> >> before 2.12 was supported (because it was hard!). It will probably be
> >> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
> >> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
> >> and 2.13, or something. But if 2.13 support is otherwise attainable at
> >> the release of Spark 3.0, I wonder if that too argues for dropping
> >> 2.11 support.
> >>
> >> Finally I'll say that Spark itself isn't dropping 2.11 support for a
> >> while, no matter what; it still exists in the 2.4.x branch of course.
> >> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
> >>
> >> Sean
> >>
> >>
> >> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_...@apple.com> wrote:
> >>>
> >>> We made Scala 2.11 the default Scala version in Spark 2.0. Now, the
> next Spark version will be 3.0, so it's a great time to discuss whether we
> should make Scala 2.12 the default Scala version in Spark 3.0.
> >>>
> >>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is
> unlikely to be supported on Scala 2.11 unless we're willing to sponsor the
> needed work, per the discussion in the Scala community:
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
> >>>
> >>> We have initial support for Scala 2.12 in Spark 2.4. If we decide to
> make Scala 2.12 the default for Spark 3.0 now, we will have ample time to
> work on the bugs and issues that we may run into.
> >>>
> >>> What do you think?
> >>>
> >>> Thanks,
> >>>
> >>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
> >>>
> >>>
> >>> ---------------------------------------------------------------------
> >>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>>
> >
> >
> >
> > --
> > Ryan Blue
> > Software Engineer
> > Netflix
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by DB Tsai <d_...@apple.com>.
Ideally, we would support only Scala 2.12 in Spark 3.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc

> On Nov 6, 2018, at 2:55 PM, Felix Cheung <fe...@hotmail.com> wrote:
> 
> So to clarify, only scala 2.12 is supported in Spark 3?
> 
>  
> From: Ryan Blue <rb...@netflix.com.invalid>
> Sent: Tuesday, November 6, 2018 1:24 PM
> To: d_tsai@apple.com
> Cc: Sean Owen; Spark Dev List; cdelgado@apple.com
> Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0
>  
> +1 to Scala 2.12 as the default in Spark 3.0.
> 
> On Tue, Nov 6, 2018 at 11:50 AM DB Tsai <d_...@apple.com> wrote:
> +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build. 
> 
> As Scala 2.11 will not support Java 11 unless we make a significant investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can do is have only the Scala 2.12 build support Java 11 while Scala 2.11 supports Java 8. But I agree with Sean that this can make the dependencies really complicated; hence I support dropping Scala 2.11 in Spark 3.0 directly.
> 
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
> 
>> On Nov 6, 2018, at 11:38 AM, Sean Owen <sr...@gmail.com> wrote:
>> 
>> I think we should make Scala 2.12 the default in Spark 3.0. I would
>> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
>> 2.11 support means we'd support Scala 2.11 for years, the lifetime
>> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
>> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
>> 
>> Java (9-)11 support also complicates this. I think getting it to work
>> will need some significant dependency updates, and I worry not all
>> will be available for 2.11 or will present some knotty problems. We'll
>> find out soon if that forces the issue.
>> 
>> Also note that Scala 2.13 is pretty close to release, and we'll want
>> to support it soon after release, perhaps sooner than the long delay
>> before 2.12 was supported (because it was hard!). It will probably be
>> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
>> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
>> and 2.13, or something. But if 2.13 support is otherwise attainable at
>> the release of Spark 3.0, I wonder if that too argues for dropping
>> 2.11 support.
>> 
>> Finally I'll say that Spark itself isn't dropping 2.11 support for a
>> while, no matter what; it still exists in the 2.4.x branch of course.
>> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
>> 
>> Sean
>> 
>> 
>> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_...@apple.com> wrote:
>>> 
>>> We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next Spark version will be 3.0, so it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0.
>>> 
>>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is unlikely to be supported on Scala 2.11 unless we're willing to sponsor the needed work, per the discussion in the Scala community: https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>>> 
>>> We have initial support for Scala 2.12 in Spark 2.4. If we decide to make Scala 2.12 the default for Spark 3.0 now, we will have ample time to work on the bugs and issues that we may run into.
>>> 
>>> What do you think?
>>> 
>>> Thanks,
>>> 
>>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
>>> 
>>> 
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>> 
> 
> 
> 
> -- 
> Ryan Blue
> Software Engineer
> Netflix


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Felix Cheung <fe...@hotmail.com>.
So to clarify, only Scala 2.12 is supported in Spark 3?


________________________________
From: Ryan Blue <rb...@netflix.com.invalid>
Sent: Tuesday, November 6, 2018 1:24 PM
To: d_tsai@apple.com
Cc: Sean Owen; Spark Dev List; cdelgado@apple.com
Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0

+1 to Scala 2.12 as the default in Spark 3.0.

On Tue, Nov 6, 2018 at 11:50 AM DB Tsai <d_...@apple.com>> wrote:
+1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.

As Scala 2.11 will not support Java 11 unless we make a significant investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can do is have only the Scala 2.12 build support Java 11 while Scala 2.11 supports Java 8. But I agree with Sean that this can make the dependencies really complicated; hence I support dropping Scala 2.11 in Spark 3.0 directly.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc

On Nov 6, 2018, at 11:38 AM, Sean Owen <sr...@gmail.com>> wrote:

I think we should make Scala 2.12 the default in Spark 3.0. I would
also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
2.11 support means we'd support Scala 2.11 for years, the lifetime
of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
3.2.0 release, kind of like what happened with 2.10 in 2.x.

Java (9-)11 support also complicates this. I think getting it to work
will need some significant dependency updates, and I worry not all
will be available for 2.11 or will present some knotty problems. We'll
find out soon if that forces the issue.

Also note that Scala 2.13 is pretty close to release, and we'll want
to support it soon after release, perhaps sooner than the long delay
before 2.12 was supported (because it was hard!). It will probably be
out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
and 2.13, or something. But if 2.13 support is otherwise attainable at
the release of Spark 3.0, I wonder if that too argues for dropping
2.11 support.

Finally I'll say that Spark itself isn't dropping 2.11 support for a
while, no matter what; it still exists in the 2.4.x branch of course.
People who can't update off Scala 2.11 can stay on Spark 2.x, note.

Sean


On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_...@apple.com> wrote:

We made Scala 2.11 the default Scala version in Spark 2.0. Now that the next Spark version will be 3.0, it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0.

Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, Scala 2.11 is unlikely to support JDK 11 unless we're willing to sponsor the needed work, per the discussion in the Scala community: https://github.com/scala/scala-dev/issues/559#issuecomment-436160166

We have initial support for Scala 2.12 in Spark 2.4. If we decide now to make Scala 2.12 the default for Spark 3.0, we will have ample time to work on any bugs and issues we run into.

What do you think?

Thanks,

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org




--
Ryan Blue
Software Engineer
Netflix

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Ryan Blue <rb...@netflix.com.INVALID>.
+1 to Scala 2.12 as the default in Spark 3.0.

On Tue, Nov 6, 2018 at 11:50 AM DB Tsai <d_...@apple.com> wrote:

> +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.
>
> As Scala 2.11 will not support Java 11 unless we make a significant
> investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can
> do is have only the Scala 2.12 build support Java 11 while the Scala 2.11
> build supports Java 8. But I agree with Sean that this can make the
> dependencies really complicated; hence I support dropping Scala 2.11 in
> Spark 3.0 directly.
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
> On Nov 6, 2018, at 11:38 AM, Sean Owen <sr...@gmail.com> wrote:
>
> I think we should make Scala 2.12 the default in Spark 3.0. I would
> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
> 2.11 support means we'd support Scala 2.11 for years, the lifetime
> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
>
> Java (9-)11 support also complicates this. I think getting it to work
> will need some significant dependency updates, and I worry not all
> will be available for 2.11 or will present some knotty problems. We'll
> find out soon if that forces the issue.
>
> Also note that Scala 2.13 is pretty close to release, and we'll want
> to support it soon after release, perhaps sooner than the long delay
> before 2.12 was supported (because it was hard!). It will probably be
> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
> and 2.13, or something. But if 2.13 support is otherwise attainable at
> the release of Spark 3.0, I wonder if that too argues for dropping
> 2.11 support.
>
> Finally I'll say that Spark itself isn't dropping 2.11 support for a
> while, no matter what; it still exists in the 2.4.x branch of course.
> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
>
> Sean
>
>
> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_...@apple.com> wrote:
>
>
> We made Scala 2.11 the default Scala version in Spark 2.0. Now that the
> next Spark version will be 3.0, it's a great time to discuss whether we
> should make Scala 2.12 the default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, Scala 2.11
> is unlikely to support JDK 11 unless we're willing to sponsor the needed
> work, per the discussion in the Scala community:
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support for Scala 2.12 in Spark 2.4. If we decide now to
> make Scala 2.12 the default for Spark 3.0, we will have ample time to work
> on any bugs and issues we run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>
>

-- 
Ryan Blue
Software Engineer
Netflix

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by DB Tsai <d_...@apple.com>.
+1 on dropping Scala 2.11 in Spark 3.0 to simplify the build. 

As Scala 2.11 will not support Java 11 unless we make a significant investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can do is have only the Scala 2.12 build support Java 11 while the Scala 2.11 build supports Java 8. But I agree with Sean that this can make the dependencies really complicated; hence I support dropping Scala 2.11 in Spark 3.0 directly.
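
For illustration only, here is a rough sbt-style sketch of that split. This is not Spark's actual build (which is Maven-based), and the version numbers and flag choices are just placeholders; it only shows the idea of keying the Java target on the Scala binary version:

// rough sketch: only the Scala 2.12 build is wired up for JDK 11,
// while the Scala 2.11 build stays on Java 8
crossScalaVersions := Seq("2.11.12", "2.12.7")

javacOptions ++= (scalaBinaryVersion.value match {
  case "2.12" => Seq("--release", "11")                   // assumes the 2.12 build runs on a JDK 11 toolchain
  case _      => Seq("-source", "1.8", "-target", "1.8")  // keep the 2.11 build on Java 8
})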

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc

> On Nov 6, 2018, at 11:38 AM, Sean Owen <sr...@gmail.com> wrote:
> 
> I think we should make Scala 2.12 the default in Spark 3.0. I would
> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
> 2.11 support means we'd support Scala 2.11 for years, the lifetime
> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
> 
> Java (9-)11 support also complicates this. I think getting it to work
> will need some significant dependency updates, and I worry not all
> will be available for 2.11 or will present some knotty problems. We'll
> find out soon if that forces the issue.
> 
> Also note that Scala 2.13 is pretty close to release, and we'll want
> to support it soon after release, perhaps sooner than the long delay
> before 2.12 was supported (because it was hard!). It will probably be
> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
> and 2.13, or something. But if 2.13 support is otherwise attainable at
> the release of Spark 3.0, I wonder if that too argues for dropping
> 2.11 support.
> 
> Finally I'll say that Spark itself isn't dropping 2.11 support for a
> while, no matter what; it still exists in the 2.4.x branch of course.
> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
> 
> Sean
> 
> 
> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_...@apple.com> wrote:
>> 
>> We made Scala 2.11 the default Scala version in Spark 2.0. Now that the next Spark version will be 3.0, it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0.
>> 
>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, Scala 2.11 is unlikely to support JDK 11 unless we're willing to sponsor the needed work, per the discussion in the Scala community: https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>> 
>> We have initial support for Scala 2.12 in Spark 2.4. If we decide now to make Scala 2.12 the default for Spark 3.0, we will have ample time to work on any bugs and issues we run into.
>> 
>> What do you think?
>> 
>> Thanks,
>> 
>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
>> 
>> 
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>> 


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Posted by Sean Owen <sr...@gmail.com>.
I think we should make Scala 2.12 the default in Spark 3.0. I would
also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
2.11 support means we'd support Scala 2.11 for years, the lifetime
of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
3.2.0 release, kind of like what happened with 2.10 in 2.x.

Java (9-)11 support also complicates this. I think getting it to work
will need some significant dependency updates, and I worry not all
will be available for 2.11 or will present some knotty problems. We'll
find out soon if that forces the issue.

Also note that Scala 2.13 is pretty close to release, and we'll want
to support it soon after release, perhaps sooner than the long delay
before 2.12 was supported (because it was hard!). It will probably be
out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
and 2.13, or something. But if 2.13 support is otherwise attainable at
the release of Spark 3.0, I wonder if that too argues for dropping
2.11 support.
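
(For reference, a minimal sbt-style sketch of what that cross-build surface looks like; this is not Spark's actual build, and the Scala version numbers are only illustrative. The point is that every entry is another full compile-and-test pass to maintain:)

// each entry in crossScalaVersions is one more full build/test matrix
crossScalaVersions := Seq("2.11.12", "2.12.7")    // what 3.0 could carry
// crossScalaVersions := Seq("2.12.7", "2.13.0")  // what a later 3.x could move to
scalaVersion := "2.12.7"                          // 2.12 as the default

// "sbt +test" would then run the whole suite once per listed Scala version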

Finally I'll say that Spark itself isn't dropping 2.11 support for a
while, no matter what; it still exists in the 2.4.x branch of course.
People who can't update off Scala 2.11 can stay on Spark 2.x, note.

Sean


On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_...@apple.com> wrote:
>
> We made Scala 2.11 the default Scala version in Spark 2.0. Now that the next Spark version will be 3.0, it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, Scala 2.11 is unlikely to support JDK 11 unless we're willing to sponsor the needed work, per the discussion in the Scala community: https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support for Scala 2.12 in Spark 2.4. If we decide now to make Scala 2.12 the default for Spark 3.0, we will have ample time to work on any bugs and issues we run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org