Posted to dev@kyuubi.apache.org by Kent Yao <ya...@apache.org> on 2022/06/29 06:59:16 UTC

[DISCUSS] Spark 3.0 Support EOL Progress

Hi team,

For our master branch, a.k.a. v1.6.0, we now test and build against
multiple Spark versions, including Spark 3.0.x/3.1.x/3.2.x/3.3.x
running with Java 8 and 11, as well as cross-version tests with a
Spark 3.2-compiled engine running on Spark 3.0.x/3.1.x/3.3.x distributions.

Considering that Spark 3.0 is EOL per the Spark community and the heavy
workload it puts on our CI, we should drop Spark 3.0.x support step by step.

Here are the options/steps I have been thinking of:

O.1. Drop only the CI jobs that compile/build/test against Spark 3.0.x with Java 8 and 11.
O.2. Drop the CI job that runs the Spark 3.2-compiled engine on the Spark 3.0
distribution, plus O.1.
O.3. Drop the spark-3.0 Maven profile, plus O.2.

Best Regards

Kent
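
For illustration only, below is a minimal Scala sketch (not Kyuubi's actual code; the object name, method names, and the assumption that Spark 3.1 becomes the minimum supported version after O.3 are all hypothetical) of the kind of runtime guard that could accompany dropping Spark 3.0.x support:

object SparkVersionGuard {

  // Assumed minimum supported (major, minor) Spark version once 3.0.x is dropped.
  val MinSupported: (Int, Int) = (3, 1)

  // Parses a version string such as "3.2.1" or "3.2.1-SNAPSHOT" into (3, 2).
  def majorMinor(version: String): (Int, Int) = {
    val parts = version.split("[.-]")
    (parts(0).toInt, parts(1).toInt)
  }

  // Throws IllegalArgumentException when the runtime Spark version is below the minimum.
  def assertSupported(sparkVersion: String): Unit = {
    val (major, minor) = majorMinor(sparkVersion)
    val supported =
      major > MinSupported._1 || (major == MinSupported._1 && minor >= MinSupported._2)
    require(
      supported,
      s"Spark $sparkVersion is not supported; please use Spark " +
        s"${MinSupported._1}.${MinSupported._2} or later.")
  }
}

// Example usage:
//   SparkVersionGuard.assertSupported("3.2.1")   // passes
//   SparkVersionGuard.assertSupported("3.0.3")   // throws IllegalArgumentException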

Re: [DISCUSS] Spark 3.0 Support EOL Progress

Posted by Kent Yao <ya...@apache.org>.
Thanks everyone.

https://github.com/apache/incubator-kyuubi/issues/2974 has been created for further tracking, and volunteers for the follow-up pull requests are welcome.

BR

Kent

On 2022/06/30 02:15:57 Fu Chen wrote:
> +1 for O.3. Thanks Kent.
> 
> > zhaomin1423 <zh...@163.com> wrote on Thu, Jun 30, 2022 at 09:11:
> 
> > Thanks Kent, I vote for O.3.
> >
> >
> >
> >
> > ---- Replied Message ----
> > From: Cheng Pan <pa...@gmail.com>
> > Date: 06/29/2022 15:06
> > To: <de...@kyuubi.apache.org>
> > Subject: Re: [DISCUSS] Spark 3.0 Support EOL Progress
> > Thanks Kent for bringing this up.
> >
> > I vote for O.3, and suggest declaring Spark 3.0 support deprecated
> > in the 1.6.0 release notes, then doing O.3 during the 1.7.0-SNAPSHOT
> > period.
> >
> > Thanks,
> > Cheng Pan
> >
> > On Wed, Jun 29, 2022 at 2:59 PM Kent Yao <ya...@apache.org> wrote:
> >
> > Hi team,
> >
> > For our master branch, a.k.a. v1.6.0, we now test and build against
> > multiple Spark versions, including Spark 3.0.x/3.1.x/3.2.x/3.3.x
> > running with Java 8 and 11, as well as cross-version tests with a
> > Spark 3.2-compiled engine running on Spark 3.0.x/3.1.x/3.3.x distributions.
> >
> > Considering that Spark 3.0 is EOL per the Spark community and the heavy
> > workload it puts on our CI, we should drop Spark 3.0.x support step by step.
> >
> > Here are the options/steps I have been thinking of:
> >
> > O.1. Drop only the CI jobs that compile/build/test against Spark 3.0.x with Java 8 and 11.
> > O.2. Drop the CI job that runs the Spark 3.2-compiled engine on the Spark 3.0
> > distribution, plus O.1.
> > O.3. Drop the spark-3.0 Maven profile, plus O.2.
> >
> > Best Regards
> >
> > Kent
> >
> 

Re: [DISCUSS] Spark 3.0 Support EOL Progress

Posted by Fu Chen <cf...@gmail.com>.
+1 for O.3. Thanks Kent.

zhaomin1423 <zh...@163.com> wrote on Thu, Jun 30, 2022 at 09:11:

> Thanks Kent, I vote for O.3.
>
>
>
>
> ---- Replied Message ----
> From: Cheng Pan <pa...@gmail.com>
> Date: 06/29/2022 15:06
> To: <de...@kyuubi.apache.org>
> Subject: Re: [DISCUSS] Spark 3.0 Support EOL Progress
> Thanks Kent for bringing this up.
>
> I vote for O.3, and suggest declaring Spark 3.0 support deprecated in
> the 1.6.0 release notes, then doing O.3 during the 1.7.0-SNAPSHOT
> period.
>
> Thanks,
> Cheng Pan
>
> On Wed, Jun 29, 2022 at 2:59 PM Kent Yao <ya...@apache.org> wrote:
>
> Hi team,
>
> For our master branch, a.k.a. v1.6.0, we now test and build against
> multiple Spark versions, including Spark 3.0.x/3.1.x/3.2.x/3.3.x
> running with Java 8 and 11, as well as cross-version tests with a
> Spark 3.2-compiled engine running on Spark 3.0.x/3.1.x/3.3.x distributions.
>
> Considering that Spark 3.0 is EOL per the Spark community and the heavy
> workload it puts on our CI, we should drop Spark 3.0.x support step by step.
>
> Here are the options/steps I have been thinking of:
>
> O.1. Drop only the CI jobs that compile/build/test against Spark 3.0.x with Java 8 and 11.
> O.2. Drop the CI job that runs the Spark 3.2-compiled engine on the Spark 3.0
> distribution, plus O.1.
> O.3. Drop the spark-3.0 Maven profile, plus O.2.
>
> Best Regards
>
> Kent
>

Re: [DISCUSS] Spark 3.0 Support EOL Progress

Posted by zhaomin1423 <zh...@163.com>.
Thanks Kent, I vote for O.3.




---- Replied Message ----
From: Cheng Pan <pa...@gmail.com>
Date: 06/29/2022 15:06
To: <de...@kyuubi.apache.org>
Subject: Re: [DISCUSS] Spark 3.0 Support EOL Progress
Thanks Kent for bringing this up.

I vote for O.3, and suggest declaring Spark 3.0 support deprecated in
the 1.6.0 release notes, then doing O.3 during the 1.7.0-SNAPSHOT
period.

Thanks,
Cheng Pan

On Wed, Jun 29, 2022 at 2:59 PM Kent Yao <ya...@apache.org> wrote:

Hi team,

For our master branch, a.k.a. v1.6.0, we now test and build against
multiple Spark versions, including Spark 3.0.x/3.1.x/3.2.x/3.3.x
running with Java 8 and 11, as well as cross-version tests with a
Spark 3.2-compiled engine running on Spark 3.0.x/3.1.x/3.3.x distributions.

Considering that Spark 3.0 is EOL per the Spark community and the heavy
workload it puts on our CI, we should drop Spark 3.0.x support step by step.

Here are the options/steps I have been thinking of:

O.1. Drop only the CI jobs that compile/build/test against Spark 3.0.x with Java 8 and 11.
O.2. Drop the CI job that runs the Spark 3.2-compiled engine on the Spark 3.0
distribution, plus O.1.
O.3. Drop the spark-3.0 Maven profile, plus O.2.

Best Regards

Kent

Re: [DISCUSS] Spark 3.0 Support EOL Progress

Posted by Cheng Pan <pa...@gmail.com>.
Thanks Kent for bringing this up.

I vote for O.3, and suggest declaring Spark 3.0 support deprecated in
the 1.6.0 release notes, then doing O.3 during the 1.7.0-SNAPSHOT
period.

Thanks,
Cheng Pan
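
As an illustration of the deprecate-in-1.6.0, remove-in-1.7.0 idea, here is a minimal Scala sketch (not Kyuubi's actual code; the object and method names are hypothetical) that builds a deprecation warning whenever the detected Spark runtime is a 3.0.x release:

object Spark30DeprecationNotice {

  // Returns a deprecation warning when the detected Spark runtime is a 3.0.x release.
  def deprecationWarning(sparkVersion: String): Option[String] = {
    if (sparkVersion.startsWith("3.0.")) {
      Some(
        s"Support for Spark $sparkVersion is deprecated in Kyuubi 1.6.0 " +
          "and is planned for removal in 1.7.0; please upgrade to Spark 3.1 or later.")
    } else {
      None
    }
  }
}

// Example usage:
//   Spark30DeprecationNotice.deprecationWarning("3.0.3").foreach(msg => Console.err.println(msg))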

On Wed, Jun 29, 2022 at 2:59 PM Kent Yao <ya...@apache.org> wrote:
>
> Hi team,
>
> For our master branch, a.k.a. v1.6.0, we now test and build against
> multiple Spark versions, including Spark 3.0.x/3.1.x/3.2.x/3.3.x
> running with Java 8 and 11, as well as cross-version tests with a
> Spark 3.2-compiled engine running on Spark 3.0.x/3.1.x/3.3.x distributions.
>
> Considering that Spark 3.0 is EOL per the Spark community and the heavy
> workload it puts on our CI, we should drop Spark 3.0.x support step by step.
>
> Here are the options/steps I have been thinking of:
>
> O.1. Drop only the CI jobs that compile/build/test against Spark 3.0.x with Java 8 and 11.
> O.2. Drop the CI job that runs the Spark 3.2-compiled engine on the Spark 3.0
> distribution, plus O.1.
> O.3. Drop the spark-3.0 Maven profile, plus O.2.
>
> Best Regards
>
> Kent