Posted to user@spark.apache.org by Dongjoon Hyun <do...@gmail.com> on 2019/07/09 16:15:24 UTC

Release Apache Spark 2.4.4 before 3.0.0

Hi, All.

Spark 2.4.3 was released two months ago (8th May).

As of today (9th July), there are 45 fixes in `branch-2.4`, including the
following correctness or blocker issues.

    - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for
decimals not fitting in long (see the sketch after this list)
    - SPARK-26045 Error in the spark 2.4 release package with the
spark-avro_2.11 dependency
    - SPARK-27798 from_avro can modify variables in other rows in local mode
    - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
    - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist entries
    - SPARK-28308 CalendarInterval sub-second part should be padded before
parsing
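
As a concrete illustration of the first item (SPARK-26038), here is a
minimal sketch of the affected conversion path. It is my own example, not
code from the JIRA; any decimal whose integral part does not fit in a
64-bit long would do.

    // Minimal sketch for SPARK-26038 (illustrative only).
    import org.apache.spark.sql.types.Decimal

    object Spark26038Sketch {
      def main(args: Array[String]): Unit = {
        // A decimal whose integral part does not fit in a long.
        val big = Decimal(BigDecimal("1234567890123456789012345678.90"))
        // On affected 2.4.x builds these conversions could return a wrong
        // (overflowed) value; after the branch-2.4 fix they are expected to
        // preserve the full integral part.
        println(big.toScalaBigInt)     // expected: 1234567890123456789012345678
        println(big.toJavaBigInteger)  // expected: the same value as a java.math.BigInteger
      }
    }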

It would be great if we could get Spark 2.4.4 out before we all get busier
with 3.0.0.
If it's okay, I'd like to volunteer as the 2.4.4 release manager and roll it
next Monday (15th July).
What do you think?

Bests,
Dongjoon.

Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Dongjoon Hyun <do...@gmail.com>.
Additionally, one more correctness patch landed yesterday.

    - SPARK-28015 Check stringToDate() consumes entire input for the yyyy
and yyyy-[m]m formats
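
To show why this is a correctness fix, here is a minimal sketch of the kind
of cast it affects. It is my own illustration, inferred from the issue
title; the input string is hypothetical.

    // Minimal sketch for SPARK-28015 (illustrative only).
    import org.apache.spark.sql.SparkSession

    object Spark28015Sketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[1]")
          .appName("SPARK-28015 sketch")
          .getOrCreate()
        // A yyyy-[m]m prefix followed by trailing characters. Before the fix,
        // stringToDate() could stop after the valid prefix and accept the
        // cast; with the fix the whole input must be consumed, so this is
        // expected to return NULL rather than 2019-07-01.
        spark.sql("SELECT CAST('2019-7 trailing text' AS DATE) AS d").show()
        spark.stop()
      }
    }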

Bests,
Dongjoon.


On Tue, Jul 9, 2019 at 10:11 AM Dongjoon Hyun <do...@gmail.com>
wrote:

> Thank you for the reply, Sean. Sure. 2.4.x should be a LTS version.
>
> The main reason of 2.4.4 release (before 3.0.0) is to have a better basis
> for comparison to 3.0.0.
> For example, SPARK-27798 had an old bug, but its correctness issue is only
> exposed at Spark 2.4.3.
> It would be great if we can have a better basis.
>
> Bests,
> Dongjoon.
>
>
> On Tue, Jul 9, 2019 at 9:52 AM Sean Owen <sr...@gmail.com> wrote:
>
>> We will certainly want a 2.4.4 release eventually. In fact I'd expect
>> 2.4.x gets maintained for longer than the usual 18 months, as it's the
>> last 2.x branch.
>> It doesn't need to happen before 3.0, but could. Usually maintenance
>> releases happen 3-4 months apart and the last one was 2 months ago. If
>> these are significant issues, sure. It'll probably be August before
>> it's out anyway.
>>
>> On Tue, Jul 9, 2019 at 11:15 AM Dongjoon Hyun <do...@gmail.com>
>> wrote:
>> >
>> > Hi, All.
>> >
>> > Spark 2.4.3 was released two months ago (8th May).
>> >
>> > As of today (9th July), there exist 45 fixes in `branch-2.4` including
>> the following correctness or blocker issues.
>> >
>> >     - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for
>> decimals not fitting in long
>> >     - SPARK-26045 Error in the spark 2.4 release package with the
>> spark-avro_2.11 dependency
>> >     - SPARK-27798 from_avro can modify variables in other rows in local
>> mode
>> >     - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
>> >     - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist
>> entries
>> >     - SPARK-28308 CalendarInterval sub-second part should be padded
>> before parsing
>> >
>> > It would be great if we can have Spark 2.4.4 before we are going to get
>> busier for 3.0.0.
>> > If it's okay, I'd like to volunteer for an 2.4.4 release manager to
>> roll it next Monday. (15th July).
>> > How do you think about this?
>> >
>> > Bests,
>> > Dongjoon.
>>
>

Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Dongjoon Hyun <do...@gmail.com>.
Thank you for the reply, Sean. Sure, 2.4.x should be an LTS version.

The main reason for a 2.4.4 release (before 3.0.0) is to have a better basis
for comparison with 3.0.0.
For example, SPARK-27798 is an old bug, but its correctness issue is only
exposed in Spark 2.4.3.
It would be great to have that better basis.

Bests,
Dongjoon.


On Tue, Jul 9, 2019 at 9:52 AM Sean Owen <sr...@gmail.com> wrote:

> We will certainly want a 2.4.4 release eventually. In fact I'd expect
> 2.4.x gets maintained for longer than the usual 18 months, as it's the
> last 2.x branch.
> It doesn't need to happen before 3.0, but could. Usually maintenance
> releases happen 3-4 months apart and the last one was 2 months ago. If
> these are significant issues, sure. It'll probably be August before
> it's out anyway.
>
> On Tue, Jul 9, 2019 at 11:15 AM Dongjoon Hyun <do...@gmail.com>
> wrote:
> >
> > Hi, All.
> >
> > Spark 2.4.3 was released two months ago (8th May).
> >
> > As of today (9th July), there exist 45 fixes in `branch-2.4` including
> the following correctness or blocker issues.
> >
> >     - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for
> decimals not fitting in long
> >     - SPARK-26045 Error in the spark 2.4 release package with the
> spark-avro_2.11 dependency
> >     - SPARK-27798 from_avro can modify variables in other rows in local
> mode
> >     - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
> >     - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist
> entries
> >     - SPARK-28308 CalendarInterval sub-second part should be padded
> before parsing
> >
> > It would be great if we can have Spark 2.4.4 before we are going to get
> busier for 3.0.0.
> > If it's okay, I'd like to volunteer for an 2.4.4 release manager to roll
> it next Monday. (15th July).
> > How do you think about this?
> >
> > Bests,
> > Dongjoon.
>

Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Sean Owen <sr...@gmail.com>.
We will certainly want a 2.4.4 release eventually. In fact, I'd expect
2.4.x to be maintained for longer than the usual 18 months, as it's the
last 2.x branch.
It doesn't need to happen before 3.0, but it could. Usually maintenance
releases happen 3-4 months apart, and the last one was 2 months ago. If
these are significant issues, sure. It'll probably be August before
it's out anyway.

On Tue, Jul 9, 2019 at 11:15 AM Dongjoon Hyun <do...@gmail.com> wrote:
>
> Hi, All.
>
> Spark 2.4.3 was released two months ago (8th May).
>
> As of today (9th July), there exist 45 fixes in `branch-2.4` including the following correctness or blocker issues.
>
>     - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for decimals not fitting in long
>     - SPARK-26045 Error in the spark 2.4 release package with the spark-avro_2.11 dependency
>     - SPARK-27798 from_avro can modify variables in other rows in local mode
>     - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
>     - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist entries
>     - SPARK-28308 CalendarInterval sub-second part should be padded before parsing
>
> It would be great if we can have Spark 2.4.4 before we are going to get busier for 3.0.0.
> If it's okay, I'd like to volunteer for an 2.4.4 release manager to roll it next Monday. (15th July).
> How do you think about this?
>
> Bests,
> Dongjoon.



Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Stavros Kontopoulos <st...@lightbend.com>.
Hi Dongjoon,

Should we also consider fixing
https://issues.apache.org/jira/browse/SPARK-27812 before the cut?

Best,
Stavros

On Mon, Jul 15, 2019 at 7:04 PM Dongjoon Hyun <do...@gmail.com>
wrote:

> Hi, Apache Spark PMC members.
>
> Can we cut Apache Spark 2.4.4 next Monday (22nd July)?
>
> Bests,
> Dongjoon.
>
>
> On Fri, Jul 12, 2019 at 3:18 PM Dongjoon Hyun <do...@gmail.com>
> wrote:
>
>> Thank you, Jacek.
>>
>> BTW, I added `@private` since we need PMC's help to make an Apache Spark
>> release.
>>
>> Can I get more feedbacks from the other PMC members?
>>
>> Please me know if you have any concerns (e.g. Release date or Release
>> manager?)
>>
>> As one of the community members, I assumed the followings (if we are on
>> schedule).
>>
>> - 2.4.4 at the end of July
>> - 2.3.4 at the end of August (since 2.3.0 was released at the end of
>> February 2018)
>> - 3.0.0 (possibily September?)
>> - 3.1.0 (January 2020?)
>>
>> Bests,
>> Dongjoon.
>>
>>
>> On Thu, Jul 11, 2019 at 1:30 PM Jacek Laskowski <ja...@japila.pl> wrote:
>>
>>> Hi,
>>>
>>> Thanks Dongjoon Hyun for stepping up as a release manager!
>>> Much appreciated.
>>>
>>> If there's a volunteer to cut a release, I'm always to support it.
>>>
>>> In addition, the more frequent releases the better for end users so they
>>> have a choice to upgrade and have all the latest fixes or wait. It's their
>>> call not ours (when we'd keep them waiting).
>>>
>>> My big 2 yes'es for the release!
>>>
>>> Jacek
>>>
>>>
>>> On Tue, 9 Jul 2019, 18:15 Dongjoon Hyun, <do...@gmail.com>
>>> wrote:
>>>
>>>> Hi, All.
>>>>
>>>> Spark 2.4.3 was released two months ago (8th May).
>>>>
>>>> As of today (9th July), there exist 45 fixes in `branch-2.4` including
>>>> the following correctness or blocker issues.
>>>>
>>>>     - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for
>>>> decimals not fitting in long
>>>>     - SPARK-26045 Error in the spark 2.4 release package with the
>>>> spark-avro_2.11 dependency
>>>>     - SPARK-27798 from_avro can modify variables in other rows in local
>>>> mode
>>>>     - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
>>>>     - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist
>>>> entries
>>>>     - SPARK-28308 CalendarInterval sub-second part should be padded
>>>> before parsing
>>>>
>>>> It would be great if we can have Spark 2.4.4 before we are going to get
>>>> busier for 3.0.0.
>>>> If it's okay, I'd like to volunteer for an 2.4.4 release manager to
>>>> roll it next Monday. (15th July).
>>>> How do you think about this?
>>>>
>>>> Bests,
>>>> Dongjoon.
>>>>
>>>

Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Dongjoon Hyun <do...@gmail.com>.
Hi, Apache Spark PMC members.

Can we cut Apache Spark 2.4.4 next Monday (22nd July)?

Bests,
Dongjoon.


On Fri, Jul 12, 2019 at 3:18 PM Dongjoon Hyun <do...@gmail.com>
wrote:

> Thank you, Jacek.
>
> BTW, I added `@private` since we need PMC's help to make an Apache Spark
> release.
>
> Can I get more feedbacks from the other PMC members?
>
> Please me know if you have any concerns (e.g. Release date or Release
> manager?)
>
> As one of the community members, I assumed the followings (if we are on
> schedule).
>
> - 2.4.4 at the end of July
> - 2.3.4 at the end of August (since 2.3.0 was released at the end of
> February 2018)
> - 3.0.0 (possibily September?)
> - 3.1.0 (January 2020?)
>
> Bests,
> Dongjoon.
>
>
> On Thu, Jul 11, 2019 at 1:30 PM Jacek Laskowski <ja...@japila.pl> wrote:
>
>> Hi,
>>
>> Thanks Dongjoon Hyun for stepping up as a release manager!
>> Much appreciated.
>>
>> If there's a volunteer to cut a release, I'm always to support it.
>>
>> In addition, the more frequent releases the better for end users so they
>> have a choice to upgrade and have all the latest fixes or wait. It's their
>> call not ours (when we'd keep them waiting).
>>
>> My big 2 yes'es for the release!
>>
>> Jacek
>>
>>
>> On Tue, 9 Jul 2019, 18:15 Dongjoon Hyun, <do...@gmail.com> wrote:
>>
>>> Hi, All.
>>>
>>> Spark 2.4.3 was released two months ago (8th May).
>>>
>>> As of today (9th July), there exist 45 fixes in `branch-2.4` including
>>> the following correctness or blocker issues.
>>>
>>>     - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for
>>> decimals not fitting in long
>>>     - SPARK-26045 Error in the spark 2.4 release package with the
>>> spark-avro_2.11 dependency
>>>     - SPARK-27798 from_avro can modify variables in other rows in local
>>> mode
>>>     - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
>>>     - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist
>>> entries
>>>     - SPARK-28308 CalendarInterval sub-second part should be padded
>>> before parsing
>>>
>>> It would be great if we can have Spark 2.4.4 before we are going to get
>>> busier for 3.0.0.
>>> If it's okay, I'd like to volunteer for an 2.4.4 release manager to roll
>>> it next Monday. (15th July).
>>> How do you think about this?
>>>
>>> Bests,
>>> Dongjoon.
>>>
>>

Re: Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Dongjoon Hyun <do...@gmail.com>.
Thank you for volunteering as the 2.3.4 release manager, Kazuaki!
It's great to see a new release manager lined up in advance. :D

Thank you for the reply, Stavros.
In addition to that issue, I'm also monitoring some other K8s issues and
PRs.
But I'm not sure we can include that, because some PRs seem to be failing
to build consensus (even for 3.0.0).
In any case, could you ping the reviewers once more on the PRs you have
concerns about?
If it is merged into `branch-2.4`, it will of course be in Apache Spark 2.4.4.

Bests,
Dongjoon.


On Tue, Jul 16, 2019 at 4:00 AM Kazuaki Ishizaki <IS...@jp.ibm.com>
wrote:

> Thank you Dongjoon for being a release manager.
>
> If the assumed dates are ok, I would like to volunteer for an 2.3.4
> release manager.
>
> Best Regards,
> Kazuaki Ishizaki,
>
>
>
> From:        Dongjoon Hyun <do...@gmail.com>
> To:        dev <de...@spark.apache.org>, "user @spark" <
> user@spark.apache.org>, Apache Spark PMC <pr...@spark.apache.org>
> Date:        2019/07/13 07:18
> Subject:        [EXTERNAL] Re: Release Apache Spark 2.4.4 before 3.0.0
> ------------------------------
>
>
>
> Thank you, Jacek.
>
> BTW, I added `@private` since we need PMC's help to make an Apache Spark
> release.
>
> Can I get more feedbacks from the other PMC members?
>
> Please me know if you have any concerns (e.g. Release date or Release
> manager?)
>
> As one of the community members, I assumed the followings (if we are on
> schedule).
>
> - 2.4.4 at the end of July
> - 2.3.4 at the end of August (since 2.3.0 was released at the end of
> February 2018)
> - 3.0.0 (possibily September?)
> - 3.1.0 (January 2020?)
>
> Bests,
> Dongjoon.
>
>
> On Thu, Jul 11, 2019 at 1:30 PM Jacek Laskowski <ja...@japila.pl> wrote:
> Hi,
>
> Thanks Dongjoon Hyun for stepping up as a release manager!
> Much appreciated.
>
> If there's a volunteer to cut a release, I'm always to support it.
>
> In addition, the more frequent releases the better for end users so they
> have a choice to upgrade and have all the latest fixes or wait. It's their
> call not ours (when we'd keep them waiting).
>
> My big 2 yes'es for the release!
>
> Jacek
>
>
> On Tue, 9 Jul 2019, 18:15 Dongjoon Hyun, <do...@gmail.com> wrote:
> Hi, All.
>
> Spark 2.4.3 was released two months ago (8th May).
>
> As of today (9th July), there exist 45 fixes in `branch-2.4` including the
> following correctness or blocker issues.
>
>     - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for
> decimals not fitting in long
>     - SPARK-26045 Error in the spark 2.4 release package with the
> spark-avro_2.11 dependency
>     - SPARK-27798 from_avro can modify variables in other rows in local
> mode
>     - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
>     - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist entries
>     - SPARK-28308 CalendarInterval sub-second part should be padded before
> parsing
>
> It would be great if we can have Spark 2.4.4 before we are going to get
> busier for 3.0.0.
> If it's okay, I'd like to volunteer for an 2.4.4 release manager to roll
> it next Monday. (15th July).
> How do you think about this?
>
> Bests,
> Dongjoon.
>
>

Re: Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Kazuaki Ishizaki <IS...@jp.ibm.com>.
Thank you, Dongjoon, for being a release manager.

If the assumed dates are OK, I would like to volunteer as the 2.3.4
release manager.

Best Regards,
Kazuaki Ishizaki,



From:   Dongjoon Hyun <do...@gmail.com>
To:     dev <de...@spark.apache.org>, "user @spark" <us...@spark.apache.org>, 
Apache Spark PMC <pr...@spark.apache.org>
Date:   2019/07/13 07:18
Subject:        [EXTERNAL] Re: Release Apache Spark 2.4.4 before 3.0.0



Thank you, Jacek.

BTW, I added `@private` since we need PMC's help to make an Apache Spark 
release.

Can I get more feedbacks from the other PMC members?

Please me know if you have any concerns (e.g. Release date or Release 
manager?)

As one of the community members, I assumed the followings (if we are on 
schedule).

- 2.4.4 at the end of July
- 2.3.4 at the end of August (since 2.3.0 was released at the end of 
February 2018)
- 3.0.0 (possibily September?)
- 3.1.0 (January 2020?)

Bests,
Dongjoon.


On Thu, Jul 11, 2019 at 1:30 PM Jacek Laskowski <ja...@japila.pl> wrote:
Hi,

Thanks Dongjoon Hyun for stepping up as a release manager! 
Much appreciated. 

If there's a volunteer to cut a release, I'm always to support it.

In addition, the more frequent releases the better for end users so they 
have a choice to upgrade and have all the latest fixes or wait. It's their 
call not ours (when we'd keep them waiting).

My big 2 yes'es for the release!

Jacek


On Tue, 9 Jul 2019, 18:15 Dongjoon Hyun, <do...@gmail.com> wrote:
Hi, All.

Spark 2.4.3 was released two months ago (8th May).

As of today (9th July), there exist 45 fixes in `branch-2.4` including the 
following correctness or blocker issues.

    - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for 
decimals not fitting in long
    - SPARK-26045 Error in the spark 2.4 release package with the 
spark-avro_2.11 dependency
    - SPARK-27798 from_avro can modify variables in other rows in local 
mode
    - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
    - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist entries
    - SPARK-28308 CalendarInterval sub-second part should be padded before 
parsing

It would be great if we can have Spark 2.4.4 before we are going to get 
busier for 3.0.0.
If it's okay, I'd like to volunteer for an 2.4.4 release manager to roll 
it next Monday. (15th July).
How do you think about this?

Bests,
Dongjoon.



Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Dongjoon Hyun <do...@gmail.com>.
Thank you, Jacek.

BTW, I added `@private` since we need PMC's help to make an Apache Spark
release.

Can I get more feedback from the other PMC members?

Please let me know if you have any concerns (e.g., the release date or the
release manager).

As one of the community members, I'm assuming the following (if we are on
schedule).

- 2.4.4 at the end of July
- 2.3.4 at the end of August (since 2.3.0 was released at the end of
February 2018)
- 3.0.0 (possibly September?)
- 3.1.0 (January 2020?)

Bests,
Dongjoon.


On Thu, Jul 11, 2019 at 1:30 PM Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> Thanks Dongjoon Hyun for stepping up as a release manager!
> Much appreciated.
>
> If there's a volunteer to cut a release, I'm always to support it.
>
> In addition, the more frequent releases the better for end users so they
> have a choice to upgrade and have all the latest fixes or wait. It's their
> call not ours (when we'd keep them waiting).
>
> My big 2 yes'es for the release!
>
> Jacek
>
>
> On Tue, 9 Jul 2019, 18:15 Dongjoon Hyun, <do...@gmail.com> wrote:
>
>> Hi, All.
>>
>> Spark 2.4.3 was released two months ago (8th May).
>>
>> As of today (9th July), there exist 45 fixes in `branch-2.4` including
>> the following correctness or blocker issues.
>>
>>     - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for
>> decimals not fitting in long
>>     - SPARK-26045 Error in the spark 2.4 release package with the
>> spark-avro_2.11 dependency
>>     - SPARK-27798 from_avro can modify variables in other rows in local
>> mode
>>     - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
>>     - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist entries
>>     - SPARK-28308 CalendarInterval sub-second part should be padded
>> before parsing
>>
>> It would be great if we can have Spark 2.4.4 before we are going to get
>> busier for 3.0.0.
>> If it's okay, I'd like to volunteer for an 2.4.4 release manager to roll
>> it next Monday. (15th July).
>> How do you think about this?
>>
>> Bests,
>> Dongjoon.
>>
>

Re: Release Apache Spark 2.4.4 before 3.0.0

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

Thanks, Dongjoon Hyun, for stepping up as a release manager!
Much appreciated.

If there's a volunteer to cut a release, I'm always happy to support it.

In addition, the more frequent the releases, the better for end users: they
get the choice to upgrade and pick up all the latest fixes, or to wait. It's
their call, not ours (otherwise we'd be keeping them waiting).

My two big yeses for the release!

Jacek


On Tue, 9 Jul 2019, 18:15 Dongjoon Hyun, <do...@gmail.com> wrote:

> Hi, All.
>
> Spark 2.4.3 was released two months ago (8th May).
>
> As of today (9th July), there exist 45 fixes in `branch-2.4` including the
> following correctness or blocker issues.
>
>     - SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for
> decimals not fitting in long
>     - SPARK-26045 Error in the spark 2.4 release package with the
> spark-avro_2.11 dependency
>     - SPARK-27798 from_avro can modify variables in other rows in local
> mode
>     - SPARK-27907 HiveUDAF should return NULL in case of 0 rows
>     - SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist entries
>     - SPARK-28308 CalendarInterval sub-second part should be padded before
> parsing
>
> It would be great if we can have Spark 2.4.4 before we are going to get
> busier for 3.0.0.
> If it's okay, I'd like to volunteer for an 2.4.4 release manager to roll
> it next Monday. (15th July).
> How do you think about this?
>
> Bests,
> Dongjoon.
>
