Posted to dev@systemds.apache.org by se...@bgaard.dk on 2022/12/07 12:01:22 UTC

Update spark version

Hi all,

Just to let everyone know:

I am considering updating the Spark and Hadoop versions of our system;
would this interfere with any ongoing work?

Spark 3.2.0 -> 3.3.1
Hadoop 3.3.3 -> 3.3.4
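
Assuming the versions are managed as Maven properties in our pom.xml (the
property names below follow the common convention and are not necessarily
the ones SystemDS actually uses), the bump would look roughly like:

```xml
<!-- sketch only: illustrative property names, not the exact SystemDS pom.xml -->
<properties>
  <spark.version>3.3.1</spark.version>   <!-- was 3.2.0 -->
  <hadoop.version>3.3.4</hadoop.version> <!-- was 3.3.3 -->
</properties>
```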

best regards
Sebastian


Re: Update spark version

Posted by se...@bgaard.dk.
Regarding compatibility: sure, agreed.

The updates are mainly security fixes. The Hadoop 3.3.3 -> 3.3.4 bump is
purely security related, while the Spark 3.2.0 -> 3.3.1 bump is both a
security and a minor feature update.

I think both are fair updates to make that do not require a major release.
If we do not update, we also risk someone pointing out the known security
vulnerabilities in an upcoming release.
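
For reference, one way to surface such vulnerabilities before a release is
the OWASP dependency-check Maven plugin, which scans declared dependencies
for published CVEs (a sketch only; I am not assuming SystemDS already uses
this plugin, and the plugin version shown is just a recent one):

```xml
<!-- sketch: fails/reports on known CVEs in dependencies during "mvn verify" -->
<plugin>
  <groupId>org.owasp</groupId>
  <artifactId>dependency-check-maven</artifactId>
  <version>7.4.4</version>
  <executions>
    <execution>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```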

best regards
Sebastian



On 7 December 2022 at 13:43:50 +01:00, Matthias Boehm <mb...@gmail.com> wrote:

> yes, backwards compatibility is important to keep in mind. In the past, we mentioned the *minimum* Spark/Hadoop versions SystemML/SystemDS required, while still being able to run with more recent versions.
> 
> So let's separate two different aspects here: (1) the minimum version compatible with our source code, and (2) the dependencies we reference for tests and build. Although Spark and Hadoop are generally very good in terms of backwards compatibility inside major versions, it would be good to keep (1) and (2) in sync to avoid hidden incompatibilities.
> 
> Are there specific reasons for this update - API changes, security warnings, etc?
> 
> Regards,
> Matthias
> 
> On 12/7/2022 1:06 PM, arnab phani wrote:
> 
> > Can these upgrades harm the backward compatibility of the next SystemDS
> > release?
> > If so, then we either need to make a major release or delay the upgrades
> > till the next major release.
> > 
> > Regards,
> > Arnab..
> > 
> > > On Wed, Dec 7, 2022 at 1:01 PM <s...@bgaard.dk> wrote:
> > 
> > 
> > > Hi all,
> > > 
> > > Just to let everyone know:
> > > 
> > > I am considering updating the Spark and Hadoop versions of our system;
> > > would this interfere with any ongoing work?
> > > 
> > > Spark 3.2.0 -> 3.3.1
> > > Hadoop 3.3.3 -> 3.3.4
> > > 
> > > best regards
> > > Sebastian
> > > 


Re: Update spark version

Posted by Matthias Boehm <mb...@gmail.com>.
yes, backwards compatibility is important to keep in mind. In the past, 
we mentioned the *minimum* Spark/Hadoop versions SystemML/SystemDS 
required, while still being able to run with more recent versions.

So let's separate two different aspects here: (1) the minimum version 
compatible with our source code, and (2) the dependencies we reference 
for tests and build. Although Spark and Hadoop are generally very good 
in terms of backwards compatibility inside major versions, it would be 
good to keep (1) and (2) in sync to avoid hidden incompatibilities.
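
A minimal sketch of making (1) explicit in code: a runtime guard that
compares the Spark/Hadoop version reported at startup against the
documented minimum. The class and method names here are purely
illustrative, not actual SystemDS code:

```java
// Sketch: compare dotted version strings numerically so a runtime
// Spark/Hadoop version below the documented minimum is caught early.
// Names are illustrative, not taken from the SystemDS code base.
public class VersionCheck {
    /** Returns true if 'actual' >= 'minimum' (e.g. "3.3.1" >= "3.2.0"). */
    public static boolean atLeast(String actual, String minimum) {
        String[] a = actual.split("\\.");
        String[] m = minimum.split("\\.");
        int n = Math.max(a.length, m.length);
        for (int i = 0; i < n; i++) {
            int ai = i < a.length ? Integer.parseInt(a[i]) : 0;
            int mi = i < m.length ? Integer.parseInt(m[i]) : 0;
            if (ai != mi)
                return ai > mi;
        }
        return true; // versions are equal
    }

    public static void main(String[] args) {
        // In practice the actual version would come from the live runtime,
        // e.g. SparkContext's version string.
        System.out.println(atLeast("3.3.1", "3.2.0")); // true
        System.out.println(atLeast("3.1.3", "3.2.0")); // false
    }
}
```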

Are there specific reasons for this update - API changes, security 
warnings, etc?

Regards,
Matthias

On 12/7/2022 1:06 PM, arnab phani wrote:
> Can these upgrades harm the backward compatibility of the next SystemDS
> release?
> If so, then we either need to make a major release or delay the upgrades
> till the next major release.
>
> Regards,
> Arnab..
>
> On Wed, Dec 7, 2022 at 1:01 PM <se...@bgaard.dk> wrote:
>
>> Hi all,
>>
>> Just to let everyone know:
>>
>> I am considering updating the Spark and Hadoop versions of our system;
>> would this interfere with any ongoing work?
>>
>> Spark 3.2.0 -> 3.3.1
>> Hadoop 3.3.3 -> 3.3.4
>>
>> best regards
>> Sebastian
>>


Re: Update spark version

Posted by arnab phani <ph...@gmail.com>.
Can these upgrades harm the backward compatibility of the next SystemDS
release?
If so, then we either need to make a major release or delay the upgrades
till the next major release.

Regards,
Arnab..

On Wed, Dec 7, 2022 at 1:01 PM <se...@bgaard.dk> wrote:

> Hi all,
>
> Just to let everyone know:
>
> I am considering updating the Spark and Hadoop versions of our system;
> would this interfere with any ongoing work?
>
> Spark 3.2.0 -> 3.3.1
> Hadoop 3.3.3 -> 3.3.4
>
> best regards
> Sebastian
>