Posted to common-dev@hadoop.apache.org by Gabor Bota <ga...@cloudera.com.INVALID> on 2019/04/01 19:35:34 UTC

Update guava to 27.0-jre in hadoop-project

Hi devs,

I'm working on upgrading the guava version from 11.0.2 to 27.0-jre in hadoop-project.
We need to do the upgrade because of CVE-2018-10237
<https://nvd.nist.gov/vuln/detail/CVE-2018-10237>.
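
The version bump itself is essentially a one-line change to the dependency management in hadoop-project/pom.xml. A sketch of what that looks like (the guava.version property name is my assumption; check the actual POM):

```xml
<!-- hadoop-project/pom.xml (sketch; property name assumed) -->
<properties>
  <guava.version>27.0-jre</guava.version>
</properties>

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>${guava.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Every module that declares guava without an explicit version then inherits 27.0-jre from here.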

I've created an issue (HADOOP-15960
<https://issues.apache.org/jira/browse/HADOOP-15960>) to track progress and
created subtasks for hadoop branches 3.0, 3.1, 3.2 and trunk. The first
update should be done in the trunk, and then it can be backported to lower
version branches. Backporting to 2.x is not feasible right now, because
Guava 20 is the last Java 7-compatible version[1] and the 2.x branches
still require Java 7 compatibility - but we are planning to update (
HADOOP-16219 <https://issues.apache.org/jira/browse/HADOOP-16219>).

For the deprecation warnings newly introduced by the update, I've created
another issue (HADOOP-16222
<https://issues.apache.org/jira/browse/HADOOP-16222>). Those
can be fixed after the update is committed.
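
To give a concrete picture of the kind of cleanup HADOOP-16222 covers: one well-known change between Guava 11 and 27 is that Futures.addCallback deprecated (and later removed) the two-argument overload in favour of one taking an explicit Executor. This is an illustrative example of such a deprecation, not a confirmed call site in Hadoop:

```java
import com.google.common.util.concurrent.FutureCallback;
import com.google.common.util.concurrent.Futures;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.MoreExecutors;

class CallbackMigration {
  static void register(ListenableFuture<String> future,
                       FutureCallback<String> callback) {
    // Guava 11 style -- deprecated in later releases:
    //   Futures.addCallback(future, callback);

    // Guava 27 style: the callback's executor is passed explicitly.
    Futures.addCallback(future, callback, MoreExecutors.directExecutor());
  }
}
```

(This needs the Guava jar on the classpath, so it is a sketch rather than a drop-in test.)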

Unit and integration testing in hadoop trunk
There were modifications to the tests in the following modules, so the
precommit tests ran on Jenkins:

   - hadoop-common-project
   - hadoop-hdfs-project
   - hadoop-mapreduce-project
   - hadoop-yarn-project

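For anyone who wants to reproduce these runs locally, something along these lines should work from a Hadoop checkout. The exact Maven flags are my assumption, not copied from the precommit job:

```bash
# Run the unit tests of the touched modules; -pl limits the reactor to the
# listed modules, -am also builds the modules they depend on, and -fae
# keeps going after a failure so all results are collected.
mvn test -fae -am \
    -pl hadoop-common-project,hadoop-hdfs-project,hadoop-mapreduce-project,hadoop-yarn-project

# hadoop-tools had to be run separately because of the Jenkins time limit:
mvn test -fae -am -pl hadoop-tools
```
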
There was one failure, but the test passed when re-run locally, so the
failure is not related to the change.

Because of the 5-hour test time limit for the Jenkins precommit build, I had
to run the hadoop-tools tests manually; they were successful. You can find
test results for trunk under HADOOP-16210
<https://issues.apache.org/jira/browse/HADOOP-16210>.

Integration testing with other components
I've done testing with HBase master on hadoop branch-3.0 with guava 27, and
the tests ran fine. Thanks to Peter Somogyi for the help.
We are planning to do some testing with Peter Vary on Hive with branch-3.1
this week.

Thanks,
Gabor

[1]
https://groups.google.com/forum/#!msg/guava-discuss/ZRmDJnAq9T0/-HExv44eCAAJ

Re: Update guava to 27.0-jre in hadoop-project

Posted by Wei-Chiu Chuang <we...@cloudera.com.INVALID>.
+1

I watched Gabor work on this, and it is a very comprehensive effort, which
also includes testing with downstream projects (HBase and Hive). Very good work.

Thanks!

On Wed, Apr 3, 2019 at 3:41 AM Steve Loughran <st...@cloudera.com.invalid>
wrote:

> I am taking silence as happiness here.
>
> +1 to the patch
>
> On Tue, Apr 2, 2019 at 9:54 AM Steve Loughran <st...@cloudera.com> wrote:
>
> >
> > I know that the number of guava updates we could call painless is 0, but
> > we need to do this.
> >
> > The last time we successfully updated Guava was 2012:
> > https://issues.apache.org/jira/browse/HDFS-3187
> > That was the java 6 era
> >
> > The last unsuccessful attempt, April 2017:
> > https://issues.apache.org/jira/browse/HADOOP-14386
> >
> > Let's try again, and this time if there are problems say: sorry, but it's
> > time to move on.
> >
> > I think we should only worry about branch-3.2+ for now, though the other
> > branches could be lined up for those changes needed to ensure that
> > everything builds if you explicitly set the version (e.g. findbugs
> > changes).
> > Then we can worry about 3.1.x line, which is the 3.x branch most widely
> > picked up to date.
> >
> > I want to avoid branch-2 entirely, though as Gabor notes, I want to move
> > us on to java 8 builds there so that people can do a branch-2 build if
> they
> > need to.
> >
> > *Is everyone happy with the proposed patch*:
> > https://github.com/apache/hadoop/pull/674
> >
> > -Steve
> >
> >
> > On Mon, Apr 1, 2019 at 8:35 PM Gabor Bota <gabor.bota@cloudera.com
> .invalid>
> > wrote:
> >
> >> Hi devs,
> >>
> >> I'm working on the guava version from 11.0.2 to 27.0-jre in
> >> hadoop-project.
> >> We need to do the upgrade because of CVE-2018-10237
> >> <https://nvd.nist.gov/vuln/detail/CVE-2018-10237>.
> >>
> >> I've created an issue (HADOOP-15960
> >> <https://issues.apache.org/jira/browse/HADOOP-15960>) to track progress
> >> and
> >> created subtasks for hadoop branches 3.0, 3.1, 3.2 and trunk. The first
> >> update should be done in the trunk, and then it can be backported to
> lower
> >> version branches. Backporting to 2.x is not feasible right now, because
> >> Guava 20 is the last Java 7-compatible version[1], and we have Java 7
> >> compatibility on version 2 branches - but we are planning to update (
> >> HADOOP-16219 <https://issues.apache.org/jira/browse/HADOOP-16219>).
> >>
> >> For the new deprecations after the update, I've created another issue (
> >> HADOOP-16222 <https://issues.apache.org/jira/browse/HADOOP-16222>).
> Those
> >> can be fixed after the update is committed.
> >>
> >> Unit and integration testing in hadoop trunk
> >> There were modifications in the test in the following modules so
> >> precommit tests were running on jenkins:
> >>
> >>    - hadoop-common-project
> >>    - hadoop-hdfs-project
> >>    - hadoop-mapreduce-project
> >>    - hadoop-yarn-project
> >>
> >> There was one failure but after re-running the test locally it was
> >> successful, so not related to the change.
> >>
> >> Because of 5 hour test time limit for jenkins precommit build, I had to
> >> run
> >> tests on hadoop-tools manually and the tests were successful. You can
> find
> >> test results for trunk under HADOOP-16210
> >> <https://issues.apache.org/jira/browse/HADOOP-16210>.
> >>
> >> Integration testing with other components
> >> I've done testing with HBase master on hadoop branch-3.0 with guava 27,
> >> and
> >> the tests were running fine. Thanks to Peter Somogyi for help.
> >> We are planning to do some testing with Peter Vary on Hive with
> branch-3.1
> >> this week.
> >>
> >> Thanks,
> >> Gabor
> >>
> >> [1]
> >>
> >>
> https://groups.google.com/forum/#!msg/guava-discuss/ZRmDJnAq9T0/-HExv44eCAAJ
> >>
> >
>

Re: Update guava to 27.0-jre in hadoop-project

Posted by Steve Loughran <st...@cloudera.com.INVALID>.
I am taking silence as happiness here.

+1 to the patch

On Tue, Apr 2, 2019 at 9:54 AM Steve Loughran <st...@cloudera.com> wrote:

>
> I know that the number of guava updates we could call painless is 0, but
> we need to do this.
>
> The last time we successfully updated Guava was 2012:
> https://issues.apache.org/jira/browse/HDFS-3187
> That was the java 6 era
>
> The last unsuccessful attempt, April 2017:
> https://issues.apache.org/jira/browse/HADOOP-14386
>
> Let's try again, and this time if there are problems say: sorry, but it's
> time to move on.
>
> I think we should only worry about branch-3.2+ for now, though the other
> branches could be lined up for those changes needed to ensure that
> everything builds if you explicitly set the version (e.g. findbugs changes).
> Then we can worry about 3.1.x line, which is the 3.x branch most widely
> picked up to date.
>
> I want to avoid branch-2 entirely, though as Gabor notes, I want to move
> us on to java 8 builds there so that people can do a branch-2 build if they
> need to.
>
> *Is everyone happy with the proposed patch*:
> https://github.com/apache/hadoop/pull/674
>
> -Steve
>
>
> On Mon, Apr 1, 2019 at 8:35 PM Gabor Bota <ga...@cloudera.com.invalid>
> wrote:
>
>> Hi devs,
>>
>> I'm working on the guava version from 11.0.2 to 27.0-jre in
>> hadoop-project.
>> We need to do the upgrade because of CVE-2018-10237
>> <https://nvd.nist.gov/vuln/detail/CVE-2018-10237>.
>>
>> I've created an issue (HADOOP-15960
>> <https://issues.apache.org/jira/browse/HADOOP-15960>) to track progress
>> and
>> created subtasks for hadoop branches 3.0, 3.1, 3.2 and trunk. The first
>> update should be done in the trunk, and then it can be backported to lower
>> version branches. Backporting to 2.x is not feasible right now, because
>> Guava 20 is the last Java 7-compatible version[1], and we have Java 7
>> compatibility on version 2 branches - but we are planning to update (
>> HADOOP-16219 <https://issues.apache.org/jira/browse/HADOOP-16219>).
>>
>> For the new deprecations after the update, I've created another issue (
>> HADOOP-16222 <https://issues.apache.org/jira/browse/HADOOP-16222>). Those
>> can be fixed after the update is committed.
>>
>> Unit and integration testing in hadoop trunk
>> There were modifications in the test in the following modules so
>> precommit tests were running on jenkins:
>>
>>    - hadoop-common-project
>>    - hadoop-hdfs-project
>>    - hadoop-mapreduce-project
>>    - hadoop-yarn-project
>>
>> There was one failure but after re-running the test locally it was
>> successful, so not related to the change.
>>
>> Because of 5 hour test time limit for jenkins precommit build, I had to
>> run
>> tests on hadoop-tools manually and the tests were successful. You can find
>> test results for trunk under HADOOP-16210
>> <https://issues.apache.org/jira/browse/HADOOP-16210>.
>>
>> Integration testing with other components
>> I've done testing with HBase master on hadoop branch-3.0 with guava 27,
>> and
>> the tests were running fine. Thanks to Peter Somogyi for help.
>> We are planning to do some testing with Peter Vary on Hive with branch-3.1
>> this week.
>>
>> Thanks,
>> Gabor
>>
>> [1]
>>
>> https://groups.google.com/forum/#!msg/guava-discuss/ZRmDJnAq9T0/-HExv44eCAAJ
>>
>

Re: Update guava to 27.0-jre in hadoop-project

Posted by Steve Loughran <st...@cloudera.com.INVALID>.
I know that the number of guava updates we could call painless is 0, but we
need to do this.

The last time we successfully updated Guava was 2012:
https://issues.apache.org/jira/browse/HDFS-3187
That was the Java 6 era.

The last unsuccessful attempt, April 2017:
https://issues.apache.org/jira/browse/HADOOP-14386

Let's try again, and this time if there are problems say: sorry, but it's
time to move on.

I think we should only worry about branch-3.2+ for now, though the other
branches could be lined up for those changes needed to ensure that
everything builds if you explicitly set the version (e.g. findbugs changes).
Then we can worry about 3.1.x line, which is the 3.x branch most widely
picked up to date.

I want to avoid branch-2 entirely, though as Gabor notes, I want to move us
on to Java 8 builds there so that people can do a branch-2 build if they
need to.

*Is everyone happy with the proposed patch*:
https://github.com/apache/hadoop/pull/674

-Steve


On Mon, Apr 1, 2019 at 8:35 PM Gabor Bota <ga...@cloudera.com.invalid>
wrote:

> Hi devs,
>
> I'm working on updating the Guava version from 11.0.2 to 27.0-jre in
> hadoop-project.
> We need to do the upgrade because of CVE-2018-10237
> <https://nvd.nist.gov/vuln/detail/CVE-2018-10237>.
>
> I've created an issue (HADOOP-15960
> <https://issues.apache.org/jira/browse/HADOOP-15960>) to track progress
> and
> created subtasks for hadoop branches 3.0, 3.1, 3.2 and trunk. The first
> update should be done in the trunk, and then it can be backported to lower
> version branches. Backporting to 2.x is not feasible right now, because
> Guava 20 is the last Java 7-compatible version[1], and we have Java 7
> compatibility on the version 2 branches - but we are planning to update (
> HADOOP-16219 <https://issues.apache.org/jira/browse/HADOOP-16219>).
>
> For the new deprecations after the update, I've created another issue (
> HADOOP-16222 <https://issues.apache.org/jira/browse/HADOOP-16222>). Those
> can be fixed after the update is committed.
>
> Unit and integration testing in hadoop trunk
> I made modifications to tests in the following modules so that the
> precommit tests would run on Jenkins:
>
>    - hadoop-common-project
>    - hadoop-hdfs-project
>    - hadoop-mapreduce-project
>    - hadoop-yarn-project
>
> There was one failure, but the test passed when re-run locally, so it is
> not related to the change.
>
> Because of the 5-hour time limit on the Jenkins precommit build, I ran the
> hadoop-tools tests manually; they passed. You can find
> test results for trunk under HADOOP-16210
> <https://issues.apache.org/jira/browse/HADOOP-16210>.
>
> Integration testing with other components
> I've tested HBase master on hadoop branch-3.0 with Guava 27, and the tests
> passed. Thanks to Peter Somogyi for the help.
> We are planning to test Hive on branch-3.1 with Peter Vary this week.
>
> Thanks,
> Gabor
>
> [1]
>
> https://groups.google.com/forum/#!msg/guava-discuss/ZRmDJnAq9T0/-HExv44eCAAJ
>
