Posted to hdfs-dev@hadoop.apache.org by "张铎 (Duo Zhang)" <pa...@gmail.com> on 2019/09/21 00:24:45 UTC

Re: [NOTICE] Building trunk needs protoc 3.7.1

I think this one is already in place, so we have to upgrade...

https://issues.apache.org/jira/browse/HADOOP-16557

Wangda Tan <wh...@gmail.com> wrote on Sat, Sep 21, 2019 at 7:19 AM:

> Hi Vinay,
>
> A bit confused: I saw that HADOOP-13363 is still pending. Do we need to
> upgrade the protobuf version to 3.7.1 NOW, or once HADOOP-13363 is completed?
>
> Thanks,
> Wangda
>
> On Fri, Sep 20, 2019 at 8:11 AM Vinayakumar B <vi...@apache.org>
> wrote:
>
> > Hi All,
> >
> > A very long pending task, the protobuf upgrade, is happening in
> > HADOOP-13363. As part of that, the protobuf version is upgraded to 3.7.1.
> >
> > Please update your build environments to have 3.7.1 protobuf version.
> >
> > BUILDING.txt has been updated with the latest instructions.
> >
> > This prerequisite of updating the protoc dependency manually is required
> > until 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin', which
> > can dynamically resolve the required protoc executable.
> >
> > Dockerfile is being updated to have the latest 3.7.1 as the default protoc
> > for test environments.
> >
> > Thanks,
> > -Vinay
> >
>
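The quoted announcement asks everyone to move their build environments to protoc 3.7.1. A small shell sketch of the kind of pre-build check this implies (the helper name and messages are illustrative, not part of the Hadoop build scripts; in a real environment the input would come from `protoc --version`):

```shell
# Required protoc version per the announcement (BUILDING.txt on trunk).
required="3.7.1"

# Extract the version number from output like "libprotoc 3.7.1".
parse_protoc_version() {
  echo "$1" | awk '{print $2}'
}

# Stand-in for: actual=$(parse_protoc_version "$(protoc --version)")
actual=$(parse_protoc_version "libprotoc 3.7.1")

if [ "$actual" = "$required" ]; then
  echo "protoc OK: $actual"          # prints: protoc OK: 3.7.1
else
  echo "protoc mismatch: found $actual, need $required" >&2
  exit 1
fi
```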

Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by "张铎 (Duo Zhang)" <pa...@gmail.com>.
The new protobuf plugin related issues have all been pushed to trunk (though
I think we'd better port them to all active branches).

So what's the next step? Shade and relocate protobuf? HBase has already
done this before, so I do not think it will take too much time. If we all
agree on the solution, I think we can finish this in one week.

But maybe a problem is: is it OK to upgrade protobuf in a minor release? Of
course, if we shade and relocate protobuf it will hurt users less, as they
can depend on protobuf 2.5 explicitly if they want, but it is still a bit
uncomfortable.

Thanks.

Wangda Tan <wh...@gmail.com> wrote on Tue, Sep 24, 2019 at 2:29 AM:

> Hi Vinay,
>
> Thanks for the clarification.
>
> Do you have a timeline about all you described works w.r.t.  the
> compatibility will be completed? I'm asking this is because we need to
> release 3.3.0 earlier if possible since there're 1k+ of patches in 3.3.0
> already, we should get it out earlier.
>
> If the PB work will take more time, do you think if we should create a
> branch for 3.3, revert PB changes from branch-3.3, and keep on working on
> PB for the next minor release? (or major release if we do see some
> compatibility issue in the future).
>
> Just my $0.02
>
> Thanks,
> Wangda
>
> On Mon, Sep 23, 2019 at 5:43 AM Steve Loughran <stevel@cloudera.com.invalid
> >
> wrote:
>
> > aah, that makes sense
> >
> > On Sun, Sep 22, 2019 at 6:11 PM Vinayakumar B <vi...@apache.org>
> > wrote:
> >
> > > Thanks Steve.
> > >
> > > Idea is not to shade all artifacts.
> > > Instead maintain one artifact ( hadoop-thirdparty) which have all such
> > > dependencies ( com.google.* may be), add  this artifact as dependency
> in
> > > hadoop modules. Use shaded classes directly in the code of hadoop
> modules
> > > instead of shading at package phase.
> > >
> > > Hbase, ozone and ratis already following this way. The artifact (
> > > hadoop-thirdparty) with shaded dependencies can be maintained in a
> > separate
> > > repo as suggested by stack on HADOOP-13363 or could be maintained as a
> > > separate module in Hadoop repo. If maintained in separate repo, need to
> > > build this only when there are changes related to shaded dependencies.
> > >
> > >
> > > -Vinay
> > >
> > > On Sun, 22 Sep 2019, 10:11 pm Steve Loughran, <st...@cloudera.com>
> > wrote:
> > >
> > > >
> > > >
> > > > On Sun, Sep 22, 2019 at 3:22 PM Vinayakumar B <
> vinayakumarb@apache.org
> > >
> > > > wrote:
> > > >
> > > >>    Protobuf provides Wire compatibility between releases.. but not
> > > >> guarantees the source compatibility in generated sources. There will
> > be
> > > a
> > > >> problem in compatibility if anyone uses generated protobuf message
> > > outside
> > > >> of Hadoop modules. Which ideally shouldn't be as generated sources
> are
> > > not
> > > >> public APIs.
> > > >>
> > > >>    There should not be any compatibility problems between releases
> in
> > > >> terms
> > > >> of communication provided both uses same syntax (proto2) of proto
> > > message.
> > > >> This I have verified by communication between protobuf 2.5.0 client
> > with
> > > >> protobuf 3.7.1 server.
> > > >>
> > > >>    To avoid the downstream transitive dependency classpath problem,
> > who
> > > >> might be using protobuf 2.5.0 classes, planning to shade the 3.7.1
> > > classes
> > > >> and its usages in all hadoop modules.. and keep 2.5.0 jar back in
> > hadoop
> > > >> classpath.
> > > >>
> > > >> Hope I have answered your question.
> > > >>
> > > >> -Vinay
> > > >>
> > > >>
> > > > While I support the move and CP isolation, this is going to (finally)
> > > > force us to make shaded versions of all artifacts which we publish
> with
> > > the
> > > > intent of them being loaded on the classpath of other applications
> > > >
> > >
> >
>
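Shading and relocating protobuf, as Duo describes HBase having done, is typically configured with the maven-shade-plugin. A minimal sketch (the relocation target package is an illustrative assumption, not taken from the actual hadoop-thirdparty build):

```xml
<!-- Sketch: relocate com.google.protobuf into a private package so the
     shaded 3.7.1 classes cannot collide with a user-supplied protobuf 2.5.
     The shadedPattern below is illustrative. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>org.apache.hadoop.thirdparty.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this relocation in place, downstream applications can still put vanilla protobuf 2.5 on their classpath without clashing with the shaded copy.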

Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by Wangda Tan <wh...@gmail.com>.
Hi Vinay,

Thanks for the clarification.

> Do you have a timeline for when all the compatibility work you described
> will be completed? I'm asking because we should release 3.3.0 as early as
> possible; there are already 1k+ patches in 3.3.0, so we should get it out
> soon.
>
> If the PB work will take more time, do you think we should create a branch
> for 3.3, revert the PB changes from branch-3.3, and keep working on PB for
> the next minor release? (Or a major release, if we do see some
> compatibility issues in the future.)

Just my $0.02

Thanks,
Wangda

On Mon, Sep 23, 2019 at 5:43 AM Steve Loughran <st...@cloudera.com.invalid>
wrote:

> aah, that makes sense
>
> On Sun, Sep 22, 2019 at 6:11 PM Vinayakumar B <vi...@apache.org>
> wrote:
>
> > Thanks Steve.
> >
> > Idea is not to shade all artifacts.
> > Instead maintain one artifact ( hadoop-thirdparty) which have all such
> > dependencies ( com.google.* may be), add  this artifact as dependency in
> > hadoop modules. Use shaded classes directly in the code of hadoop modules
> > instead of shading at package phase.
> >
> > Hbase, ozone and ratis already following this way. The artifact (
> > hadoop-thirdparty) with shaded dependencies can be maintained in a
> separate
> > repo as suggested by stack on HADOOP-13363 or could be maintained as a
> > separate module in Hadoop repo. If maintained in separate repo, need to
> > build this only when there are changes related to shaded dependencies.
> >
> >
> > -Vinay
> >
> > On Sun, 22 Sep 2019, 10:11 pm Steve Loughran, <st...@cloudera.com>
> wrote:
> >
> > >
> > >
> > > On Sun, Sep 22, 2019 at 3:22 PM Vinayakumar B <vinayakumarb@apache.org
> >
> > > wrote:
> > >
> > >>    Protobuf provides Wire compatibility between releases.. but not
> > >> guarantees the source compatibility in generated sources. There will
> be
> > a
> > >> problem in compatibility if anyone uses generated protobuf message
> > outside
> > >> of Hadoop modules. Which ideally shouldn't be as generated sources are
> > not
> > >> public APIs.
> > >>
> > >>    There should not be any compatibility problems between releases in
> > >> terms
> > >> of communication provided both uses same syntax (proto2) of proto
> > message.
> > >> This I have verified by communication between protobuf 2.5.0 client
> with
> > >> protobuf 3.7.1 server.
> > >>
> > >>    To avoid the downstream transitive dependency classpath problem,
> who
> > >> might be using protobuf 2.5.0 classes, planning to shade the 3.7.1
> > classes
> > >> and its usages in all hadoop modules.. and keep 2.5.0 jar back in
> hadoop
> > >> classpath.
> > >>
> > >> Hope I have answered your question.
> > >>
> > >> -Vinay
> > >>
> > >>
> > > While I support the move and CP isolation, this is going to (finally)
> > > force us to make shaded versions of all artifacts which we publish with
> > the
> > > intent of them being loaded on the classpath of other applications
> > >
> >
>

Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by Steve Loughran <st...@cloudera.com.INVALID>.
aah, that makes sense

On Sun, Sep 22, 2019 at 6:11 PM Vinayakumar B <vi...@apache.org>
wrote:

> Thanks Steve.
>
> Idea is not to shade all artifacts.
> Instead maintain one artifact ( hadoop-thirdparty) which have all such
> dependencies ( com.google.* may be), add  this artifact as dependency in
> hadoop modules. Use shaded classes directly in the code of hadoop modules
> instead of shading at package phase.
>
> Hbase, ozone and ratis already following this way. The artifact (
> hadoop-thirdparty) with shaded dependencies can be maintained in a separate
> repo as suggested by stack on HADOOP-13363 or could be maintained as a
> separate module in Hadoop repo. If maintained in separate repo, need to
> build this only when there are changes related to shaded dependencies.
>
>
> -Vinay
>
> On Sun, 22 Sep 2019, 10:11 pm Steve Loughran, <st...@cloudera.com> wrote:
>
> >
> >
> > On Sun, Sep 22, 2019 at 3:22 PM Vinayakumar B <vi...@apache.org>
> > wrote:
> >
> >>    Protobuf provides Wire compatibility between releases.. but not
> >> guarantees the source compatibility in generated sources. There will be
> a
> >> problem in compatibility if anyone uses generated protobuf message
> outside
> >> of Hadoop modules. Which ideally shouldn't be as generated sources are
> not
> >> public APIs.
> >>
> >>    There should not be any compatibility problems between releases in
> >> terms
> >> of communication provided both uses same syntax (proto2) of proto
> message.
> >> This I have verified by communication between protobuf 2.5.0 client with
> >> protobuf 3.7.1 server.
> >>
> >>    To avoid the downstream transitive dependency classpath problem, who
> >> might be using protobuf 2.5.0 classes, planning to shade the 3.7.1
> classes
> >> and its usages in all hadoop modules.. and keep 2.5.0 jar back in hadoop
> >> classpath.
> >>
> >> Hope I have answered your question.
> >>
> >> -Vinay
> >>
> >>
> > While I support the move and CP isolation, this is going to (finally)
> > force us to make shaded versions of all artifacts which we publish with
> the
> > intent of them being loaded on the classpath of other applications
> >
>

Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by Vinayakumar B <vi...@apache.org>.
Thanks Steve.

The idea is not to shade all artifacts.
Instead, maintain one artifact (hadoop-thirdparty) which has all such
dependencies (com.google.*, maybe), and add this artifact as a dependency in
the hadoop modules. Use the shaded classes directly in the code of the hadoop
modules instead of shading at the package phase.

HBase, Ozone and Ratis already follow this approach. The artifact
(hadoop-thirdparty) with the shaded dependencies can be maintained in a
separate repo, as suggested by stack on HADOOP-13363, or as a separate module
in the Hadoop repo. If maintained in a separate repo, it needs to be rebuilt
only when there are changes related to the shaded dependencies.


-Vinay

On Sun, 22 Sep 2019, 10:11 pm Steve Loughran, <st...@cloudera.com> wrote:

>
>
> On Sun, Sep 22, 2019 at 3:22 PM Vinayakumar B <vi...@apache.org>
> wrote:
>
>>    Protobuf provides wire compatibility between releases, but does not
>> guarantee source compatibility of the generated sources. There will be a
>> compatibility problem if anyone uses generated protobuf messages outside
>> of the Hadoop modules, which ideally they shouldn't, as generated sources
>> are not public APIs.
>>
>>    There should not be any compatibility problems between releases in terms
>> of communication, provided both sides use the same syntax (proto2) for the
>> proto messages. I have verified this by communication between a protobuf
>> 2.5.0 client and a protobuf 3.7.1 server.
>>
>>    To avoid the transitive dependency classpath problem for downstream
>> projects, which might be using protobuf 2.5.0 classes, the plan is to shade
>> the 3.7.1 classes and their usages in all hadoop modules, and keep the
>> 2.5.0 jar back on the hadoop classpath.
>>
>> Hope I have answered your question.
>>
>> -Vinay
>>
>>
> While I support the move and CP isolation, this is going to (finally)
> force us to make shaded versions of all artifacts which we publish with the
> intent of them being loaded on the classpath of other applications
>
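The wire-compatibility point quoted above rests on both sides keeping the proto2 syntax of the message definitions. A minimal illustrative definition (not an actual Hadoop .proto file; the package and message names are hypothetical) that both protoc 2.5.0 and 3.7.1 accept:

```proto
// Illustrative only: proto2 syntax, compilable by both protoc 2.5.0 and 3.7.1.
// The wire format is the same from either compiler; only the generated Java
// source differs between protobuf-java 2.5.0 and 3.7.1.
syntax = "proto2";

option java_package = "org.example.proto";  // hypothetical package

message GetFileInfoRequest {  // hypothetical message, modeled on HDFS RPC style
  required string src = 1;
}
```

Because the serialized bytes are identical, a client linked against protobuf-java 2.5.0 can talk to a server linked against 3.7.1, which is the scenario Vinay verified.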

Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by Vinayakumar B <vi...@apache.org>.
Thanks Steve.

Idea is not to shade all artifacts.
Instead maintain one artifact ( hadoop-thirdparty) which have all such
dependencies ( com.google.* may be), add  this artifact as dependency in
hadoop modules. Use shaded classes directly in the code of hadoop modules
instead of shading at package phase.

Hbase, ozone and ratis already following this way. The artifact (
hadoop-thirdparty) with shaded dependencies can be maintained in a separate
repo as suggested by stack on HADOOP-13363 or could be maintained as a
separate module in Hadoop repo. If maintained in separate repo, need to
build this only when there are changes related to shaded dependencies.


-Vinay

On Sun, 22 Sep 2019, 10:11 pm Steve Loughran, <st...@cloudera.com> wrote:

>
>
> On Sun, Sep 22, 2019 at 3:22 PM Vinayakumar B <vi...@apache.org>
> wrote:
>
>>    Protobuf provides wire compatibility between releases, but does not
>> guarantee source compatibility of the generated sources. There will be a
>> compatibility problem if anyone uses generated protobuf messages outside
>> of Hadoop modules, which ideally shouldn't happen, as generated sources
>> are not public APIs.
>>
>>    There should not be any compatibility problems between releases in
>> terms of communication, provided both sides use the same syntax (proto2)
>> for the proto messages. I have verified this by communication between a
>> protobuf 2.5.0 client and a protobuf 3.7.1 server.
>>
>>    To avoid the downstream transitive-dependency classpath problem for
>> anyone who might be using protobuf 2.5.0 classes, the plan is to shade
>> the 3.7.1 classes and their usages in all hadoop modules, and keep the
>> 2.5.0 jar in the hadoop classpath.
>>
>> Hope I have answered your question.
>>
>> -Vinay
>>
>>
> While I support the move and CP isolation, this is going to (finally)
> force us to make shaded versions of all artifacts which we publish with the
> intent of them being loaded on the classpath of other applications
>

Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by Steve Loughran <st...@cloudera.com.INVALID>.
On Sun, Sep 22, 2019 at 3:22 PM Vinayakumar B <vi...@apache.org>
wrote:

>    Protobuf provides wire compatibility between releases, but does not
> guarantee source compatibility of the generated sources. There will be a
> compatibility problem if anyone uses generated protobuf messages outside
> of Hadoop modules, which ideally shouldn't happen, as generated sources
> are not public APIs.
>
>    There should not be any compatibility problems between releases in
> terms of communication, provided both sides use the same syntax (proto2)
> for the proto messages. I have verified this by communication between a
> protobuf 2.5.0 client and a protobuf 3.7.1 server.
>
>    To avoid the downstream transitive-dependency classpath problem for
> anyone who might be using protobuf 2.5.0 classes, the plan is to shade
> the 3.7.1 classes and their usages in all hadoop modules, and keep the
> 2.5.0 jar in the hadoop classpath.
>
> Hope I have answered your question.
>
> -Vinay
>
>
While I support the move and CP isolation, this is going to (finally) force
us to make shaded versions of all the artifacts which we publish with the
intent of them being loaded on the classpath of other applications.

Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by Vinayakumar B <vi...@apache.org>.
   Protobuf provides wire compatibility between releases, but does not
guarantee source compatibility of the generated sources. There will be a
compatibility problem if anyone uses generated protobuf messages outside
of Hadoop modules, which ideally shouldn't happen, as generated sources
are not public APIs.

   There should not be any compatibility problems between releases in terms
of communication, provided both sides use the same syntax (proto2) for the
proto messages. I have verified this by communication between a protobuf
2.5.0 client and a protobuf 3.7.1 server.

   To avoid the downstream transitive-dependency classpath problem for
anyone who might be using protobuf 2.5.0 classes, the plan is to shade the
3.7.1 classes and their usages in all hadoop modules, and keep the 2.5.0
jar in the hadoop classpath.
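
The wire compatibility claimed above comes from the protobuf wire format
itself, which has not changed between protobuf 2.x and 3.x. A minimal
sketch (plain Python, no protobuf library) of how a proto2 int32 field is
put on the wire; any conforming runtime, 2.5.0 or 3.7.1, decodes these
same bytes:

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer as a base-128 varint (protobuf wire format)."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_int32_field(field_number: int, value: int) -> bytes:
    """Encode an int32 field: tag = (field_number << 3) | wire_type, then the value."""
    tag = (field_number << 3) | 0  # wire type 0 = varint
    return encode_varint(tag) + encode_varint(value)

# Field #1 set to 150 -- the canonical example from the protobuf encoding docs.
print(encode_int32_field(1, 150).hex())  # 089601
```

Because the byte layout is fixed by the wire format, upgrading the runtime
jar does not change what goes over RPC, which is why a 2.5.0 client can
talk to a 3.7.1 server as long as both sides keep using proto2 message
definitions.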

Hope I have answered your question.

-Vinay

On Sun, 22 Sep 2019, 7:38 pm Vinod Kumar Vavilapalli, <vi...@apache.org>
wrote:

> Quick question, being lazy here, lots of JIRA updates on HADOOP-13363 over
> the years not helping either.
>
> Does anyone know what this upgrade will mean w.r.t compatibility for the
> Hadoop releases themselves? Remember that trunk is still 3.x.
>
> Thanks
> +Vinod
>
> > On Sep 21, 2019, at 9:55 AM, Vinayakumar B <vi...@apache.org>
> wrote:
> >
> > @Wangda Tan <wh...@gmail.com> ,
> > Sorry for the confusion. HADOOP-13363 is umbrella jira to track multiple
> > stages of protobuf upgrade in subtasks. (jar upgrade, Docker update,
> plugin
> > upgrade, shading, etc).
> > Right now, first task of jar upgrade is done. So need to update the
> protoc
> > executable in the build environments.
> >
> > @张铎(Duo Zhang) <pa...@gmail.com> ,
> > Sorry for the inconvenience. Yes, indeed plugin update before jar upgrade
> > was possible. Sorry I missed it.
> >
> > Plugin update needed to be done for whole project, for which precommit
> > jenkins will need more time to complete end-to-end runs.
> > So plugin update is planned in stages in further subtasks. It could be
> done
> > in 2-3 days.
> >
> > -Vinay
> >
> > On Sat, 21 Sep 2019, 5:55 am 张铎(Duo Zhang), <pa...@gmail.com>
> wrote:
> >
> >> I think this one is already in place so we have to upgrade...
> >>
> >> https://issues.apache.org/jira/browse/HADOOP-16557
> >>
> >> Wangda Tan <wh...@gmail.com> 于2019年9月21日周六 上午7:19写道:
> >>
> >>> Hi Vinay,
> >>>
> >>> A bit confused, I saw the HADOOP-13363 is still pending. Do we need to
> >>> upgrade protobuf version to 3.7.1 NOW or once HADOOP-13363 is
> completed?
> >>>
> >>> Thanks,
> >>> Wangda
> >>>
> >>> On Fri, Sep 20, 2019 at 8:11 AM Vinayakumar B <vinayakumarb@apache.org
> >
> >>> wrote:
> >>>
> >>>> Hi All,
> >>>>
> >>>> A very long pending task, protobuf upgrade is happening in
> >> HADOOP-13363.
> >>> As
> >>>> part of that protobuf version is upgraded to 3.7.1.
> >>>>
> >>>> Please update your build environments to have 3.7.1 protobuf version.
> >>>>
> >>>> BUILDING.txt has been updated with the latest instructions.
> >>>>
> >>>> This pre-requisite to update the protoc dependency manually is required
> >> until
> >>>> 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin' to
> >>>> dynamically resolve required protoc exe.
> >>>>
> >>>> Dockerfile is being updated to have latest 3.7.1 as default protoc for
> >>> test
> >>>> environments.
> >>>>
> >>>> Thanks,
> >>>> -Vinay
> >>>>
> >>>
> >>
>
>

Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by Vinod Kumar Vavilapalli <vi...@apache.org>.
Quick question, being lazy here, lots of JIRA updates on HADOOP-13363 over the years not helping either.

Does anyone know what this upgrade will mean w.r.t compatibility for the Hadoop releases themselves? Remember that trunk is still 3.x.

Thanks
+Vinod

> On Sep 21, 2019, at 9:55 AM, Vinayakumar B <vi...@apache.org> wrote:
> 
> @Wangda Tan <wh...@gmail.com> ,
> Sorry for the confusion. HADOOP-13363 is umbrella jira to track multiple
> stages of protobuf upgrade in subtasks. (jar upgrade, Docker update, plugin
> upgrade, shading, etc).
> Right now, first task of jar upgrade is done. So need to update the protoc
> executable in the build environments.
> 
> @张铎(Duo Zhang) <pa...@gmail.com> ,
> Sorry for the inconvenience. Yes, indeed plugin update before jar upgrade
> was possible. Sorry I missed it.
> 
> Plugin update needed to be done for whole project, for which precommit
> jenkins will need more time to complete end-to-end runs.
> So plugin update is planned in stages in further subtasks. It could be done
> in 2-3 days.
> 
> -Vinay
> 
> On Sat, 21 Sep 2019, 5:55 am 张铎(Duo Zhang), <pa...@gmail.com> wrote:
> 
>> I think this one is already in place so we have to upgrade...
>> 
>> https://issues.apache.org/jira/browse/HADOOP-16557
>> 
>> Wangda Tan <wh...@gmail.com> 于2019年9月21日周六 上午7:19写道:
>> 
>>> Hi Vinay,
>>> 
>>> A bit confused, I saw the HADOOP-13363 is still pending. Do we need to
>>> upgrade protobuf version to 3.7.1 NOW or once HADOOP-13363 is completed?
>>> 
>>> Thanks,
>>> Wangda
>>> 
>>> On Fri, Sep 20, 2019 at 8:11 AM Vinayakumar B <vi...@apache.org>
>>> wrote:
>>> 
>>>> Hi All,
>>>> 
>>>> A very long pending task, protobuf upgrade is happening in
>> HADOOP-13363.
>>> As
>>>> part of that protobuf version is upgraded to 3.7.1.
>>>> 
>>>> Please update your build environments to have 3.7.1 protobuf version.
>>>> 
>>>> BUILDING.txt has been updated with the latest instructions.
>>>> 
>>>> This pre-requisite to update the protoc dependency manually is required
>> until
>>>> 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin' to
>>>> dynamically resolve required protoc exe.
>>>> 
>>>> Dockerfile is being updated to have latest 3.7.1 as default protoc for
>>> test
>>>> environments.
>>>> 
>>>> Thanks,
>>>> -Vinay
>>>> 
>>> 
>> 


---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org


Re: [NOTICE] Building trunk needs protoc 3.7.1

Posted by Vinayakumar B <vi...@apache.org>.
@Wangda Tan <wh...@gmail.com> ,
Sorry for the confusion. HADOOP-13363 is an umbrella jira to track the
multiple stages of the protobuf upgrade in subtasks (jar upgrade, Docker
update, plugin upgrade, shading, etc.). Right now, the first task, the jar
upgrade, is done, so the protoc executable needs to be updated in the
build environments.

@张铎(Duo Zhang) <pa...@gmail.com> ,
Sorry for the inconvenience. Yes, indeed a plugin update before the jar
upgrade was possible. Sorry I missed it.

The plugin update needs to be done for the whole project, for which
precommit jenkins will need more time to complete end-to-end runs. So the
plugin update is planned in stages in further subtasks. It could be done
in 2-3 days.

-Vinay

On Sat, 21 Sep 2019, 5:55 am 张铎(Duo Zhang), <pa...@gmail.com> wrote:

> I think this one is already in place so we have to upgrade...
>
> https://issues.apache.org/jira/browse/HADOOP-16557
>
> Wangda Tan <wh...@gmail.com> 于2019年9月21日周六 上午7:19写道:
>
> > Hi Vinay,
> >
> > A bit confused, I saw the HADOOP-13363 is still pending. Do we need to
> > upgrade protobuf version to 3.7.1 NOW or once HADOOP-13363 is completed?
> >
> > Thanks,
> > Wangda
> >
> > On Fri, Sep 20, 2019 at 8:11 AM Vinayakumar B <vi...@apache.org>
> > wrote:
> >
> > > Hi All,
> > >
> > > A very long pending task, protobuf upgrade is happening in
> HADOOP-13363.
> > As
> > > part of that protobuf version is upgraded to 3.7.1.
> > >
> > > Please update your build environments to have 3.7.1 protobuf version.
> > >
> > > BUILDING.txt has been updated with the latest instructions.
> > >
> > > This pre-requisite to update the protoc dependency manually is required
> until
> > > 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin' to
> > > dynamically resolve required protoc exe.
> > >
> > > Dockerfile is being updated to have latest 3.7.1 as default protoc for
> > test
> > > environments.
> > >
> > > Thanks,
> > > -Vinay
> > >
> >
>
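
The announcement quoted above says the manual protoc prerequisite lasts
only until 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin',
which resolves a matching protoc at build time. A minimal sketch of how
that mechanism typically looks in a pom.xml (the coordinates shown are the
commonly used xolstice plugin together with os-maven-plugin for platform
detection; this illustrates the idea rather than Hadoop's exact
configuration):

```xml
<build>
  <extensions>
    <!-- Sets ${os.detected.classifier} so the protoc binary matching the
         build platform can be fetched from Maven Central. -->
    <extension>
      <groupId>kr.motd.maven</groupId>
      <artifactId>os-maven-plugin</artifactId>
      <version>1.6.2</version>
    </extension>
  </extensions>
  <plugins>
    <plugin>
      <groupId>org.xolstice.maven.plugins</groupId>
      <artifactId>protobuf-maven-plugin</artifactId>
      <version>0.6.1</version>
      <configuration>
        <protocArtifact>com.google.protobuf:protoc:3.7.1:exe:${os.detected.classifier}</protocArtifact>
      </configuration>
      <executions>
        <execution>
          <goals><goal>compile</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

With such a setup developers no longer need a locally installed protoc
3.7.1; Maven resolves the executable per platform when sources are
generated.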
