Posted to dev@spark.apache.org by Sean Owen <so...@cloudera.com> on 2017/09/05 11:39:51 UTC

Putting Kafka 0.8 behind an (opt-in) profile

On the road to Scala 2.12, we'll need to make Kafka 0.8 support optional in
the build, because it is not available for Scala 2.12.

https://github.com/apache/spark/pull/19134  adds that profile. I mention it
because this means that Kafka 0.8 becomes "opt-in" and has to be explicitly
enabled, and that may have implications for downstream builds.
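
(To make the "downstream builds" point concrete: application builds that
consume the published connector, rather than building Spark itself, should
presumably be unaffected, since the profile only controls whether the module
is built as part of Spark. A sketch of that dependency for reference; the
version shown is illustrative only.)

    <!-- Hypothetical downstream application dependency on the 0.8 connector.
         Coordinates as published today; the version is illustrative. -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>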

Yes, we can add <activeByDefault>true</activeByDefault>. However, it only
takes effect when no other profiles are activated, which makes it more
deceptive than useful IMHO. (We don't use it elsewhere in the build.)
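
For illustration, the default-activation variant would look roughly like this
(a sketch, not the exact POM in the PR; the module paths are assumed):

    <profile>
      <id>kafka-0-8</id>
      <activation>
        <!-- Maven deactivates this as soon as any other profile in the POM is
             activated (e.g. -Pyarn), which is the deceptive part. -->
        <activeByDefault>true</activeByDefault>
      </activation>
      <modules>
        <!-- Assumed module layout for the 0.8 connector and its assembly -->
        <module>external/kafka-0-8</module>
        <module>external/kafka-0-8-assembly</module>
      </modules>
    </profile>

Without the activation block, the modules are simply opt-in via -Pkafka-0-8.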

Reviewers may want to check my work especially as regards the Python test
support and SBT build.


Another related question is: when is 0.8 support deprecated, removed? It
seems sudden to remove it in 2.3.0. Maybe deprecation is in order. The
driver is that Kafka 0.11 and 1.0 will possibly require yet another variant
of streaming support (not sure yet), and 3 versions is too many.
Deprecating now opens more options sooner.

Re: Putting Kafka 0.8 behind an (opt-in) profile

Posted by Felix Cheung <fe...@hotmail.com>.
+1

________________________________
From: Cody Koeninger <co...@koeninger.org>
Sent: Tuesday, September 5, 2017 8:12:07 AM
To: Sean Owen
Cc: dev
Subject: Re: Putting Kafka 0.8 behind an (opt-in) profile

+1 to going ahead and giving a deprecation warning now

On Tue, Sep 5, 2017 at 6:39 AM, Sean Owen <so...@cloudera.com> wrote:
> On the road to Scala 2.12, we'll need to make Kafka 0.8 support optional in
> the build, because it is not available for Scala 2.12.
>
> https://github.com/apache/spark/pull/19134  adds that profile. I mention it
> because this means that Kafka 0.8 becomes "opt-in" and has to be explicitly
> enabled, and that may have implications for downstream builds.
>
> Yes, we can add <activeByDefault>true</activeByDefault>. It however only has
> effect when no other profiles are set, which makes it more deceptive than
> useful IMHO. (We don't use it otherwise.)
>
> Reviewers may want to check my work especially as regards the Python test
> support and SBT build.
>
>
> Another related question is: when is 0.8 support deprecated, removed? It
> seems sudden to remove it in 2.3.0. Maybe deprecation is in order. The
> driver is that Kafka 0.11 and 1.0 will possibly require yet another variant
> of streaming support (not sure yet), and 3 versions is too many. Deprecating
> now opens more options sooner.



Re: Putting Kafka 0.8 behind an (opt-in) profile

Posted by Sean Owen <so...@cloudera.com>.
Pull request is ready to go: https://github.com/apache/spark/pull/19134

I flag it one more time because it means Kafka 0.8 is deprecated in 2.3.0
and because it will require -Pkafka-0-8 to build in the support now.
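
For downstream builds of Spark that want the 0.8 modules without passing the
flag on every invocation, pinning the profile in settings.xml should also
work (this is just standard Maven activation, not something the PR adds):

    <!-- e.g. in ~/.m2/settings.xml: keep the kafka-0-8 profile active -->
    <settings>
      <activeProfiles>
        <activeProfile>kafka-0-8</activeProfile>
      </activeProfiles>
    </settings>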

Pardon, I want to be sure: does this mean PySpark effectively has no
non-deprecated Kafka support now?

On Thu, Sep 7, 2017 at 10:32 AM Sean Owen <so...@cloudera.com> wrote:

> For those following along, see discussions at
> https://github.com/apache/spark/pull/19134
>
> It's now also clear that we'd need to remove Kafka 0.8 examples if Kafka
> 0.8 becomes optional. I think that's all reasonable but the change is
> growing beyond just putting it behind a profile.
>
> On Wed, Sep 6, 2017 at 3:00 PM Cody Koeninger <co...@koeninger.org> wrote:
>
>> I kind of doubt the kafka 0.10 integration is going to change much at
>> all before the upgrade to 0.11
>>
>> On Wed, Sep 6, 2017 at 8:57 AM, Sean Owen <so...@cloudera.com> wrote:
>> > Thanks, I can do that. We're then in the funny position of having one
>> > deprecated Kafka API, and one experimental one.
>> >
>> > Is the Kafka 0.10 integration as stable as it is going to be, and worth
>> > marking as such for 2.3.0?
>> >
>> >
>> > On Tue, Sep 5, 2017 at 4:12 PM Cody Koeninger <co...@koeninger.org>
>> wrote:
>> >>
>> >> +1 to going ahead and giving a deprecation warning now
>> >>
>> >> On Tue, Sep 5, 2017 at 6:39 AM, Sean Owen <so...@cloudera.com> wrote:
>> >> > On the road to Scala 2.12, we'll need to make Kafka 0.8 support
>> optional
>> >> > in
>> >> > the build, because it is not available for Scala 2.12.
>> >> >
>> >> > https://github.com/apache/spark/pull/19134  adds that profile. I
>> mention
>> >> > it
>> >> > because this means that Kafka 0.8 becomes "opt-in" and has to be
>> >> > explicitly
>> >> > enabled, and that may have implications for downstream builds.
>> >> >
>> >> > Yes, we can add <activeByDefault>true</activeByDefault>. It however
>> only
>> >> > has
>> >> > effect when no other profiles are set, which makes it more deceptive
>> >> > than
>> >> > useful IMHO. (We don't use it otherwise.)
>> >> >
>> >> > Reviewers may want to check my work especially as regards the Python
>> >> > test
>> >> > support and SBT build.
>> >> >
>> >> >
>> >> > Another related question is: when is 0.8 support deprecated,
>> removed? It
>> >> > seems sudden to remove it in 2.3.0. Maybe deprecation is in order.
>> The
>> >> > driver is that Kafka 0.11 and 1.0 will possibly require yet another
>> >> > variant
>> >> > of streaming support (not sure yet), and 3 versions is too many.
>> >> > Deprecating
>> >> > now opens more options sooner.
>>
>

Re: Putting Kafka 0.8 behind an (opt-in) profile

Posted by Sean Owen <so...@cloudera.com>.
For those following along, see discussions at
https://github.com/apache/spark/pull/19134

It's now also clear that we'd need to remove Kafka 0.8 examples if Kafka
0.8 becomes optional. I think that's all reasonable but the change is
growing beyond just putting it behind a profile.

On Wed, Sep 6, 2017 at 3:00 PM Cody Koeninger <co...@koeninger.org> wrote:

> I kind of doubt the kafka 0.10 integration is going to change much at
> all before the upgrade to 0.11
>
> On Wed, Sep 6, 2017 at 8:57 AM, Sean Owen <so...@cloudera.com> wrote:
> > Thanks, I can do that. We're then in the funny position of having one
> > deprecated Kafka API, and one experimental one.
> >
> > Is the Kafka 0.10 integration as stable as it is going to be, and worth
> > marking as such for 2.3.0?
> >
> >
> > On Tue, Sep 5, 2017 at 4:12 PM Cody Koeninger <co...@koeninger.org>
> wrote:
> >>
> >> +1 to going ahead and giving a deprecation warning now
> >>
> >> On Tue, Sep 5, 2017 at 6:39 AM, Sean Owen <so...@cloudera.com> wrote:
> >> > On the road to Scala 2.12, we'll need to make Kafka 0.8 support
> optional
> >> > in
> >> > the build, because it is not available for Scala 2.12.
> >> >
> >> > https://github.com/apache/spark/pull/19134  adds that profile. I
> mention
> >> > it
> >> > because this means that Kafka 0.8 becomes "opt-in" and has to be
> >> > explicitly
> >> > enabled, and that may have implications for downstream builds.
> >> >
> >> > Yes, we can add <activeByDefault>true</activeByDefault>. It however
> only
> >> > has
> >> > effect when no other profiles are set, which makes it more deceptive
> >> > than
> >> > useful IMHO. (We don't use it otherwise.)
> >> >
> >> > Reviewers may want to check my work especially as regards the Python
> >> > test
> >> > support and SBT build.
> >> >
> >> >
> >> > Another related question is: when is 0.8 support deprecated, removed?
> It
> >> > seems sudden to remove it in 2.3.0. Maybe deprecation is in order. The
> >> > driver is that Kafka 0.11 and 1.0 will possibly require yet another
> >> > variant
> >> > of streaming support (not sure yet), and 3 versions is too many.
> >> > Deprecating
> >> > now opens more options sooner.
>

Re: Putting Kafka 0.8 behind an (opt-in) profile

Posted by Cody Koeninger <co...@koeninger.org>.
I kind of doubt the kafka 0.10 integration is going to change much at
all before the upgrade to 0.11

On Wed, Sep 6, 2017 at 8:57 AM, Sean Owen <so...@cloudera.com> wrote:
> Thanks, I can do that. We're then in the funny position of having one
> deprecated Kafka API, and one experimental one.
>
> Is the Kafka 0.10 integration as stable as it is going to be, and worth
> marking as such for 2.3.0?
>
>
> On Tue, Sep 5, 2017 at 4:12 PM Cody Koeninger <co...@koeninger.org> wrote:
>>
>> +1 to going ahead and giving a deprecation warning now
>>
>> On Tue, Sep 5, 2017 at 6:39 AM, Sean Owen <so...@cloudera.com> wrote:
>> > On the road to Scala 2.12, we'll need to make Kafka 0.8 support optional
>> > in
>> > the build, because it is not available for Scala 2.12.
>> >
>> > https://github.com/apache/spark/pull/19134  adds that profile. I mention
>> > it
>> > because this means that Kafka 0.8 becomes "opt-in" and has to be
>> > explicitly
>> > enabled, and that may have implications for downstream builds.
>> >
>> > Yes, we can add <activeByDefault>true</activeByDefault>. It however only
>> > has
>> > effect when no other profiles are set, which makes it more deceptive
>> > than
>> > useful IMHO. (We don't use it otherwise.)
>> >
>> > Reviewers may want to check my work especially as regards the Python
>> > test
>> > support and SBT build.
>> >
>> >
>> > Another related question is: when is 0.8 support deprecated, removed? It
>> > seems sudden to remove it in 2.3.0. Maybe deprecation is in order. The
>> > driver is that Kafka 0.11 and 1.0 will possibly require yet another
>> > variant
>> > of streaming support (not sure yet), and 3 versions is too many.
>> > Deprecating
>> > now opens more options sooner.



Re: Putting Kafka 0.8 behind an (opt-in) profile

Posted by Sean Owen <so...@cloudera.com>.
Thanks, I can do that. We're then in the funny position of having one
deprecated Kafka API, and one experimental one.

Is the Kafka 0.10 integration as stable as it is going to be, and worth
marking as such for 2.3.0?

On Tue, Sep 5, 2017 at 4:12 PM Cody Koeninger <co...@koeninger.org> wrote:

> +1 to going ahead and giving a deprecation warning now
>
> On Tue, Sep 5, 2017 at 6:39 AM, Sean Owen <so...@cloudera.com> wrote:
> > On the road to Scala 2.12, we'll need to make Kafka 0.8 support optional
> in
> > the build, because it is not available for Scala 2.12.
> >
> > https://github.com/apache/spark/pull/19134  adds that profile. I
> mention it
> > because this means that Kafka 0.8 becomes "opt-in" and has to be
> explicitly
> > enabled, and that may have implications for downstream builds.
> >
> > Yes, we can add <activeByDefault>true</activeByDefault>. It however only
> has
> > effect when no other profiles are set, which makes it more deceptive than
> > useful IMHO. (We don't use it otherwise.)
> >
> > Reviewers may want to check my work especially as regards the Python test
> > support and SBT build.
> >
> >
> > Another related question is: when is 0.8 support deprecated, removed? It
> > seems sudden to remove it in 2.3.0. Maybe deprecation is in order. The
> > driver is that Kafka 0.11 and 1.0 will possibly require yet another
> variant
> > of streaming support (not sure yet), and 3 versions is too many.
> Deprecating
> > now opens more options sooner.
>

Re: Putting Kafka 0.8 behind an (opt-in) profile

Posted by Cody Koeninger <co...@koeninger.org>.
+1 to going ahead and giving a deprecation warning now

On Tue, Sep 5, 2017 at 6:39 AM, Sean Owen <so...@cloudera.com> wrote:
> On the road to Scala 2.12, we'll need to make Kafka 0.8 support optional in
> the build, because it is not available for Scala 2.12.
>
> https://github.com/apache/spark/pull/19134  adds that profile. I mention it
> because this means that Kafka 0.8 becomes "opt-in" and has to be explicitly
> enabled, and that may have implications for downstream builds.
>
> Yes, we can add <activeByDefault>true</activeByDefault>. It however only has
> effect when no other profiles are set, which makes it more deceptive than
> useful IMHO. (We don't use it otherwise.)
>
> Reviewers may want to check my work especially as regards the Python test
> support and SBT build.
>
>
> Another related question is: when is 0.8 support deprecated, removed? It
> seems sudden to remove it in 2.3.0. Maybe deprecation is in order. The
> driver is that Kafka 0.11 and 1.0 will possibly require yet another variant
> of streaming support (not sure yet), and 3 versions is too many. Deprecating
> now opens more options sooner.
