Posted to dev@spark.apache.org by Jacek Laskowski <ja...@japila.pl> on 2019/09/03 11:59:15 UTC

Why two netty libs?

Hi,

Just noticed that Spark 2.4.x uses two netty deps of different versions.
Why?

jars/netty-all-4.1.17.Final.jar
jars/netty-3.9.9.Final.jar

Shouldn't one be excluded or perhaps shaded?
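
For the "shaded" option, here is a sketch of what that could look like (a hypothetical maven-shade-plugin configuration; the relocated package name is made up for illustration and is not what Spark's build actually does):

```xml
<!-- Hypothetical sketch of shading: maven-shade-plugin can relocate one
     Netty copy under a private package so it cannot clash with the other.
     The shadedPattern below is illustrative only. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.jboss.netty</pattern>
            <shadedPattern>org.spark_project.jboss.netty</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```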

Pozdrawiam,
Jacek Laskowski
----
https://about.me/JacekLaskowski
The Internals of Spark SQL https://bit.ly/spark-sql-internals
The Internals of Spark Structured Streaming https://bit.ly/spark-structured-streaming
The Internals of Apache Kafka https://bit.ly/apache-kafka-internals
Follow me at https://twitter.com/jaceklaskowski

Re: Why two netty libs?

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

Thanks much for the answers. Learning Spark every day!

Pozdrawiam,
Jacek Laskowski



On Wed, Sep 4, 2019 at 3:15 PM Sean Owen <sr...@gmail.com> wrote:

> Yes that's right. I don't think Spark's usage of ZK needs any ZK
> server, so it's safe to exclude in Spark (at least, so far so good!)

Re: Why two netty libs?

Posted by Sean Owen <sr...@gmail.com>.
Yes, that's right. I don't think Spark's usage of ZooKeeper needs any ZooKeeper
server, so it's safe to exclude in Spark (at least, so far so good!)

On Wed, Sep 4, 2019 at 8:06 AM Steve Loughran
<st...@cloudera.com.invalid> wrote:
>
> Zookeeper client is/was netty 3, AFAIK, so if you want to use it for anything, it ends up on the CP
---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Why two netty libs?

Posted by Steve Loughran <st...@cloudera.com.INVALID>.
The ZooKeeper client is/was on Netty 3, AFAIK, so if you want to use ZooKeeper
for anything, Netty 3 ends up on the classpath.
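
If you want to see or cut that transitive path in a downstream build, here is a hypothetical Maven fragment. The Netty 3 coordinates are assumed from the jar name (Netty 3.x is published as io.netty:netty) and the ZooKeeper version is illustrative; verify both against your own `mvn dependency:tree` output first:

```xml
<!-- Hypothetical downstream fragment: if the ZooKeeper client is what
     drags Netty 3 in, the exclusion can be applied at that dependency.
     Coordinates assumed; check mvn dependency:tree before relying on it. -->
<dependency>
  <groupId>org.apache.zookeeper</groupId>
  <artifactId>zookeeper</artifactId>
  <version>3.4.14</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```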

On Tue, Sep 3, 2019 at 5:18 PM Shixiong(Ryan) Zhu <sh...@databricks.com>
wrote:

> Yep, historical reasons. And Netty 4 is under another namespace, so we can
> use Netty 3 and Netty 4 in the same JVM.

Re: Why two netty libs?

Posted by "Shixiong(Ryan) Zhu" <sh...@databricks.com>.
Yep, historical reasons. And Netty 4 is under another namespace, so we can
use Netty 3 and Netty 4 in the same JVM.
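
The two major versions live under different root packages (org.jboss.netty for 3.x, io.netty for 4.x), so their classes are unrelated types to the JVM. A small sketch that probes the classpath for each; the class names are just well-known examples from each version:

```java
// Sketch: Netty 3 and Netty 4 use disjoint root packages, so both jars
// can sit on one classpath without colliding. This probe reports which
// versions are visible to the current class loader.
public class NettyProbe {
    public static boolean onClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Well-known classes from each major version.
        String netty3 = "org.jboss.netty.channel.Channel"; // Netty 3.x root package
        String netty4 = "io.netty.channel.Channel";        // Netty 4.x root package
        System.out.println("Netty 3 visible: " + onClasspath(netty3));
        System.out.println("Netty 4 visible: " + onClasspath(netty4));
    }
}
```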

On Tue, Sep 3, 2019 at 6:15 AM Sean Owen <sr...@gmail.com> wrote:

> It was for historical reasons; some other transitive dependencies needed
> it.
> I actually was just able to exclude Netty 3 last week from master.
> Spark uses Netty 4.

Best Regards,
Ryan

Re: Why two netty libs?

Posted by Sean Owen <sr...@gmail.com>.
It was for historical reasons; some other transitive dependencies needed it.
I was actually able to exclude Netty 3 from master just last week.
Spark uses Netty 4.
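
For a downstream application on 2.4.x that wants a similar effect without waiting for that change, a hypothetical pom fragment (the Netty 3 coordinates are assumed from the jar name; verify with your build's dependency report, and only do this if nothing on your classpath still needs org.jboss.netty classes at runtime):

```xml
<!-- Hypothetical downstream fragment: drop the bundled Netty 3 that
     comes in transitively via spark-core. Unsafe if any library on the
     classpath still loads org.jboss.netty classes at runtime. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.4.4</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```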

On Tue, Sep 3, 2019 at 6:59 AM Jacek Laskowski <ja...@japila.pl> wrote:
>
> Hi,
>
> Just noticed that Spark 2.4.x uses two netty deps of different versions. Why?
>
> jars/netty-all-4.1.17.Final.jar
> jars/netty-3.9.9.Final.jar
>
> Shouldn't one be excluded or perhaps shaded?
