Posted to user@spark.apache.org by Paweł Szulc <pa...@gmail.com> on 2016/10/10 10:56:13 UTC

spark using two different versions of netty?

Hi,

Quick question: why is Spark using two different versions of Netty?

   - io.netty:netty-all:4.0.29.Final:jar
   - io.netty:netty:3.8.0.Final:jar

-- 
Regards,
Paul Szulc

twitter: @rabbitonweb
blog: www.rabbitonweb.com

Re: spark using two different versions of netty?

Posted by Paweł Szulc <pa...@gmail.com>.
Yeah, I should have been more precise: those are two direct dependencies.

On Mon, Oct 10, 2016 at 1:15 PM, Sean Owen <so...@cloudera.com> wrote:

> Usually this sort of thing happens because the two versions are in
> different namespaces in different major versions and both are needed. That
> is true of Netty: http://netty.io/wiki/new-and-noteworthy-in-4.0.html
> However, I see that Spark declares a direct dependency on both, when it
> does not use 3.x directly (and should not). The exception is in the Flume
> module, but that could be handled more narrowly. I will look into fixing
> this if applicable.
>
> On Mon, Oct 10, 2016 at 11:56 AM Paweł Szulc <pa...@gmail.com> wrote:
>
>> Hi,
>>
>> Quick question: why is Spark using two different versions of Netty?
>>
>>    - io.netty:netty-all:4.0.29.Final:jar
>>    - io.netty:netty:3.8.0.Final:jar
>>
>>
>> --
>> Regards,
>> Paul Szulc
>>
>> twitter: @rabbitonweb
>> blog: www.rabbitonweb.com
>>
>


-- 
Regards,
Paul Szulc

twitter: @rabbitonweb
blog: www.rabbitonweb.com

Re: spark using two different versions of netty?

Posted by Sean Owen <so...@cloudera.com>.
Usually this sort of thing happens because the two versions are in
different namespaces in different major versions and both are needed. That
is true of Netty: http://netty.io/wiki/new-and-noteworthy-in-4.0.html
However, I see that Spark declares a direct dependency on both, when it
does not use 3.x directly (and should not). The exception is in the Flume
module, but that could be handled more narrowly. I will look into fixing
this if applicable.
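
[Editor's note] Sean's point about namespaces is worth spelling out: Netty 3.x classes live under the org.jboss.netty package, while Netty 4.x moved to io.netty, so both jars can coexist on the classpath without clashing (the 3.x line ships as artifact "netty", the 4.x line as "netty-all"). For an application that depends on Spark but does not use the Flume module, a build-side exclusion is one way to keep Netty 3 out. The sketch below is illustrative only; the artifactId and version are assumptions you should adjust to match your own pom:

```xml
<!-- Illustrative sketch: exclude the Netty 3.x artifact pulled in
     transitively. Adjust groupId/artifactId/version to match the
     Spark dependency your project actually declares. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.2</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty</artifactId> <!-- Netty 3.x; 4.x is netty-all -->
    </exclusion>
  </exclusions>
</dependency>
```

You can check what actually lands on the classpath with `mvn dependency:tree -Dincludes=io.netty`.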

On Mon, Oct 10, 2016 at 11:56 AM Paweł Szulc <pa...@gmail.com> wrote:

> Hi,
>
> Quick question: why is Spark using two different versions of Netty?
>
>    - io.netty:netty-all:4.0.29.Final:jar
>    - io.netty:netty:3.8.0.Final:jar
>
>
> --
> Regards,
> Paul Szulc
>
> twitter: @rabbitonweb
> blog: www.rabbitonweb.com
>