Posted to issues@spark.apache.org by "Shixiong Zhu (JIRA)" <ji...@apache.org> on 2017/06/19 19:00:01 UTC

[jira] [Commented] (SPARK-21143) Fail to fetch blocks >1MB in size in presence of conflicting Netty version

    [ https://issues.apache.org/jira/browse/SPARK-21143?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16054593#comment-16054593 ] 

Shixiong Zhu commented on SPARK-21143:
--------------------------------------

The reason you cannot use 4.0.42.Final is that you are using 4.1.x APIs?

> Fail to fetch blocks >1MB in size in presence of conflicting Netty version
> --------------------------------------------------------------------------
>
>                 Key: SPARK-21143
>                 URL: https://issues.apache.org/jira/browse/SPARK-21143
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.1
>            Reporter: Ryan Williams
>            Priority: Minor
>
> One of my Spark libraries inherited a transitive dependency on Netty 4.1.6.Final (vs. Spark's 4.0.42.Final), and I observed a strange failure I wanted to document: fetches of blocks larger than 1MB (pre-compression, as far as I can tell) seem to trigger a code path that results in {{AbstractMethodError}}s and ultimately stage failures.
> I put a minimal repro in [this github repo|https://github.com/ryan-williams/spark-bugs/tree/netty]: {{collect}} on a 1-partition RDD with 1032 {{Array\[Byte\]}}s of size 1000 works, but at 1033 {{Array}}s it dies in a confusing way.
> Not sure what fixing/mitigating this in Spark would look like, other than defensively shading and renaming Netty.
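For reference, the repro described above boils down to a job of roughly this shape. This is only a sketch of what the linked github repo describes; the object name and constants here are illustrative:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of the reported repro: a single-partition RDD of byte arrays whose
// collected result crosses the ~1MB threshold described above.
object NettyConflictRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("netty-conflict-repro"))

    // Per the report, 1032 arrays of 1000 bytes collect fine; 1033 arrays push
    // the fetched block over ~1MB and the job fails with AbstractMethodError
    // when a conflicting Netty 4.1.x is on the classpath.
    val numArrays = 1033
    val data = Seq.fill(numArrays)(Array.fill[Byte](1000)(0.toByte))

    val collected = sc.parallelize(data, numSlices = 1).collect()
    println(s"collected ${collected.length} arrays")
    sc.stop()
  }
}
{code}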
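The "shading and renaming" mitigation mentioned in the description would, on the application side, amount to relocating the conflicting Netty packages at assembly time. A minimal sketch assuming an sbt-assembly build (Maven's shade plugin offers an equivalent relocation feature); the {{myapp.shaded}} prefix is illustrative:

{code:scala}
// build.sbt fragment (assumes the sbt-assembly plugin is enabled).
// Renames the application's Netty 4.1.x classes so they no longer shadow
// the Netty 4.0.42.Final classes that Spark itself depends on.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("io.netty.**" -> "myapp.shaded.io.netty.@1").inAll
)
{code}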



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org