Posted to reviews@spark.apache.org by lipzhu <gi...@git.apache.org> on 2018/10/22 19:03:22 UTC

[GitHub] spark pull request #22796: [SPARK-25800][Build] Remove all instances of nett...

GitHub user lipzhu opened a pull request:

    https://github.com/apache/spark/pull/22796

    [SPARK-25800][Build] Remove all instances of netty 3.x in code base.

    ## What changes were proposed in this pull request?
    
    Remove all the netty 3.x dependency declarations from pom.xml.
    
    ## How was this patch tested?
    
    Manual tests.
    ./dev/make-distribution.sh --name custom_spark --tgz -Phadoop-2.7 -Phive -Pyarn -Phive-thriftserver -Phadoop-provided

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/lipzhu/spark SPARK-25800

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/22796.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #22796
    
---



[GitHub] spark pull request #22796: [SPARK-25800][Build] Remove all instances of nett...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22796#discussion_r227046597
  
    --- Diff: dev/deps/spark-deps-hadoop-2.7 ---
    @@ -148,7 +148,7 @@ metrics-graphite-3.1.5.jar
     metrics-json-3.1.5.jar
     metrics-jvm-3.1.5.jar
     minlog-1.3.0.jar
    -netty-3.9.9.Final.jar
    +netty-3.7.0.Final.jar
    --- End diff --
    
    I see, I think what's happening is that we still have a netty 3.x dependency from Hadoop. I guess we should simply leave that in place.
    
    I think that previously we had tried to manually exclude netty dependencies because it helped SBT resolve the version in the same way as Maven. I don't know that we actually want to take netty 3.x away from ZK and Hadoop if they need it.
    
    At least, this change doesn't actually get rid of netty 3.x from the distribution, but we don't necessarily need to do that. I just wanted to make sure Spark itself has no reference to it and isn't trying to manage it or depend on it.
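    
    For reference, one quick way to see what actually ships is to grep the committed dependency manifest this diff touches; a minimal sketch, assuming nothing beyond the files already in the repo:
    
    # Rough check (illustrative): the manifest lists every jar bundled for the
    # hadoop-2.7 profile, so any remaining netty 3.x entry still shows up here
    # even after the pom.xml declaration is removed.
    $ grep '^netty' dev/deps/spark-deps-hadoop-2.7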


---



[GitHub] spark issue #22796: [SPARK-25800][Build] Remove all instances of netty 3.x i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/22796
  
    Can one of the admins verify this patch?


---



[GitHub] spark issue #22796: [SPARK-25800][Build] Remove all instances of netty 3.x i...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/22796
  
    Yeah, maybe the better question is: what brings in this dependency? I took out the declaration and all exclusions for `io.netty:netty`, checked the dependencies, and it's basically Zookeeper (3.7.0) and Hadoop (3.6.9).
    
    Looks like we managed it up to make sure we pick up a security fix, actually.
    
    OK, so yes, that's a good reason not to let it go back down, @dongjoon-hyun. I think we have to leave this in, as even Hadoop 3.x apparently needs netty 3.x. However, we can try updating to the latest 3.x here, if desired.
    
    Or just close it.


---



[GitHub] spark pull request #22796: [SPARK-25800][Build] Remove all instances of nett...

Posted by lipzhu <gi...@git.apache.org>.
Github user lipzhu closed the pull request at:

    https://github.com/apache/spark/pull/22796


---



[GitHub] spark issue #22796: [SPARK-25800][Build] Remove all instances of netty 3.x i...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on the issue:

    https://github.com/apache/spark/pull/22796
  
    Hi, @lipzhu. Could you close this PR and the SPARK-25800 issue?


---



[GitHub] spark issue #22796: [SPARK-25800][Build] Remove all instances of netty 3.x i...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on the issue:

    https://github.com/apache/spark/pull/22796
  
    Removing the dependency is good, but currently:
    - It's not clear what effective benefit Apache Spark gets from downgrading the dependency in the distribution.
    - Downgrading it in the binary distributions might cause unnecessary side effects (or confusion) for downstream users and apps.
    
    If we can't effectively get rid of the dependency, we had better at least keep the current versions. In that case, there is no potential side effect for us to consider.


---



[GitHub] spark issue #22796: [SPARK-25800][Build] Remove all instances of netty 3.x i...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/22796
  
    Well, I think we want to remove the direct dependency on 3.x, right @dongjoon-hyun?


---



[GitHub] spark issue #22796: [SPARK-25800][Build] Remove all instances of netty 3.x i...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/22796
  
    You can figure out what's going on by looking at the output of `mvn dependency:tree`, potentially with the "-Dverbose" option.
    
    If there is more than one transitive dependency that depends on the old netty, then most probably this declaration exists to enforce that this particular version is used; otherwise, IIRC, Maven's resolution is "nearest first", which may pick an older version.
    
    If there is only a single transitive dependency using the old netty, then the declaration can be removed, even though the resulting downgrade is a little weird.
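    
    For instance, scoping the tree to the old artifact keeps the output readable; a minimal sketch, where the profile list is only illustrative (use whatever profiles you actually build with):
    
    # Hypothetical invocation; -Dincludes filters the tree down to io.netty:netty.
    $ ./build/mvn -Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver \
        dependency:tree -Dverbose -Dincludes=io.netty:netty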


---



[GitHub] spark pull request #22796: [SPARK-25800][Build] Remove all instances of nett...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22796#discussion_r227043052
  
    --- Diff: dev/deps/spark-deps-hadoop-2.7 ---
    @@ -148,7 +148,7 @@ metrics-graphite-3.1.5.jar
     metrics-json-3.1.5.jar
     metrics-jvm-3.1.5.jar
     minlog-1.3.0.jar
    -netty-3.9.9.Final.jar
    +netty-3.7.0.Final.jar
    --- End diff --
    
    Ur, @lipzhu. This doesn't look correct to me. We usually do not **downgrade** a dependency like this.
    
    I guess this is also not the recommendation from @srowen.


---
