Posted to issues@spark.apache.org by "PandaMonkey (JIRA)" <ji...@apache.org> on 2018/02/24 11:44:01 UTC

[jira] [Updated] (SPARK-23509) Upgrade commons-net from 2.2 to 3.1

     [ https://issues.apache.org/jira/browse/SPARK-23509?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

PandaMonkey updated SPARK-23509:
--------------------------------
    Attachment: spark.txt

> Upgrade commons-net from 2.2 to 3.1
> -----------------------------------
>
>                 Key: SPARK-23509
>                 URL: https://issues.apache.org/jira/browse/SPARK-23509
>             Project: Spark
>          Issue Type: Dependency upgrade
>          Components: Spark Core
>    Affects Versions: 2.4.0
>            Reporter: PandaMonkey
>            Priority: Major
>             Fix For: 2.4.0
>
>         Attachments: spark.txt
>
>
> Hi, after analyzing spark-master\core\pom.xml, we found that Spark-core depends on org.apache.hadoop:hadoop-client:2.6.5, which transitively introduces commons-net:3.1. At the same time, Spark-core directly depends on an older version, commons-net:2.2. Looking further into the source code, these two versions of commons-net differ in many features. This dependency conflict brings a high risk of "NoClassDefFoundError" or "NoSuchMethodError" issues at runtime. Upgrading commons-net from 2.2 to 3.1 may be a good choice. Hope this report can help you. Thanks!
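> The upgrade could look like the following change to the commons-net entry in core/pom.xml (a sketch only; the exact surrounding entries and whether the version is managed in the parent pom are assumptions):
> {code:xml}
> <dependency>
>   <groupId>commons-net</groupId>
>   <artifactId>commons-net</artifactId>
>   <version>3.1</version>
> </dependency>
> {code}
> Aligning the direct dependency with the version pulled in by hadoop-client means Maven resolves a single commons-net version, so classes and methods present at compile time are the same ones found at runtime.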
>  
> Regards,
> Panda



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org