Posted to issues@spark.apache.org by "cocoatomo (JIRA)" <ji...@apache.org> on 2014/10/04 20:32:34 UTC

[jira] [Created] (SPARK-3794) Building spark core fails with specific hadoop version

cocoatomo created SPARK-3794:
--------------------------------

             Summary: Building spark core fails with specific hadoop version
                 Key: SPARK-3794
                 URL: https://issues.apache.org/jira/browse/SPARK-3794
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.2.0
         Environment: Mac OS X 10.9.5
            Reporter: cocoatomo
             Fix For: 1.2.0


At commit cf1d32e3e1071829b152d4b597bf0a0d7a5629a2, building Spark core results in a compilation error when certain Hadoop versions are specified.

To reproduce this issue, execute the following commands with <hadoop.version> set to 1.1.0, 1.1.1, 1.1.2, 1.2.0, 1.2.1, or 2.2.0:

{noformat}
$ cd ./core
$ mvn -Dhadoop.version=<hadoop.version> -DskipTests clean compile
...
[ERROR] /Users/tomohiko/MyRepos/Scala/spark/core/src/main/scala/org/apache/spark/util/Utils.scala:720: value listFilesAndDirs is not a member of object org.apache.commons.io.FileUtils
[ERROR]       val files = FileUtils.listFilesAndDirs(dir, TrueFileFilter.TRUE, TrueFileFilter.TRUE)
[ERROR]                             ^
{noformat}

The compilation fails because it resolves commons-io version 2.1, while the FileUtils#listFilesAndDirs method was only added in commons-io version 2.2.

FileUtils#listFilesAndDirs → http://commons.apache.org/proper/commons-io/apidocs/org/apache/commons/io/FileUtils.html#listFilesAndDirs%28java.io.File,%20org.apache.commons.io.filefilter.IOFileFilter,%20org.apache.commons.io.filefilter.IOFileFilter%29
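
For reference, the same listing can be done without commons-io 2.2. Below is a minimal sketch, assuming the intent of the call in Utils.scala is to collect dir itself plus every nested file and directory (which is what listFilesAndDirs with two TrueFileFilter arguments returns); listFilesAndDirsCompat is a hypothetical name, not existing Spark code:

{noformat}
import java.io.File

// Hypothetical commons-io 2.1 compatible replacement (sketch, not Spark code):
// returns dir itself plus all nested files and directories, mirroring
// FileUtils.listFilesAndDirs(dir, TrueFileFilter.TRUE, TrueFileFilter.TRUE).
def listFilesAndDirsCompat(dir: File): Seq[File] = {
  // File#listFiles returns null for non-directories or on I/O error
  val children = Option(dir.listFiles).map(_.toSeq).getOrElse(Seq.empty[File])
  dir +: children.flatMap { f =>
    if (f.isDirectory) listFilesAndDirsCompat(f) else Seq(f)
  }
}
{noformat}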

Because hadoop-client in those problematic versions depends on commons-io 2.1, not 2.4, we should assume that only commons-io 2.1 is available.
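
To check which commons-io version is actually resolved for a given hadoop.version, Maven's dependency:tree goal can be used with an includes filter; for example (1.2.1 is just one of the affected Hadoop versions):

{noformat}
$ cd ./core
$ mvn -Dhadoop.version=1.2.1 dependency:tree -Dincludes=commons-io:commons-io
{noformat}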


