Posted to issues@spark.apache.org by "kuromatsu nobuyuki (JIRA)" <ji...@apache.org> on 2014/10/30 01:10:33 UTC

[jira] [Commented] (SPARK-4132) Spark uses incompatible HDFS API

    [ https://issues.apache.org/jira/browse/SPARK-4132?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14189351#comment-14189351 ] 

kuromatsu nobuyuki commented on SPARK-4132:
-------------------------------------------

Owen, thank you for pointing that out.
It looks much the same as my issue.

> Spark uses incompatible HDFS API
> --------------------------------
>
>                 Key: SPARK-4132
>                 URL: https://issues.apache.org/jira/browse/SPARK-4132
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>         Environment: Spark1.1.0 on Hadoop1.2.1
> CentOS 6.3 64bit
>            Reporter: kuromatsu nobuyuki
>            Priority: Minor
>
> When I enable event logging and set it to output to HDFS, initialization fails with 'java.lang.ClassNotFoundException' (see trace below).
> I found that the error is caused by an API incompatibility in org.apache.hadoop.fs.permission.FsPermission between Hadoop 1.0.4 and Hadoop 1.1.0 and later: the inner class org.apache.hadoop.fs.permission.FsPermission$2 exists in 1.0.4 but not in my 1.2.1 environment.
> I think the Spark jar pre-built for Hadoop 1.x should instead be built against the stable Hadoop release (Hadoop 1.2.1).
> 2014-10-24 10:43:22,893 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: 
> readAndProcess threw exception java.lang.RuntimeException: 
> readObject can't find class org.apache.hadoop.fs.permission.FsPermission$2. Count of bytes read: 0
> java.lang.RuntimeException: readObject can't find class org.apache.hadoop.fs.permission.FsPermission$2
>         at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:233)
>         at org.apache.hadoop.ipc.RPC$Invocation.readFields(RPC.java:106)
>         at org.apache.hadoop.ipc.Server$Connection.processData(Server.java:1347)
>         at org.apache.hadoop.ipc.Server$Connection.processOneRpc(Server.java:1326)
>         at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1226)
>         at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:577)
>         at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:384)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:701)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.permission.FsPermission$2
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:270)
>         at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:810)
>         at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:231)
>         ... 9 more
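
For anyone trying to reproduce this against their own cluster: the failure only shows up once event logging is pointed at HDFS, so the two event-log properties are enough. A minimal sketch of the relevant spark-defaults.conf entries follows (the HDFS host, port, and log directory are placeholders, not values taken from this report):

    # conf/spark-defaults.conf -- enable event logging and write the logs to HDFS
    spark.eventLog.enabled   true
    spark.eventLog.dir       hdfs://namenode:9000/user/spark/eventlog

On the suggested fix in the quoted description, a source build of Spark 1.1.0 can be pointed at a specific Hadoop 1.x release through the hadoop.version Maven property instead of relying on the pre-built jar. A hedged example for Hadoop 1.2.1, per the Spark 1.1.0 build documentation (adjust the version and the skip-tests flag to your environment):

    # build the Spark assembly against Hadoop 1.2.1 rather than the default 1.0.4
    mvn -Dhadoop.version=1.2.1 -DskipTests clean package

Built this way, the FsPermission classes Spark sends over Hadoop IPC come from the same Hadoop line as the NameNode, which should avoid the FsPermission$2 ClassNotFoundException shown in the trace above.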



