Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/09/11 10:59:34 UTC
[jira] [Resolved] (SPARK-683) Spark 0.7 with Hadoop 1.0 does not work with current AMI's HDFS installation
[ https://issues.apache.org/jira/browse/SPARK-683?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-683.
-----------------------------
Resolution: Fixed
I think this is likely long since obsolete or fixed: Spark, Hadoop, and the AMI's Hadoop versions have all moved forward, and this issue has not been reported in recent memory.
> Spark 0.7 with Hadoop 1.0 does not work with current AMI's HDFS installation
> ----------------------------------------------------------------------------
>
> Key: SPARK-683
> URL: https://issues.apache.org/jira/browse/SPARK-683
> Project: Spark
> Issue Type: Bug
> Components: EC2
> Affects Versions: 0.7.0
> Reporter: Tathagata Das
>
> A simple saveAsObjectFile() leads to the following error.
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: java.lang.NoSuchMethodException: org.apache.hadoop.hdfs.protocol.ClientProtocol.create(java.lang.String, org.apache.hadoop.fs.permission.FsPermission, java.lang.String, boolean, boolean, short, long)
> at java.lang.Class.getMethod(Class.java:1622)
> at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:416)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
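The stack trace above points at Hadoop's reflection-based RPC dispatch: RPC$Server.call looks up the requested method on the server's local protocol interface by name and exact parameter types via Class.getMethod, so a client built against a different version of ClientProtocol (a changed create() signature) fails with NoSuchMethodException. A minimal sketch of that failure mode, using an illustrative interface rather than the real Hadoop ClientProtocol:

```java
import java.lang.reflect.Method;

public class RpcMismatchSketch {
    // "Server-side" protocol: in this hypothetical version, create()
    // takes three parameters.
    interface ServerProtocol {
        void create(String src, boolean overwrite, short replication);
    }

    // Mirrors the failing step in the trace: an exact-signature lookup
    // via Class.getMethod, as in Hadoop's RPC$Server.call.
    static Method dispatch(Class<?> protocol, String name, Class<?>... paramTypes)
            throws NoSuchMethodException {
        return protocol.getMethod(name, paramTypes);
    }

    public static void main(String[] args) {
        // Client built against the same protocol version: lookup succeeds.
        try {
            dispatch(ServerProtocol.class, "create",
                     String.class, boolean.class, short.class);
            System.out.println("matched");
        } catch (NoSuchMethodException e) {
            System.out.println("mismatch");
        }
        // Client built against a protocol version that added a long
        // parameter: the exact-signature lookup fails, producing the
        // same NoSuchMethodException seen in this report.
        try {
            dispatch(ServerProtocol.class, "create",
                     String.class, boolean.class, short.class, long.class);
            System.out.println("matched");
        } catch (NoSuchMethodException e) {
            System.out.println("mismatch");
        }
    }
}
```

This is why mismatched Hadoop client and server versions (here, Spark's bundled Hadoop 1.0 client against the AMI's HDFS) fail at the first RPC rather than at connect time.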
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org