Posted to user@spark.apache.org by Prasad <ra...@gmail.com> on 2014/02/20 10:12:07 UTC

Unable to read HDFS file -- SimpleApp.java

After building Spark, I tried to run SimpleApp.java from the overview
tutorial, modified to read an HDFS file, and I get the error below during
execution:

Caused by: java.lang.VerifyError: class
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$SetOwnerRequestProto
overrides final method
getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;

[ERROR] Failed to execute goal
org.codehaus.mojo:exec-maven-plugin:1.2.1:java (default-cli) on project
simple-project: An exception occured while executing the Java class. null:
InvocationTargetException: class
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$AddBlockRequestProto
overrides final method
getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet; -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
goal org.codehaus.mojo:exec-maven-plugin:1.2.1:java (default-cli) on project
simple-project: An exception occured while executing the Java class. null at
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)

My environment is
Hadoop ver 2.2.0/Ubuntu 12.0.4 Spark 0.9.0 ...Using Scala version 2.10.3
(OpenJDK Server VM, Java 1.7.0_51)

I built the target specifying the correct Hadoop version.

Any help would be appreciated.
Thanks



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-read-HDFS-file-SimpleApp-java-tp1813.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Unable to read HDFS file -- SimpleApp.java

Posted by Prasad <ra...@gmail.com>.
Check this thread out:
http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p2807.html
-- you have to remove the conflicting Akka and protobuf versions.
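
For anyone hitting the same thing, a Maven exclusion along these lines is
one way to drop the conflicting copy. This is only a sketch -- the exact
coordinates that drag in the old protobuf may differ in your build, so run
"mvn dependency:tree -Dincludes=com.google.protobuf" first to see where it
actually comes from:

    <!-- Hypothetical sketch: exclude the protobuf pulled in transitively
         so that a single protobuf 2.5.0 wins on the classpath. The
         dependency shown here is an assumption; exclude whichever
         artifact dependency:tree reports as carrying the old version. -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.2.0</version>
      <exclusions>
        <exclusion>
          <groupId>com.google.protobuf</groupId>
          <artifactId>protobuf-java</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.5.0</version>
    </dependency>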

Thanks
Prasad.





Re: Unable to read HDFS file -- SimpleApp.java

Posted by Prasad <ra...@gmail.com>.
Thanks, I just tried it, but it did not help; I am still getting the same
error. For what it's worth, when I go through the shell, I am able to
access the file.






Re: Unable to read HDFS file -- SimpleApp.java

Posted by vinay Bajaj <vb...@gmail.com>.
I was also unable to read from HDFS, though I was getting a different
error. I don't know whether this will help you, but it worked for me: you
have to set the Hadoop config for reading from HDFS.

        // Requires: org.apache.hadoop.conf.Configuration,
        //           org.apache.spark.api.java.JavaSparkContext,
        //           org.apache.spark.api.java.JavaRDD
        JavaSparkContext sc = new JavaSparkContext(context);
        // Replace url:port with your namenode host and port
        String path = "hdfs://url:port/sparkTestingData/smallData.txt";

        // Explicitly register the filesystem implementations so Hadoop
        // can resolve the hdfs:// and file:// schemes even when the
        // jars' service metadata gets shadowed in an assembly jar.
        Configuration hadoopConfig = sc.hadoopConfiguration();
        hadoopConfig.set("fs.hdfs.impl",
                org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
        hadoopConfig.set("fs.file.impl",
                org.apache.hadoop.fs.LocalFileSystem.class.getName());

        JavaRDD<String> lines = sc.textFile(path);


