Posted to user@spark.apache.org by Prasad <ra...@gmail.com> on 2014/02/28 17:51:43 UTC

Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Hi
I am getting the protobuf error while reading an HDFS file using Spark
0.9.0 -- I am running on Hadoop 2.2.0.

When I look through it, I find that I have both 2.4.1 and 2.5, and some blogs
suggest that there are incompatibility issues between 2.4.1 and 2.5.

hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name protobuf-java*.jar
/home/hduser/.m2/repository/com/google/protobuf/protobuf-java/2.4.1/protobuf-java-2.4.1.jar
/home/hduser/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
/home/hduser/spark-0.9.0-incubating/lib_managed/bundles/protobuf-java-2.5.0.jar
/home/hduser/spark-0.9.0-incubating/lib_managed/jars/protobuf-java-2.4.1-shaded.jar
/home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
/home/hduser/.ivy2/cache/org.spark-project.protobuf/protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
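
A quick way to distil a listing like the one above into the distinct protobuf
versions present (a sketch in POSIX shell; the helper name and regex are
illustrative, not part of any Spark tooling):

```shell
# protobuf_versions: read jar paths on stdin and print the distinct
# protobuf-java versions they carry (shaded builds counted under the same
# base version). Illustrative helper only; assumes GNU/BSD sed -E.
protobuf_versions() {
  sed -E 's/.*protobuf-java-([0-9][0-9.]*[0-9])(-shaded)?\.jar$/\1/' | sort -u
}

# e.g.: find ~/ -name 'protobuf-java*.jar' | protobuf_versions
```

More than one line of output means more than one protobuf line is on disk and
could end up on the same classpath.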


Can someone please let me know if you have faced these issues and how you fixed them.

Thanks
Prasad.
Caused by: java.lang.VerifyError: class org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
        at java.lang.Class.privateGetPublicMethods(Class.java:2651)
        at java.lang.Class.privateGetPublicMethods(Class.java:2661)
        at java.lang.Class.getMethods(Class.java:1467)
        at sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
        at sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
        at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
        at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
        at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:92)
        at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)

Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)










--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Ognen Duzlevski <og...@plainvanillagames.com>.
I run a Hadoop 2.2.0 based HDFS cluster and I use Spark 0.9.0 to read the
files without any problems.
Ognen



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Aureliano Buendia <bu...@gmail.com>.
Using protobuf 2.5 can lead to some major issues with Spark; see

http://mail-archives.apache.org/mod_mbox/spark-user/201401.mbox/%3CCAB89JJUY0SqkKOKCidETGLrZRJ2ZLat3PhBvPJoxxCY9soqaqg@mail.gmail.com%3E

Moving the protobuf 2.5 jar after the Spark jar on the classpath can help with
your error, but then you'll face the

WARN ClusterScheduler: Initial job has not accepted any resources;...

error, which is still an unresolved issue in Spark.

I had to downgrade protobuf in my app to 2.4.1 to get it working on Spark.
This is not ideal, as protobuf 2.5 comes with better performance.
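
For reference, the downgrade amounts to pinning 2.4.1 in the application pom
(a sketch only; adjust to your own build):

```xml
<!-- Sketch: pin protobuf to 2.4.1 in the application's pom.xml so it
     matches the version Spark 0.9's stack expects. -->
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <version>2.4.1</version>
</dependency>
```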



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Arpit Tak <ar...@sigmoidanalytics.com>.
I am also stuck on the same issue, but with Shark (0.9, on Spark 0.9) on
Hadoop 2.2.0.

On the other Hadoop versions, it works perfectly.

Regards,
Arpit Tak



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Aureliano Buendia <bu...@gmail.com>.
Is this resolved in spark 0.9.1?



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by anant <an...@kix.in>.
I've received the same error with Spark built using Maven. It turns out that
mesos-0.13.0 depends on protobuf-2.4.1, which causes the clash at runtime.
The protobuf included by Akka is shaded and doesn't cause any problems.

The solution is to update the mesos dependency to 0.18.0 in Spark's pom.xml.
Rebuilding the JAR with this configuration resolves the issue.
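
A sketch of that edit (the property name is an assumption; check how your
checkout of Spark's pom.xml pins the Mesos version):

```xml
<!-- spark/pom.xml: bump Mesos so its transitive, unshaded protobuf 2.4.1
     no longer reaches the runtime classpath (property name assumed). -->
<properties>
  <mesos.version>0.18.0</mesos.version>
</properties>
```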

-Anant



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p4286.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Prasad <ra...@gmail.com>.
Hi,
Yes, I did:
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly
Further, when I use the spark-shell, I can read the same file and it works
fine.
Thanks
Prasad.




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p2199.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by giive chen <th...@gmail.com>.
Hi Prasad

Sorry I missed your reply.
https://gist.github.com/thegiive/10791823
Here it is.

Wisely Chen



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Prasad <ra...@gmail.com>.
Hi Wisely,
Could you please post your pom.xml here.

Thanks



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p3770.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by giive chen <th...@gmail.com>.
Hi

I am quite a beginner with Spark, and I had a similar issue last week. I don't
know if my issue is the same as yours. I found that my program's jar contained
protobuf; when I removed this dependency from my program's pom.xml and rebuilt
my program, it worked.

Here is how I solved my own issue.

Environment:

Spark 0.9, HDFS (Hadoop 2.3), Scala 2.10. My Spark is the Hadoop 2 (HDP2)
prebuilt version from http://spark.apache.org/downloads.html. I didn't build
Spark myself.

Problem:

I used the word count program from Spark 0.9's examples folder to read an HDFS
file on Hadoop 2.3. The command was "./bin/run-example
org.apache.spark.examples.WordCount".
It showed "Caused by: java.lang.VerifyError". I searched the web a lot but
could not find any workable solution.

How I solved my issue:

I found that Spark 0.9's spark-shell can read the HDFS file without this
problem, but the run-example command shows java.lang.VerifyError.
I think the main reason is that these two commands (spark-shell and
run-example) have different classpaths.

run-example's classpath is:
$SPARK_HOME/examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.0-incubating.jar:$SPARK_HOME/conf:$SPARK_HOME/assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.2.0.jar

spark-shell's classpath is:
$SPARK_HOME/conf:$SPARK_HOME/assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.2.0.jar

The classpath difference is
$SPARK_HOME/examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.0-incubating.jar,
which is built from the example program. When I looked into this jar file, I
found that it contains two copies of protobuf, and I don't know where they
come from. I removed all dependencies from my example pom.xml, leaving only
the single dependency "spark-core".
I rebuilt it and it succeeded.

I don't know if my issue is the same as yours. I hope it helps.
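
For what it's worth, the trimmed pom boils down to a single dependency (a
sketch; coordinates shown for Spark 0.9.0 on Scala 2.10, verify against your
setup):

```xml
<!-- Sketch of the example pom.xml dependency section after the cleanup. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.0-incubating</version>
  </dependency>
</dependencies>
```

If some remaining dependency still drags in com.google.protobuf, a Maven
`<exclusions>` block on that dependency is the usual way to keep it out.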

Wisely Chen




Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Patrick Wendell <pw...@gmail.com>.
Starting with Spark 0.9, the protobuf dependency we use is shaded and
cannot interfere with other protobuf libraries, including those in
Hadoop. Not sure what's going on in this case. Would someone who is
having this problem post exactly how they are building Spark?
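
One way to sanity-check a build for shading (a sketch; the helper name and the
relocated package pattern are assumptions, not official Spark tooling) is to
classify the protobuf entries in a `jar tf` listing of the assembly:

```shell
# classify_protobuf: read jar entries on stdin; flag com.google.protobuf
# classes as "unshaded" and relocated (spark-project) copies as "shaded".
# Illustrative helper only; the relocation package name is an assumption.
classify_protobuf() {
  awk '/com\/google\/protobuf\//      { print "unshaded: " $0 }
       /spark[-_.]?project\/protobuf\// { print "shaded: " $0 }'
}

# e.g.: jar tf "$SPARK_HOME"/assembly/target/scala-2.10/spark-assembly-*.jar | classify_protobuf
```

Any "unshaded" line in a supposedly shaded assembly points at the clash
described in this thread.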

- Patrick


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Gary Malouf <ma...@gmail.com>.
Can anyone verify the claims from Aureliano regarding the Akka dependency
protobuf collision?  Our team has a major need to upgrade to protobuf 2.5.0
up the pipe and Spark seems to be the blocker here.




Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Aureliano Buendia <bu...@gmail.com>.

The problem is that Spark depends on an older version of Akka, which
depends on an older version of protobuf (2.4).

This means people cannot use protobuf 2.5 with Spark.



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Ognen Duzlevski <og...@plainvanillagames.com>.
I am not sure I remember what the context was around this, but I run
0.9.0 with Hadoop 2.2.0 just fine.
Ognen

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by dm...@gmail.com.

Is the solution to exclude the 2.4.* dependency on protobuf, or will this produce more complications?

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Aureliano Buendia <bu...@gmail.com>.
Is there a reason for Spark using the older Akka?



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by 1esha <al...@gmail.com>.
The problem is in akka-remote. It contains files compiled with protobuf
2.4.*; when you run it with 2.5.* on the classpath, it fails like above.

It looks like moving to Akka 2.3 will solve this issue. Check this ticket:
https://www.assembla.com/spaces/akka/tickets/3154-use-protobuf-version-2-5-0#/activity/ticket



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p2217.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Egor Pahomov <pa...@gmail.com>.
Protobuf Java code generated by protoc 2.4 does not compile against the
protobuf 2.5 library -- that's true. What I meant is: you serialized a message
with a class generated by protobuf 2.4.1; you can now read that message with a
class generated by protobuf 2.5.0 from the same .proto.


2014-03-01 0:00 GMT+04:00 Egor Pahomov <pa...@gmail.com>:

> In that same pom
>
>     <profile>
>       <id>yarn</id>
>
>       <properties>
>         <hadoop.major.version>2</hadoop.major.version>
>         <hadoop.version>2.2.0</hadoop.version>
>         <protobuf.version>2.5.0</protobuf.version>
>       </properties>
>
>       <modules>
>         <module>yarn</module>
>       </modules>
>
>     </profile>
>
>
>
> 2014-02-28 23:46 GMT+04:00 Aureliano Buendia <bu...@gmail.com>:
>
>
>>
>>
>> On Fri, Feb 28, 2014 at 7:17 PM, Egor Pahomov <pa...@gmail.com>wrote:
>>
>>> Spark 0.9 uses protobuf 2.5.0
>>>
>>
>> Spark 0.9 uses 2.4.1:
>>
>>
>> https://github.com/apache/incubator-spark/blob/4d880304867b55a4f2138617b30600b7fa013b14/pom.xml#L118
>>
>> Is there another pom for when hadoop 2.2 is used? I don't see another
>> branch for hadooop 2.2.
>>
>>
>>> Hadoop 2.2 uses protobuf 2.5.0
>>> protobuf 2.5.0 can read massages serialized with protobuf 2.4.1
>>>
>>
>> Protobuf java code generated by ptotoc 2.4 does not compile with protobuf
>> library 2.5. This is what the OP's error message is about.
>>
>>
>>> So there is not any reason why you can't read some messages from hadoop
>>> 2.2 with protobuf 2.5.0, probably you somehow have 2.4.1 in your class
>>> path. Of course it's very bad, that you have both 2.4.1 and 2.5.0 in your
>>> classpath. Use excludes or whatever to get rid of 2.4.1.
>>>
>>> Personally, I spend 3 days to move my project to protobuf 2.5.0 from
>>> 2.4.1. But it has to be done for the whole your project.
>>>
>>> 2014-02-28 21:49 GMT+04:00 Aureliano Buendia <bu...@gmail.com>:
>>>
>>> Doesn't hadoop 2.2 also depend on protobuf 2.4?
>>>>
>>>>
>>>> On Fri, Feb 28, 2014 at 5:45 PM, Ognen Duzlevski <
>>>> ognen@plainvanillagames.com> wrote:
>>>>
>>>>> A stupid question, by the way, you did compile Spark with Hadoop 2.2.0
>>>>> support?
>>>>>
>>>>> Ognen
>>>>>
>>>>> On 2/28/14, 10:51 AM, Prasad wrote:
>>>>>
>>>>>> Hi
>>>>>> I am getting the protobuf error below while reading an HDFS file using
>>>>>> spark
>>>>>> 0.9.0 -- I am running on hadoop 2.2.0.
>>>>>>
>>>>>> When I look through, I find that I have both 2.4.1 and 2.5, and some blogs
>>>>>> suggest that there are some incompatibility issues between 2.4.1 and 2.5
>>>>>>
>>>>>> hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name
>>>>>> protobuf-java*.jar
>>>>>> /home/hduser/.m2/repository/com/google/protobuf/protobuf-
>>>>>> java/2.4.1/protobuf-java-2.4.1.jar
>>>>>> /home/hduser/.m2/repository/org/spark-project/protobuf/
>>>>>> protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
>>>>>> /home/hduser/spark-0.9.0-incubating/lib_managed/
>>>>>> bundles/protobuf-java-2.5.0.jar
>>>>>> /home/hduser/spark-0.9.0-incubating/lib_managed/jars/
>>>>>> protobuf-java-2.4.1-shaded.jar
>>>>>> /home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/
>>>>>> bundles/protobuf-java-2.5.0.jar
>>>>>> /home/hduser/.ivy2/cache/org.spark-project.protobuf/
>>>>>> protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
>>>>>>
>>>>>>
>>>>>> Can someone please let me know if you faced these issues and how you
>>>>>> fixed it.
>>>>>>
>>>>>> Thanks
>>>>>> Prasad.
>>>>>> Caused by: java.lang.VerifyError: class
>>>>>> org.apache.hadoop.security.proto.SecurityProtos$
>>>>>> GetDelegationTokenRequestProto
>>>>>> overrides final method
>>>>>> getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>>>>>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>          at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>>>>>>          at
>>>>>> java.security.SecureClassLoader.defineClass(
>>>>>> SecureClassLoader.java:142)
>>>>>>          at java.net.URLClassLoader.defineClass(URLClassLoader.
>>>>>> java:449)
>>>>>>          at java.net.URLClassLoader.access$100(URLClassLoader.
>>>>>> java:71)
>>>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>>>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>>>>          at java.security.AccessController.doPrivileged(Native
>>>>>> Method)
>>>>>>          at java.net.URLClassLoader.findClass(URLClassLoader.java:
>>>>>> 354)
>>>>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>>>>          at java.lang.Class.getDeclaredMethods0(Native Method)
>>>>>>          at java.lang.Class.privateGetDeclaredMethods(
>>>>>> Class.java:2531)
>>>>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2651)
>>>>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2661)
>>>>>>          at java.lang.Class.getMethods(Class.java:1467)
>>>>>>          at
>>>>>> sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
>>>>>>          at
>>>>>> sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
>>>>>>          at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
>>>>>>          at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
>>>>>>          at
>>>>>> org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(
>>>>>> ProtobufRpcEngine.java:92)
>>>>>>          at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
>>>>>>
>>>>>>
>>>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>>>>>> Method)
>>>>>>          at
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(
>>>>>> NativeMethodAccessorImpl.java:57)
>>>>>>          at
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(
>>>>>> DelegatingMethodAccessorImpl.java:43)
>>>>>>          at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> View this message in context: http://apache-spark-user-list.
>>>>>> 1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-
>>>>>> 0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
>>>>>> Sent from the Apache Spark User List mailing list archive at
>>>>>> Nabble.com.
>>>>>>
>>>>>
>>>>> --
>>>>> Some people, when confronted with a problem, think "I know, I'll use
>>>>> regular expressions." Now they have two problems.
>>>>> -- Jamie Zawinski
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>>
>>>
>>>
>>> *Sincerely yours, Egor Pakhomov, Scala Developer, Yandex*
>>>
>>
>>
>
>
> --
>
>
>
> *Sincerely yours, Egor Pakhomov, Scala Developer, Yandex*
>



-- 



*Sincerely yours, Egor Pakhomov, Scala Developer, Yandex*

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Egor Pahomov <pa...@gmail.com>.
In that same pom

    <profile>
      <id>yarn</id>
      <properties>
        <hadoop.major.version>2</hadoop.major.version>
        <hadoop.version>2.2.0</hadoop.version>
        <protobuf.version>2.5.0</protobuf.version>
      </properties>
      <modules>
        <module>yarn</module>
      </modules>

    </profile>
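
To actually pick up that profile you also have to build Spark against
Hadoop 2.2. If memory serves from the Spark 0.9 build docs (double-check
the flags against your checkout), either of these should produce an
assembly linked against protobuf 2.5.0:

    # sbt build with YARN / Hadoop 2.2 support
    SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly

    # or the Maven equivalent, selecting the yarn profile shown above
    mvn -Pyarn -Dhadoop.version=2.2.0 -DskipTests clean package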



2014-02-28 23:46 GMT+04:00 Aureliano Buendia <bu...@gmail.com>:

>
>
>
> On Fri, Feb 28, 2014 at 7:17 PM, Egor Pahomov <pa...@gmail.com>wrote:
>
>> Spark 0.9 uses protobuf 2.5.0
>>
>
> Spark 0.9 uses 2.4.1:
>
>
> https://github.com/apache/incubator-spark/blob/4d880304867b55a4f2138617b30600b7fa013b14/pom.xml#L118
>
> Is there another pom for when hadoop 2.2 is used? I don't see another
> branch for hadoop 2.2.
>
>
>> Hadoop 2.2 uses protobuf 2.5.0
>> protobuf 2.5.0 can read messages serialized with protobuf 2.4.1
>>
>
> Protobuf Java code generated by protoc 2.4 does not compile with protobuf
> library 2.5. This is what the OP's error message is about.
>
>
>> So there is no reason why you can't read messages from hadoop
>> 2.2 with protobuf 2.5.0; you probably have 2.4.1 somewhere in your class
>> path. Of course it's very bad that you have both 2.4.1 and 2.5.0 in your
>> classpath. Use exclusions or whatever to get rid of 2.4.1.
>>
>> Personally, I spent 3 days moving my project from protobuf 2.4.1 to
>> 2.5.0. But it has to be done for your whole project.
>>
>> 2014-02-28 21:49 GMT+04:00 Aureliano Buendia <bu...@gmail.com>:
>>
>> Doesn't hadoop 2.2 also depend on protobuf 2.4?
>>>
>>>
>>> On Fri, Feb 28, 2014 at 5:45 PM, Ognen Duzlevski <
>>> ognen@plainvanillagames.com> wrote:
>>>
>>>> A stupid question, by the way, you did compile Spark with Hadoop 2.2.0
>>>> support?
>>>>
>>>> Ognen
>>>>
>>>> On 2/28/14, 10:51 AM, Prasad wrote:
>>>>
>>>>> Hi
>>>>> I am getting the protobuf error below while reading an HDFS file using spark
>>>>> 0.9.0 -- I am running on hadoop 2.2.0.
>>>>>
>>>>> When I look through, I find that I have both 2.4.1 and 2.5, and some blogs
>>>>> suggest that there are some incompatibility issues between 2.4.1 and 2.5
>>>>>
>>>>> hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name
>>>>> protobuf-java*.jar
>>>>> /home/hduser/.m2/repository/com/google/protobuf/protobuf-
>>>>> java/2.4.1/protobuf-java-2.4.1.jar
>>>>> /home/hduser/.m2/repository/org/spark-project/protobuf/
>>>>> protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
>>>>> /home/hduser/spark-0.9.0-incubating/lib_managed/
>>>>> bundles/protobuf-java-2.5.0.jar
>>>>> /home/hduser/spark-0.9.0-incubating/lib_managed/jars/
>>>>> protobuf-java-2.4.1-shaded.jar
>>>>> /home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/
>>>>> bundles/protobuf-java-2.5.0.jar
>>>>> /home/hduser/.ivy2/cache/org.spark-project.protobuf/
>>>>> protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
>>>>>
>>>>>
>>>>> Can someone please let me know if you faced these issues and how you
>>>>> fixed it.
>>>>>
>>>>> Thanks
>>>>> Prasad.
>>>>> Caused by: java.lang.VerifyError: class
>>>>> org.apache.hadoop.security.proto.SecurityProtos$
>>>>> GetDelegationTokenRequestProto
>>>>> overrides final method
>>>>> getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>>>>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>          at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>>>>>          at
>>>>> java.security.SecureClassLoader.defineClass(
>>>>> SecureClassLoader.java:142)
>>>>>          at java.net.URLClassLoader.defineClass(URLClassLoader.
>>>>> java:449)
>>>>>          at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>>>          at java.security.AccessController.doPrivileged(Native Method)
>>>>>          at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>>>          at java.lang.Class.getDeclaredMethods0(Native Method)
>>>>>          at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
>>>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2651)
>>>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2661)
>>>>>          at java.lang.Class.getMethods(Class.java:1467)
>>>>>          at
>>>>> sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
>>>>>          at
>>>>> sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
>>>>>          at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
>>>>>          at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
>>>>>          at
>>>>> org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(
>>>>> ProtobufRpcEngine.java:92)
>>>>>          at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
>>>>>
>>>>>
>>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>>>>> Method)
>>>>>          at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(
>>>>> NativeMethodAccessorImpl.java:57)
>>>>>          at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(
>>>>> DelegatingMethodAccessorImpl.java:43)
>>>>>          at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> View this message in context: http://apache-spark-user-list.
>>>>> 1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-
>>>>> 0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
>>>>> Sent from the Apache Spark User List mailing list archive at
>>>>> Nabble.com.
>>>>>
>>>>
>>>> --
>>>> Some people, when confronted with a problem, think "I know, I'll use
>>>> regular expressions." Now they have two problems.
>>>> -- Jamie Zawinski
>>>>
>>>>
>>>
>>
>>
>> --
>>
>>
>>
>> *Sincerely yours, Egor Pakhomov, Scala Developer, Yandex*
>>
>
>


-- 



*Sincerely yours, Egor Pakhomov, Scala Developer, Yandex*

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Aureliano Buendia <bu...@gmail.com>.
On Fri, Feb 28, 2014 at 7:17 PM, Egor Pahomov <pa...@gmail.com>wrote:

> Spark 0.9 uses protobuf 2.5.0
>

Spark 0.9 uses 2.4.1:

https://github.com/apache/incubator-spark/blob/4d880304867b55a4f2138617b30600b7fa013b14/pom.xml#L118

Is there another pom for when hadoop 2.2 is used? I don't see another
branch for hadoop 2.2.


> Hadoop 2.2 uses protobuf 2.5.0
> protobuf 2.5.0 can read messages serialized with protobuf 2.4.1
>

Protobuf Java code generated by protoc 2.4 does not compile with protobuf
library 2.5. This is what the OP's error message is about.
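
The VerifyError in the original post is the runtime face of that mismatch:
Hadoop 2.2's protoc-2.5-generated classes override getUnknownFields(),
which is declared final in the protobuf 2.4.1 runtime's GeneratedMessage,
so the JVM rejects the class at load time. A simplified sketch of the
clash (illustrative only, not the actual library source):

    // protobuf-java 2.4.1 runtime class that gets loaded from the classpath:
    public abstract class GeneratedMessage {
        // final in 2.4.x, so subclasses may not override it
        public final UnknownFieldSet getUnknownFields() { /* ... */ }
    }

    // Hadoop 2.2 class generated by protoc 2.5, which expects the 2.5.0
    // runtime (where getUnknownFields() is no longer final):
    public final class GetDelegationTokenRequestProto extends GeneratedMessage {
        @Override
        public UnknownFieldSet getUnknownFields() { /* ... */ }
        // overrides a final method => java.lang.VerifyError
    }

Putting the 2.5.0 runtime first (and only that one) on the classpath makes
the override legal again.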


> So there is no reason why you can't read messages from hadoop
> 2.2 with protobuf 2.5.0; you probably have 2.4.1 somewhere in your class
> path. Of course it's very bad that you have both 2.4.1 and 2.5.0 in your
> classpath. Use exclusions or whatever to get rid of 2.4.1.
>
> Personally, I spent 3 days moving my project from protobuf 2.4.1 to
> 2.5.0. But it has to be done for your whole project.
>
> 2014-02-28 21:49 GMT+04:00 Aureliano Buendia <bu...@gmail.com>:
>
> Doesn't hadoop 2.2 also depend on protobuf 2.4?
>>
>>
>> On Fri, Feb 28, 2014 at 5:45 PM, Ognen Duzlevski <
>> ognen@plainvanillagames.com> wrote:
>>
>>> A stupid question, by the way, you did compile Spark with Hadoop 2.2.0
>>> support?
>>>
>>> Ognen
>>>
>>> On 2/28/14, 10:51 AM, Prasad wrote:
>>>
>>>> Hi
>>>> I am getting the protobuf error below while reading an HDFS file using spark
>>>> 0.9.0 -- I am running on hadoop 2.2.0.
>>>>
>>>> When I look through, I find that I have both 2.4.1 and 2.5, and some blogs
>>>> suggest that there are some incompatibility issues between 2.4.1 and 2.5
>>>>
>>>> hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name
>>>> protobuf-java*.jar
>>>> /home/hduser/.m2/repository/com/google/protobuf/protobuf-
>>>> java/2.4.1/protobuf-java-2.4.1.jar
>>>> /home/hduser/.m2/repository/org/spark-project/protobuf/
>>>> protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
>>>> /home/hduser/spark-0.9.0-incubating/lib_managed/
>>>> bundles/protobuf-java-2.5.0.jar
>>>> /home/hduser/spark-0.9.0-incubating/lib_managed/jars/
>>>> protobuf-java-2.4.1-shaded.jar
>>>> /home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/
>>>> bundles/protobuf-java-2.5.0.jar
>>>> /home/hduser/.ivy2/cache/org.spark-project.protobuf/
>>>> protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
>>>>
>>>>
>>>> Can someone please let me know if you faced these issues and how you
>>>> fixed it.
>>>>
>>>> Thanks
>>>> Prasad.
>>>> Caused by: java.lang.VerifyError: class
>>>> org.apache.hadoop.security.proto.SecurityProtos$
>>>> GetDelegationTokenRequestProto
>>>> overrides final method
>>>> getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>>>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>>>          at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>>>>          at
>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>>          at java.net.URLClassLoader.defineClass(URLClassLoader.
>>>> java:449)
>>>>          at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>>          at java.security.AccessController.doPrivileged(Native Method)
>>>>          at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>>          at java.lang.Class.getDeclaredMethods0(Native Method)
>>>>          at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
>>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2651)
>>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2661)
>>>>          at java.lang.Class.getMethods(Class.java:1467)
>>>>          at
>>>> sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
>>>>          at
>>>> sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
>>>>          at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
>>>>          at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
>>>>          at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(
>>>> ProtobufRpcEngine.java:92)
>>>>          at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
>>>>
>>>>
>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>          at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(
>>>> NativeMethodAccessorImpl.java:57)
>>>>          at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(
>>>> DelegatingMethodAccessorImpl.java:43)
>>>>          at java.lang.reflect.Method.invoke(Method.java:606)
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> View this message in context: http://apache-spark-user-list.
>>>> 1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-
>>>> 0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>
>>> --
>>> Some people, when confronted with a problem, think "I know, I'll use
>>> regular expressions." Now they have two problems.
>>> -- Jamie Zawinski
>>>
>>>
>>
>
>
> --
>
>
>
> *Sincerely yours, Egor Pakhomov, Scala Developer, Yandex*
>

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Egor Pahomov <pa...@gmail.com>.
Spark 0.9 uses protobuf 2.5.0
Hadoop 2.2 uses protobuf 2.5.0
protobuf 2.5.0 can read messages serialized with protobuf 2.4.1
So there is no reason why you can't read messages from hadoop 2.2
with protobuf 2.5.0; you probably have 2.4.1 somewhere in your class path. Of
course it's very bad that you have both 2.4.1 and 2.5.0 in your classpath.
Use exclusions or whatever to get rid of 2.4.1.
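
For example, with Maven you would exclude the transitive 2.4.1 artifact
from whichever dependency drags it in, then pin 2.5.0 explicitly. Run
mvn dependency:tree to find the real culprit; the dependency below is
just an illustration:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>0.9.0-incubating</version>
      <exclusions>
        <exclusion>
          <groupId>com.google.protobuf</groupId>
          <artifactId>protobuf-java</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.5.0</version>
    </dependency>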

Personally, I spent 3 days moving my project from protobuf 2.4.1 to 2.5.0.
But it has to be done for your whole project.

2014-02-28 21:49 GMT+04:00 Aureliano Buendia <bu...@gmail.com>:

> Doesn't hadoop 2.2 also depend on protobuf 2.4?
>
>
> On Fri, Feb 28, 2014 at 5:45 PM, Ognen Duzlevski <
> ognen@plainvanillagames.com> wrote:
>
>> A stupid question, by the way, you did compile Spark with Hadoop 2.2.0
>> support?
>>
>> Ognen
>>
>> On 2/28/14, 10:51 AM, Prasad wrote:
>>
>>> Hi
>>> I am getting the protobuf error below while reading an HDFS file using spark
>>> 0.9.0 -- I am running on hadoop 2.2.0.
>>>
>>> When I look through, I find that I have both 2.4.1 and 2.5, and some blogs
>>> suggest that there are some incompatibility issues between 2.4.1 and 2.5
>>>
>>> hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name
>>> protobuf-java*.jar
>>> /home/hduser/.m2/repository/com/google/protobuf/protobuf-
>>> java/2.4.1/protobuf-java-2.4.1.jar
>>> /home/hduser/.m2/repository/org/spark-project/protobuf/
>>> protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
>>> /home/hduser/spark-0.9.0-incubating/lib_managed/
>>> bundles/protobuf-java-2.5.0.jar
>>> /home/hduser/spark-0.9.0-incubating/lib_managed/jars/
>>> protobuf-java-2.4.1-shaded.jar
>>> /home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/
>>> bundles/protobuf-java-2.5.0.jar
>>> /home/hduser/.ivy2/cache/org.spark-project.protobuf/
>>> protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
>>>
>>>
>>> Can someone please let me know if you faced these issues and how you fixed
>>> it.
>>>
>>> Thanks
>>> Prasad.
>>> Caused by: java.lang.VerifyError: class
>>> org.apache.hadoop.security.proto.SecurityProtos$
>>> GetDelegationTokenRequestProto
>>> overrides final method
>>> getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>>          at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>>>          at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>          at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>>>          at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>          at java.security.AccessController.doPrivileged(Native Method)
>>>          at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>          at java.lang.Class.getDeclaredMethods0(Native Method)
>>>          at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2651)
>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2661)
>>>          at java.lang.Class.getMethods(Class.java:1467)
>>>          at
>>> sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
>>>          at
>>> sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
>>>          at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
>>>          at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
>>>          at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(
>>> ProtobufRpcEngine.java:92)
>>>          at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
>>>
>>>
>>> Caused by: java.lang.reflect.InvocationTargetException
>>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>          at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(
>>> NativeMethodAccessorImpl.java:57)
>>>          at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(
>>> DelegatingMethodAccessorImpl.java:43)
>>>          at java.lang.reflect.Method.invoke(Method.java:606)
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> --
>>> View this message in context: http://apache-spark-user-list.
>>> 1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-
>>> 0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>
>> --
>> Some people, when confronted with a problem, think "I know, I'll use
>> regular expressions." Now they have two problems.
>> -- Jamie Zawinski
>>
>>
>


-- 



*Sincerely yours, Egor Pakhomov, Scala Developer, Yandex*

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Aureliano Buendia <bu...@gmail.com>.
Doesn't hadoop 2.2 also depend on protobuf 2.4?


On Fri, Feb 28, 2014 at 5:45 PM, Ognen Duzlevski <
ognen@plainvanillagames.com> wrote:

> A stupid question, by the way, you did compile Spark with Hadoop 2.2.0
> support?
>
> Ognen
>
> On 2/28/14, 10:51 AM, Prasad wrote:
>
>> Hi
>> I am getting the protobuf error below while reading an HDFS file using spark
>> 0.9.0 -- I am running on hadoop 2.2.0.
>>
>> When I look through, I find that I have both 2.4.1 and 2.5, and some blogs
>> suggest that there are some incompatibility issues between 2.4.1 and 2.5
>>
>> hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name
>> protobuf-java*.jar
>> /home/hduser/.m2/repository/com/google/protobuf/protobuf-
>> java/2.4.1/protobuf-java-2.4.1.jar
>> /home/hduser/.m2/repository/org/spark-project/protobuf/
>> protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
>> /home/hduser/spark-0.9.0-incubating/lib_managed/
>> bundles/protobuf-java-2.5.0.jar
>> /home/hduser/spark-0.9.0-incubating/lib_managed/jars/
>> protobuf-java-2.4.1-shaded.jar
>> /home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/
>> bundles/protobuf-java-2.5.0.jar
>> /home/hduser/.ivy2/cache/org.spark-project.protobuf/
>> protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
>>
>>
>> Can someone please let me know if you faced these issues and how you fixed
>> it.
>>
>> Thanks
>> Prasad.
>> Caused by: java.lang.VerifyError: class
>> org.apache.hadoop.security.proto.SecurityProtos$
>> GetDelegationTokenRequestProto
>> overrides final method
>> getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>          at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>>          at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>          at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>>          at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>          at java.security.AccessController.doPrivileged(Native Method)
>>          at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>          at java.lang.Class.getDeclaredMethods0(Native Method)
>>          at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
>>          at java.lang.Class.privateGetPublicMethods(Class.java:2651)
>>          at java.lang.Class.privateGetPublicMethods(Class.java:2661)
>>          at java.lang.Class.getMethods(Class.java:1467)
>>          at
>> sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
>>          at
>> sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
>>          at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
>>          at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
>>          at
>> org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(
>> ProtobufRpcEngine.java:92)
>>          at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
>>
>>
>> Caused by: java.lang.reflect.InvocationTargetException
>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>          at
>> sun.reflect.NativeMethodAccessorImpl.invoke(
>> NativeMethodAccessorImpl.java:57)
>>          at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(
>> DelegatingMethodAccessorImpl.java:43)
>>          at java.lang.reflect.Method.invoke(Method.java:606)
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>> --
>> View this message in context: http://apache-spark-user-list.
>> 1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-
>> 0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>
> --
> Some people, when confronted with a problem, think "I know, I'll use
> regular expressions." Now they have two problems.
> -- Jamie Zawinski
>
>

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Posted by Ognen Duzlevski <og...@plainvanillagames.com>.
A stupid question, by the way, you did compile Spark with Hadoop 2.2.0 
support?
Ognen

On 2/28/14, 10:51 AM, Prasad wrote:
> Hi
> I am getting the protobuf error below while reading an HDFS file using spark
> 0.9.0 -- I am running on hadoop 2.2.0.
>
> When I look through, I find that I have both 2.4.1 and 2.5, and some blogs
> suggest that there are some incompatibility issues between 2.4.1 and 2.5
>
> hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name protobuf-java*.jar
> /home/hduser/.m2/repository/com/google/protobuf/protobuf-java/2.4.1/protobuf-java-2.4.1.jar
> /home/hduser/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
> /home/hduser/spark-0.9.0-incubating/lib_managed/bundles/protobuf-java-2.5.0.jar
> /home/hduser/spark-0.9.0-incubating/lib_managed/jars/protobuf-java-2.4.1-shaded.jar
> /home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
> /home/hduser/.ivy2/cache/org.spark-project.protobuf/protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
>
>
> Can someone please let me know if you faced these issues and how you fixed it.
>
> Thanks
> Prasad.
> Caused by: java.lang.VerifyError: class
> org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto
> overrides final method
> getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>          at java.lang.ClassLoader.defineClass1(Native Method)
>          at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>          at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>          at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>          at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>          at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>          at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>          at java.security.AccessController.doPrivileged(Native Method)
>          at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>          at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>          at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>          at java.lang.Class.getDeclaredMethods0(Native Method)
>          at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
>          at java.lang.Class.privateGetPublicMethods(Class.java:2651)
>          at java.lang.Class.privateGetPublicMethods(Class.java:2661)
>          at java.lang.Class.getMethods(Class.java:1467)
>          at
> sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
>          at
> sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
>          at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
>          at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
>          at
> org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:92)
>          at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
>
>
> Caused by: java.lang.reflect.InvocationTargetException
>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>          at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>          at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>          at java.lang.reflect.Method.invoke(Method.java:606)
>
>
>
>
>
>
>
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

-- 
Some people, when confronted with a problem, think "I know, I'll use regular expressions." Now they have two problems.
-- Jamie Zawinski