Posted to user@spark.apache.org by Vipul Pandey <vi...@gmail.com> on 2014/04/01 21:23:06 UTC
Re: Using ProtoBuf 2.5 for messages with Spark Streaming
SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
That's all I do.
On Apr 1, 2014, at 11:41 AM, Patrick Wendell <pw...@gmail.com> wrote:
> Vipul - could you show exactly what flags/commands you are using when you build Spark to produce this assembly?
>
>
> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey <vi...@gmail.com> wrote:
>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be getting pulled in unless you are directly using Akka yourself. Are you?
>
> No, I'm not. Although I see that protobuf libraries are directly pulled into the 0.9.0 assembly jar, I do see the shaded version as well.
> e.g. below for Message.class
>
> -bash-4.1$ jar -ftv ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar | grep protobuf | grep /Message.class
> 478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
> 508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
>
>
>> Does your project have other dependencies that might be indirectly pulling in protobuf 2.4.1? It would be helpful if you could list all of your dependencies including the exact Spark version and other libraries.
>
> I did have another one, which I moved to the end of the classpath - I even ran partial code without that dependency, but it still failed whenever I used the jar with the ScalaBuff dependency.
> Spark version is 0.9.0
>
>
> ~Vipul
>
> On Mar 31, 2014, at 4:51 PM, Patrick Wendell <pw...@gmail.com> wrote:
>
>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be getting pulled in unless you are directly using Akka yourself. Are you?
>>
>> Does your project have other dependencies that might be indirectly pulling in protobuf 2.4.1? It would be helpful if you could list all of your dependencies including the exact Spark version and other libraries.
>>
>> - Patrick
>>
>>
>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey <vi...@gmail.com> wrote:
>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same issue. Any word on this one?
>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep <ka...@gmail.com> wrote:
>>
>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
>> > a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar deployed
>> > on each of the Spark worker nodes.
>> > The message is compiled using 2.5, but at runtime it is being
>> > de-serialized by 2.4.1, as I'm getting the following exception:
>> >
>> > java.lang.VerifyError (java.lang.VerifyError: class
>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>> > java.lang.ClassLoader.defineClass1(Native Method)
>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> >
>> > Any suggestions on how I could still use ProtoBuf 2.5? Based on the ticket -
>> > https://spark-project.atlassian.net/browse/SPARK-995 - we should be able to
>> > use a different version of protobuf in the application.
>> >
>> >
>> >
>> >
>> >
>> > --
>> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>>
>
>
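The jar listings in this post show both the real com/google/protobuf classes and the shaded com/google/protobuf_spark ones in the same assembly, which raises the question of which copy a given class is actually loaded from at runtime. A minimal diagnostic sketch (the `WhichProtobuf` object and `jarOf` helper are hypothetical names, not from the thread):

```scala
// Diagnostic sketch: report which jar a class was actually loaded from, to
// see whether the assembly's protobuf 2.4.x or the application's protobuf 2.5
// wins on the classpath.
object WhichProtobuf {
  // Returns the location (usually a jar URL) of the class's code source,
  // or None for bootstrap classes, which have no code source.
  def jarOf(c: Class[_]): Option[String] =
    Option(c.getProtectionDomain.getCodeSource).map(_.getLocation.toString)

  def main(args: Array[String]): Unit = {
    // java.lang.String is a bootstrap class, so this prints None.
    println(s"String -> ${jarOf(classOf[String])}")
    // Inside the Spark application one would probe the protobuf class itself:
    //   println(jarOf(Class.forName("com.google.protobuf.Message")))
  }
}
```

Run inside the application (or a spark-shell) against the class in question; the printed jar path reveals which protobuf copy is winning.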
Re: Using ProtoBuf 2.5 for messages with Spark Streaming
Posted by Patrick Wendell <pw...@gmail.com>.
Okay so I think the issue here is just a conflict between your application
code and the Hadoop code.
Hadoop 2.0.0 depends on protobuf 2.4.0a:
https://svn.apache.org/repos/asf/hadoop/common/tags/release-2.0.0-alpha/hadoop-project/pom.xml
Your code depends on protobuf 2.5.x.
The protobuf library is not binary compatible between these two versions
(unfortunately). This means that your application will have to shade
protobuf 2.5.x, or you will have to upgrade to a version of Hadoop that is
compatible.
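For the sbt builds used in this thread, the shading described above could be expressed in the application's own build. A hedged sketch, assuming a later sbt-assembly release with shade-rule support (the feature postdates this thread) and an arbitrary shaded.protobuf target package:

```scala
// build.sbt fragment (assumes sbt-assembly >= 0.14 with shading support).
// Relocate protobuf classes in the application assembly into a private
// package so they cannot collide with the protobuf 2.4.x that Hadoop 2.0.0
// puts on the cluster classpath.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "shaded.protobuf.@1").inAll
)
```

The alternative mentioned above - upgrading to a Hadoop version built against protobuf 2.5 - avoids shading entirely.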
On Wed, Apr 9, 2014 at 1:03 PM, Kanwaldeep <ka...@gmail.com> wrote:
> Any update on this? We are still facing this issue.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396p4015.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
Re: Using ProtoBuf 2.5 for messages with Spark Streaming
Posted by Kanwaldeep <ka...@gmail.com>.
Any update on this? We are still facing this issue.
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396p4015.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
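SPARK-995, linked earlier in the thread, concerns letting application classes win over the ones bundled with Spark. A hedged sketch of how that might be switched on, assuming the experimental spark.files.userClassPathFirst property (its exact name, availability, and semantics should be verified against the Spark version in use):

```scala
// Configuration fragment: ask executors to prefer the application jar's
// classes (including its protobuf 2.5) over the assembly's copies.
// The property name is an assumption to verify per Spark version.
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("protobuf-2.5-app")
  .set("spark.files.userClassPathFirst", "true")
```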
Re: Using ProtoBuf 2.5 for messages with Spark Streaming
Posted by Vipul Pandey <vi...@gmail.com>.
Any word on this one?
On Apr 2, 2014, at 12:26 AM, Vipul Pandey <vi...@gmail.com> wrote:
> I downloaded 0.9.0 fresh and ran the mvn command - the assembly jar thus generated also has both the shaded and the real versions of the protobuf classes.
>
> Vipuls-MacBook-Pro-3:spark-0.9.0-incubating vipul$ jar -ftv ./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar | grep proto | grep /Message
> 1190 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageOrBuilder.class
> 2913 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message$Builder.class
> 704 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLite.class
> 1904 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLite$Builder.class
> 257 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLiteOrBuilder.class
> 508 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message.class
> 2661 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message$Builder.class
> 478 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message.class
> 1748 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite$Builder.class
> 668 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite.class
> 245 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLiteOrBuilder.class
> 1112 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageOrBuilder.class
>
>
>
>
>
> On Apr 1, 2014, at 11:44 PM, Patrick Wendell <pw...@gmail.com> wrote:
>
>> It's this: mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package
>>
>>
>> On Tue, Apr 1, 2014 at 5:15 PM, Vipul Pandey <vi...@gmail.com> wrote:
>> How do you recommend building that? It says
>> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly (default-cli) on project spark-0.9.0-incubating: Error reading assemblies: No assembly descriptors found. -> [Help 1]
>> upon running
>> mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly
>>
>>
>> On Apr 1, 2014, at 4:13 PM, Patrick Wendell <pw...@gmail.com> wrote:
>>
>>> Do you get the same problem if you build with maven?
>>>
>>>
>>> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey <vi...@gmail.com> wrote:
>>> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
>>>
>>> That's all I do.
>>>
>>> On Apr 1, 2014, at 11:41 AM, Patrick Wendell <pw...@gmail.com> wrote:
>>>
>>>> Vipul - could you show exactly what flags/commands you are using when you build Spark to produce this assembly?
>>>>
>>>>
>>>> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey <vi...@gmail.com> wrote:
>>>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be getting pulled in unless you are directly using Akka yourself. Are you?
>>>>
>>>> No, I'm not. Although I see that protobuf libraries are directly pulled into the 0.9.0 assembly jar, I do see the shaded version as well.
>>>> e.g. below for Message.class
>>>>
>>>> -bash-4.1$ jar -ftv ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar | grep protobuf | grep /Message.class
>>>> 478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>>>> 508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
>>>>
>>>>
>>>>> Does your project have other dependencies that might be indirectly pulling in protobuf 2.4.1? It would be helpful if you could list all of your dependencies including the exact Spark version and other libraries.
>>>>
>>>> I did have another one, which I moved to the end of the classpath - I even ran partial code without that dependency, but it still failed whenever I used the jar with the ScalaBuff dependency.
>>>> Spark version is 0.9.0
>>>>
>>>>
>>>> ~Vipul
>>>>
>>>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell <pw...@gmail.com> wrote:
>>>>
>>>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be getting pulled in unless you are directly using Akka yourself. Are you?
>>>>>
>>>>> Does your project have other dependencies that might be indirectly pulling in protobuf 2.4.1? It would be helpful if you could list all of your dependencies including the exact Spark version and other libraries.
>>>>>
>>>>> - Patrick
>>>>>
>>>>>
>>>>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey <vi...@gmail.com> wrote:
>>>>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same issue. Any word on this one?
>>>>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep <ka...@gmail.com> wrote:
>>>>>
>>>>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
>>>>> > a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar deployed
>>>>> > on each of the Spark worker nodes.
>>>>> > The message is compiled using 2.5, but at runtime it is being
>>>>> > de-serialized by 2.4.1, as I'm getting the following exception:
>>>>> >
>>>>> > java.lang.VerifyError (java.lang.VerifyError: class
>>>>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>>>>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>>>>> > java.lang.ClassLoader.defineClass1(Native Method)
>>>>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>> >
>>>>> > Any suggestions on how I could still use ProtoBuf 2.5? Based on the ticket -
>>>>> > https://spark-project.atlassian.net/browse/SPARK-995 - we should be able to
>>>>> > use a different version of protobuf in the application.
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> > --
>>>>> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>>>>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>
Re: Using ProtoBuf 2.5 for messages with Spark Streaming
Posted by Vipul Pandey <vi...@gmail.com>.
I downloaded 0.9.0 fresh and ran the mvn command - the assembly jar thus generated also has both the shaded and the real versions of the protobuf classes.
Vipuls-MacBook-Pro-3:spark-0.9.0-incubating vipul$ jar -ftv ./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar | grep proto | grep /Message
1190 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageOrBuilder.class
2913 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message$Builder.class
704 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLite.class
1904 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLite$Builder.class
257 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLiteOrBuilder.class
508 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message.class
2661 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message$Builder.class
478 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message.class
1748 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite$Builder.class
668 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite.class
245 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLiteOrBuilder.class
1112 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageOrBuilder.class
On Apr 1, 2014, at 11:44 PM, Patrick Wendell <pw...@gmail.com> wrote:
> It's this: mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package
>
>
> On Tue, Apr 1, 2014 at 5:15 PM, Vipul Pandey <vi...@gmail.com> wrote:
> How do you recommend building that? It says
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly (default-cli) on project spark-0.9.0-incubating: Error reading assemblies: No assembly descriptors found. -> [Help 1]
> upon running
> mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly
>
>
> On Apr 1, 2014, at 4:13 PM, Patrick Wendell <pw...@gmail.com> wrote:
>
>> Do you get the same problem if you build with maven?
>>
>>
>> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey <vi...@gmail.com> wrote:
>> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
>>
>> That's all I do.
>>
>> On Apr 1, 2014, at 11:41 AM, Patrick Wendell <pw...@gmail.com> wrote:
>>
>>> Vipul - could you show exactly what flags/commands you are using when you build Spark to produce this assembly?
>>>
>>>
>>> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey <vi...@gmail.com> wrote:
>>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be getting pulled in unless you are directly using Akka yourself. Are you?
>>>
>>> No, I'm not. Although I see that protobuf libraries are directly pulled into the 0.9.0 assembly jar, I do see the shaded version as well.
>>> e.g. below for Message.class
>>>
>>> -bash-4.1$ jar -ftv ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar | grep protobuf | grep /Message.class
>>> 478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>>> 508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
>>>
>>>
>>>> Does your project have other dependencies that might be indirectly pulling in protobuf 2.4.1? It would be helpful if you could list all of your dependencies including the exact Spark version and other libraries.
>>>
>>> I did have another one, which I moved to the end of the classpath - I even ran partial code without that dependency, but it still failed whenever I used the jar with the ScalaBuff dependency.
>>> Spark version is 0.9.0
>>>
>>>
>>> ~Vipul
>>>
>>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell <pw...@gmail.com> wrote:
>>>
>>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be getting pulled in unless you are directly using Akka yourself. Are you?
>>>>
>>>> Does your project have other dependencies that might be indirectly pulling in protobuf 2.4.1? It would be helpful if you could list all of your dependencies including the exact Spark version and other libraries.
>>>>
>>>> - Patrick
>>>>
>>>>
>>>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey <vi...@gmail.com> wrote:
>>>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same issue. Any word on this one?
>>>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep <ka...@gmail.com> wrote:
>>>>
>>>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
>>>> > a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar deployed
>>>> > on each of the Spark worker nodes.
>>>> > The message is compiled using 2.5, but at runtime it is being
>>>> > de-serialized by 2.4.1, as I'm getting the following exception:
>>>> >
>>>> > java.lang.VerifyError (java.lang.VerifyError: class
>>>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>>>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>>>> > java.lang.ClassLoader.defineClass1(Native Method)
>>>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>> >
>>>> > Any suggestions on how I could still use ProtoBuf 2.5? Based on the ticket -
>>>> > https://spark-project.atlassian.net/browse/SPARK-995 - we should be able to
>>>> > use a different version of protobuf in the application.
>>>> >
>>>> >
>>>> >
>>>> >
>>>> >
>>>> > --
>>>> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>>>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>>
>>>
>>>
>>
>>
>
>
Re: Using ProtoBuf 2.5 for messages with Spark Streaming
Posted by Patrick Wendell <pw...@gmail.com>.
It's this: mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package
On Tue, Apr 1, 2014 at 5:15 PM, Vipul Pandey <vi...@gmail.com> wrote:
> How do you recommend building that? It says
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly
> (default-cli) on project spark-0.9.0-incubating: Error reading assemblies:
> No assembly descriptors found. -> [Help 1]
> upon running
> mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly
>
>
> On Apr 1, 2014, at 4:13 PM, Patrick Wendell <pw...@gmail.com> wrote:
>
> Do you get the same problem if you build with maven?
>
>
> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey <vi...@gmail.com> wrote:
>
>> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
>>
>> That's all I do.
>>
>> On Apr 1, 2014, at 11:41 AM, Patrick Wendell <pw...@gmail.com> wrote:
>>
>> Vipul - could you show exactly what flags/commands you are using when you
>> build Spark to produce this assembly?
>>
>>
>> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey <vi...@gmail.com> wrote:
>>
>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't
>>> be getting pulled in unless you are directly using Akka yourself. Are you?
>>>
>>> No, I'm not. Although I see that protobuf libraries are directly pulled
>>> into the 0.9.0 assembly jar - I do see the shaded version as well.
>>> e.g. below for Message.class
>>>
>>> -bash-4.1$ jar -ftv
>>> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>>> | grep protobuf | grep /Message.class
>>> 478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>>> 508 Sat Dec 14 14:20:38 PST 2013
>>> com/google/protobuf_spark/Message.class
>>>
>>>
>>> Does your project have other dependencies that might be indirectly
>>> pulling in protobuf 2.4.1? It would be helpful if you could list all of
>>> your dependencies including the exact Spark version and other libraries.
>>>
>>> I did have another one, which I moved to the end of the classpath - I even ran
>>> partial code without that dependency, but it still failed whenever I used the
>>> jar with the ScalaBuff dependency.
>>> Spark version is 0.9.0
>>>
>>>
>>> ~Vipul
>>>
>>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell <pw...@gmail.com> wrote:
>>>
>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't
>>> be getting pulled in unless you are directly using Akka yourself. Are you?
>>>
>>> Does your project have other dependencies that might be indirectly
>>> pulling in protobuf 2.4.1? It would be helpful if you could list all of
>>> your dependencies including the exact Spark version and other libraries.
>>>
>>> - Patrick
>>>
>>>
>>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey <vi...@gmail.com> wrote:
>>>
>>>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
>>>> issue. Any word on this one?
>>>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep <ka...@gmail.com> wrote:
>>>>
>>>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
>>>> > a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar
>>>> > deployed on each of the Spark worker nodes.
>>>> > The message is compiled using 2.5, but at runtime it is being
>>>> > de-serialized by 2.4.1, as I'm getting the following exception:
>>>> >
>>>> > java.lang.VerifyError (java.lang.VerifyError: class
>>>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>>>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>>>> > java.lang.ClassLoader.defineClass1(Native Method)
>>>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>> >
>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>> >
>>>> > Any suggestions on how I could still use ProtoBuf 2.5? Based on the ticket -
>>>> > https://spark-project.atlassian.net/browse/SPARK-995 - we should be able to
>>>> > use a different version of protobuf in the application.
>>>> >
>>>> >
>>>> >
>>>> >
>>>> >
>>>> > --
>>>> > View this message in context:
>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>>>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>>
>>>
>>>
>>
>>
>
>
Re: Using ProtoBuf 2.5 for messages with Spark Streaming
Posted by Vipul Pandey <vi...@gmail.com>.
How do you recommend building that? It says
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly (default-cli) on project spark-0.9.0-incubating: Error reading assemblies: No assembly descriptors found. -> [Help 1]
upon running
mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly
On Apr 1, 2014, at 4:13 PM, Patrick Wendell <pw...@gmail.com> wrote:
> Do you get the same problem if you build with maven?
>
>
> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey <vi...@gmail.com> wrote:
> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
>
> That's all I do.
>
> On Apr 1, 2014, at 11:41 AM, Patrick Wendell <pw...@gmail.com> wrote:
>
>> Vipul - could you show exactly what flags/commands you are using when you build Spark to produce this assembly?
>>
>>
>> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey <vi...@gmail.com> wrote:
>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be getting pulled in unless you are directly using Akka yourself. Are you?
>>
>> No, I'm not. Although I see that protobuf libraries are directly pulled into the 0.9.0 assembly jar, I do see the shaded version as well.
>> e.g. below for Message.class
>>
>> -bash-4.1$ jar -ftv ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar | grep protobuf | grep /Message.class
>> 478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>> 508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
>>
>>
>>> Does your project have other dependencies that might be indirectly pulling in protobuf 2.4.1? It would be helpful if you could list all of your dependencies including the exact Spark version and other libraries.
>>
>> I did have another one, which I moved to the end of the classpath - I even ran partial code without that dependency, but it still failed whenever I used the jar with the ScalaBuff dependency.
>> Spark version is 0.9.0
>>
>>
>> ~Vipul
>>
>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell <pw...@gmail.com> wrote:
>>
>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be getting pulled in unless you are directly using Akka yourself. Are you?
>>>
>>> Does your project have other dependencies that might be indirectly pulling in protobuf 2.4.1? It would be helpful if you could list all of your dependencies including the exact Spark version and other libraries.
>>>
>>> - Patrick
>>>
>>>
>>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey <vi...@gmail.com> wrote:
>>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same issue. Any word on this one?
>>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep <ka...@gmail.com> wrote:
>>>
>>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
>>> > a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar deployed
>>> > on each of the Spark worker nodes.
>>> > The message is compiled using 2.5, but at runtime it is being
>>> > de-serialized by 2.4.1, as I'm getting the following exception:
>>> >
>>> > java.lang.VerifyError (java.lang.VerifyError: class
>>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>>> > java.lang.ClassLoader.defineClass1(Native Method)
>>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>> >
>>> > Any suggestions on how I could still use ProtoBuf 2.5? Based on the ticket -
>>> > https://spark-project.atlassian.net/browse/SPARK-995 - we should be able to
>>> > use a different version of protobuf in the application.
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > --
>>> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>>
>>
>>
>
>
Re: Using ProtoBuf 2.5 for messages with Spark Streaming
Posted by Patrick Wendell <pw...@gmail.com>.
Do you get the same problem if you build with maven?
On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey <vi...@gmail.com> wrote:
> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
>
> That's all I do.
>
> On Apr 1, 2014, at 11:41 AM, Patrick Wendell <pw...@gmail.com> wrote:
>
> Vipul - could you show exactly what flags/commands you are using when you
> build Spark to produce this assembly?
>
>
> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey <vi...@gmail.com> wrote:
>
>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't
>> be getting pulled in unless you are directly using Akka yourself. Are you?
>>
>> No, I'm not. Although I see that protobuf libraries are directly pulled
>> into the 0.9.0 assembly jar - I do see the shaded version as well.
>> e.g. below for Message.class
>>
>> -bash-4.1$ jar -ftv
>> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>> | grep protobuf | grep /Message.class
>> 478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>> 508 Sat Dec 14 14:20:38 PST 2013
>> com/google/protobuf_spark/Message.class
>>
>>
>> Does your project have other dependencies that might be indirectly
>> pulling in protobuf 2.4.1? It would be helpful if you could list all of
>> your dependencies including the exact Spark version and other libraries.
>>
>> I did have another one, which I moved to the end of the classpath - I even ran
>> partial code without that dependency, but it still failed whenever I used the
>> jar with the ScalaBuff dependency.
>> Spark version is 0.9.0
>>
>>
>> ~Vipul
>>
>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell <pw...@gmail.com> wrote:
>>
>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't
>> be getting pulled in unless you are directly using Akka yourself. Are you?
>>
>> Does your project have other dependencies that might be indirectly
>> pulling in protobuf 2.4.1? It would be helpful if you could list all of
>> your dependencies including the exact Spark version and other libraries.
>>
>> - Patrick
>>
>>
>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey <vi...@gmail.com> wrote:
>>
>>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
>>> issue. Any word on this one?
>>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep <ka...@gmail.com> wrote:
>>>
>>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
>>> > a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar
>>> > deployed on each of the Spark worker nodes.
>>> > The message is compiled using 2.5, but at runtime it is being
>>> > de-serialized by 2.4.1, as I'm getting the following exception:
>>> >
>>> > java.lang.VerifyError (java.lang.VerifyError: class
>>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>>> > java.lang.ClassLoader.defineClass1(Native Method)
>>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>> >
>>> > Any suggestions on how I could still use ProtoBuf 2.5? Based on the ticket -
>>> > https://spark-project.atlassian.net/browse/SPARK-995 - we should be able to
>>> > use a different version of protobuf in the application.
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > --
>>> > View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>>
>>
>>
>
>