Posted to user@spark.apache.org by kmurph <k....@qub.ac.uk> on 2014/12/12 15:08:16 UTC

Re: Spark 1.1.1, Hadoop 2.6 - Protobuf conflict

I also had this problem with Spark 1.1.1.  At the time I was using Hadoop
0.20.

To get around it I installed Hadoop 2.5.2 and set protobuf.version to 2.5.0
in the build command, like so:
    mvn -Phadoop-2.5 -Dhadoop.version=2.5.2 -Dprotobuf.version=2.5.0 -DskipTests clean package

So I changed Spark's pom.xml to read protobuf.version from the command
line. If I didn't explicitly set protobuf.version, the build picked up an
older version that existed somewhere on my filesystem.
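
For reference, the property in question looks roughly like this in Spark's
pom.xml (the exact default may differ in your copy); a -Dprotobuf.version
flag on the mvn command line overrides whatever is set here:

    <properties>
      ...
      <!-- default protobuf version; -Dprotobuf.version on the command line overrides it -->
      <protobuf.version>2.4.1</protobuf.version>
      ...
    </properties>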

Karen



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-1-1-Hadoop-2-6-Protobuf-conflict-tp20656p20658.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Spark 1.1.1, Hadoop 2.6 - Protobuf conflict

Posted by Sean Owen <so...@cloudera.com>.
There is no hadoop-2.5 profile. You can use hadoop-2.4 for Hadoop 2.4 and
later; that profile already sets protobuf.version to 2.5.0 for exactly
this reason. protobuf.version is also already settable on the command
line, since it is read as a Maven build property. The build does not pick
up an older version just because one happens to be somewhere on your
system; it uses whatever is appropriate for your Hadoop version. No pom
changes are needed; just follow
http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html
and set the appropriate profile and Hadoop version.
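
For a Hadoop 2.6.0 build, for example, that would look something like this
(see the page above for the full set of profiles):

    # the hadoop-2.4 profile covers Hadoop 2.4+ and already sets protobuf.version to 2.5.0
    mvn -Phadoop-2.4 -Dhadoop.version=2.6.0 -DskipTests clean package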

On Fri, Dec 12, 2014 at 2:08 PM, kmurph <k....@qub.ac.uk> wrote:
> I had this problem also with spark 1.1.1.  At the time I was using hadoop
> 0.20.
>
> To get around it I installed hadoop 2.5.2, and set the protobuf.version to
> 2.5.0 in the build command like so:
>     mvn -Phadoop-2.5 -Dhadoop.version=2.5.2 -Dprotobuf.version=2.5.0
> -DskipTests clean package
>
> So I changed spark's pom.xml to read the protobuf.version from the command
> line.
> If I didn't explicitly set protobuf.version it was picking up an older
> version that existed on my filesystem somewhere,
>
> Karen

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org