Posted to user@spark.apache.org by Lanny Ripple <la...@spotright.com> on 2015/08/24 21:48:21 UTC

spark and scala-2.11

Hello,

The instructions for building Spark against scala-2.11 indicate using
-Dspark-2.11.  When I look in the pom.xml I find a profile named
'spark-2.11' but nothing that would indicate I should set a property.  The
sbt build seems to need the -Dscala-2.11 property set.  Finally, build/mvn
does a simple grep of scala.version (which doesn't change after running
dev/change-version-to-2.11.sh), so the build seems to be grabbing the
2.10.4 Scala library.
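To show what I mean, here is roughly what that check amounts to -- a
simplified sketch, not the actual build/mvn script, using a throwaway
sample pom rather than Spark's real one:

```shell
# Sketch of the kind of scala.version grep build/mvn performs; the real
# script's logic may differ.  Write a minimal pom, then pull the
# scala.version property out of it with grep + sed.
cat > /tmp/sample-pom.xml <<'EOF'
<project>
  <properties>
    <scala.version>2.10.4</scala.version>
  </properties>
</project>
EOF

grep -o '<scala.version>[^<]*</scala.version>' /tmp/sample-pom.xml \
  | sed -e 's/<[^>]*>//g'
```

Run against a pom still carrying 2.10.4, that prints 2.10.4, which is
what made me suspect the wrong Scala library was being picked up.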

Anyone know (from having done it and used it in production) if the build
instructions for spark-1.4.1 against Scala-2.11 are correct?

Thanks.
  -Lanny

Re: spark and scala-2.11

Posted by Lanny Ripple <la...@spotright.com>.
We're going to be upgrading from Spark 1.0.2 and are using hadoop-1.2.1, so
we need to build by hand.  (Yes, I know. Use hadoop-2.x, but standard
resource constraints apply.)  I want to build against scala-2.11 and publish
to our artifact repository, but finding build/scala-2.10.4 and tracing down
what build/mvn was doing had me concerned that I was missing something.
I'll hold the course and build it as instructed.

Thanks for the info, all.

PS - Since you asked:

  PATH=./build/apache-maven-3.2.5/bin:$PATH
  build/mvn -Phadoop-1 -Dhadoop.version=1.2.1 -Dscala-2.11 -DskipTests package
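Spelled out in order, the full sequence I'm planning looks like this -- a
sketch assuming the profile names and dev/ script from the 1.4.1 build
docs, not something I've verified end to end yet:

```shell
# Hedged sketch of a Scala 2.11 build of Spark 1.4.1 against Hadoop 1.2.1.
# Flag and profile names are taken from the build docs as-is.
export PATH=./build/apache-maven-3.2.5/bin:$PATH

# 1. Rewrite the poms for Scala 2.11 first; otherwise the published
#    artifacts keep their _2.10 suffixes.
./dev/change-version-to-2.11.sh

# 2. Then build with the scala-2.11 property set.
./build/mvn -Phadoop-1 -Dhadoop.version=1.2.1 -Dscala-2.11 -DskipTests package
```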

On Mon, Aug 24, 2015 at 2:49 PM, Jonathan Coveney <jc...@gmail.com>
wrote:

> I've used the instructions and it worked fine.
>
> Can you post exactly what you're doing, and what it fails with? Or are you
> just trying to understand how it works?

Re: spark and scala-2.11

Posted by Jonathan Coveney <jc...@gmail.com>.
I've used the instructions and it worked fine.

Can you post exactly what you're doing, and what it fails with? Or are you
just trying to understand how it works?


Re: spark and scala-2.11

Posted by Sean Owen <so...@cloudera.com>.
The property "scala-2.11" triggers the profile "scala-2.11" -- and
additionally disables the scala-2.10 profile, so that's the way to do
it. But yes, you also need to run the script beforehand to set up the
build for Scala 2.11.
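For what it's worth, this is plain Maven property-based profile
activation.  The relevant pom entry looks roughly like the following --
a simplified sketch, not the exact Spark pom, and the version number is
illustrative:

```xml
<!-- Sketch of a property-activated profile, in the style of Spark's
     pom.xml.  Passing -Dscala-2.11 (the property's presence, not its
     value) activates the profile; the real profile sets more
     properties than shown here. -->
<profile>
  <id>scala-2.11</id>
  <activation>
    <property>
      <name>scala-2.11</name>
    </property>
  </activation>
  <properties>
    <scala.version>2.11.6</scala.version>
  </properties>
</profile>
```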

