Posted to user@spark.apache.org by Jonathan Chayat <jo...@supersonic.com> on 2014/12/19 12:00:44 UTC

Does Spark 1.2.0 support Scala 2.11?

The following ticket:

https://issues.apache.org/jira/browse/SPARK-1812

for supporting 2.11 has been marked as fixed in 1.2,
but the docs on the Spark site still say that 2.10 is required.

Thanks,
    Jon

Re: Does Spark 1.2.0 support Scala 2.11?

Posted by Sean Owen <so...@cloudera.com>.
You might interpret that as 2.10+. Although 2.10 is still the main
version in use, I think, you can see that 2.11 artifacts have been
published: http://search.maven.org/#artifactdetails%7Corg.apache.spark%7Cspark-core_2.11%7C1.2.0%7Cjar
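Given those published artifacts, a hedged sbt sketch of how a project could depend on the 2.11 build (coordinates taken from the Maven Central link above; the Scala patch version is an assumption):

```scala
// build.sbt fragment: depend on the Scala 2.11 cross-build of Spark 1.2.0.
// 2.11.4 is assumed here; any 2.11.x binary-compatible release should work.
scalaVersion := "2.11.4"

// %% appends the binary Scala version, so this resolves to spark-core_2.11.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
```

Using `%%` (rather than hard-coding `spark-core_2.11`) keeps the dependency in sync if the project's Scala version changes.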

On Fri, Dec 19, 2014 at 11:00 AM, Jonathan Chayat
<jo...@supersonic.com> wrote:
> The following ticket:
>
> https://issues.apache.org/jira/browse/SPARK-1812
>
> for supporting 2.11 has been marked as fixed in 1.2,
> but the docs on the Spark site still say that 2.10 is required.
>
> Thanks,
>     Jon

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Does Spark 1.2.0 support Scala 2.11?

Posted by Gerard Maas <ge...@gmail.com>.
Check out the 'Building for Scala 2.11' instructions:

http://spark.apache.org/docs/1.2.0/building-spark.html#building-for-scala-211

-kr, Gerard.
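Once an application runs against artifacts produced by those build steps, a quick sanity check (a minimal, Spark-independent sketch) is to print the Scala library version the JVM actually loaded:

```scala
// Prints the Scala library version visible at runtime; on a classpath
// built from spark-core_2.11 artifacts this should start with "2.11".
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    val v = scala.util.Properties.versionNumberString
    println(s"Scala library version: $v")
  }
}
```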

On Fri, Dec 19, 2014 at 12:00 PM, Jonathan Chayat <jonathan.c@supersonic.com> wrote:
>
> The following ticket:
>
> https://issues.apache.org/jira/browse/SPARK-1812
>
> for supporting 2.11 has been marked as fixed in 1.2,
> but the docs on the Spark site still say that 2.10 is required.
>
> Thanks,
>     Jon
>