Posted to user@spark.apache.org by Vijayasarathy Kannan <kv...@vt.edu> on 2015/03/19 20:17:13 UTC

Issues with SBT and Spark

My current simple.sbt is

name := "SparkEpiFast"

version := "1.0"

scalaVersion := "2.11.4"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.1" % "provided"

libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "1.2.1" % "provided"

When I run "sbt package", it compiles successfully. But when I run the
application, I get
"Exception in thread "main" java.lang.NoSuchMethodError:
scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;"

However, changing the Scala version to 2.10.4 and updating the dependency
lines accordingly resolves the issue (no exception).

Could anyone please point out what I am doing wrong?
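A note on the build file above: sbt's %% operator appends the _2.xx suffix that matches scalaVersion automatically, so the two cannot drift apart. A sketch of the same simple.sbt using it (illustrative, with the same coordinates as the original post):

```scala
name := "SparkEpiFast"

version := "1.0"

scalaVersion := "2.11.4"

// %% resolves to spark-core_2.11 here because scalaVersion is 2.11.x;
// changing scalaVersion to 2.10.4 would pull the _2.10 artifacts instead.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.1" % "provided"
```

With %% there is a single place (scalaVersion) that controls which cross-built artifacts are fetched, which rules out mixed _2.10/_2.11 dependencies in the build itself.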

Re: Issues with SBT and Spark

Posted by Sean Owen <so...@cloudera.com>.
No, Spark is cross-built for 2.11 too, and those are the deps being
pulled in here. This really does, however, sound like a Scala 2.10 vs
2.11 mismatch. Check, for example, that your cluster is using the same
build of Spark and that you did not package Spark with your app.
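One quick way to check for such a mismatch (an illustrative snippet, not part of the original exchange) is to print the Scala version the process actually runs on and compare it against the _2.xx suffix of the Spark dependencies:

```scala
// Illustrative: report the Scala runtime version. When submitted to the
// cluster with spark-submit, the major version printed here (2.10 vs 2.11)
// must match the suffix of the spark-core_2.xx artifact in the build.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // prints e.g. "version 2.11.4" on a 2.11 build
    println(scala.util.Properties.versionString)
  }
}
```

If the printed major version differs from the artifact suffix, binary-incompatible methods such as Predef.ArrowAssoc can disappear at runtime, producing exactly this kind of NoSuchMethodError.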

On Thu, Mar 19, 2015 at 3:36 PM, Masf <ma...@gmail.com> wrote:
> Hi
>
> Spark 1.2.1 uses Scala 2.10. Because of this, your program fails with Scala
> 2.11.
>
> Regards
>
> On Thu, Mar 19, 2015 at 8:17 PM, Vijayasarathy Kannan <kv...@vt.edu> wrote:
>>
>> My current simple.sbt is
>>
>> name := "SparkEpiFast"
>>
>> version := "1.0"
>>
>> scalaVersion := "2.11.4"
>>
>> libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.1" %
>> "provided"
>>
>> libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "1.2.1"
>> % "provided"
>>
>> When I run "sbt package", it compiles successfully. But when I run the
>> application, I get
>> "Exception in thread "main" java.lang.NoSuchMethodError:
>> scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;"
>>
>> However, changing the Scala version to 2.10.4 and updating the dependency
>> lines accordingly resolves the issue (no exception).
>>
>> Could anyone please point out what I am doing wrong?
>
>
>
>
> --
>
>
> Saludos.
> Miguel Ángel

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Issues with SBT and Spark

Posted by Masf <ma...@gmail.com>.
Hi

Spark 1.2.1 uses Scala 2.10. Because of this, your program fails with Scala
2.11.

Regards

On Thu, Mar 19, 2015 at 8:17 PM, Vijayasarathy Kannan <kv...@vt.edu> wrote:

> My current simple.sbt is
>
> name := "SparkEpiFast"
>
> version := "1.0"
>
> scalaVersion := "2.11.4"
>
> libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.1" %
> "provided"
>
> libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "1.2.1"
> % "provided"
>
> When I run "sbt package", it compiles successfully. But when I run the
> application, I get
> "Exception in thread "main" java.lang.NoSuchMethodError:
> scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;"
>
> However, changing the Scala version to 2.10.4 and updating the dependency
> lines accordingly resolves the issue (no exception).
>
> Could anyone please point out what I am doing wrong?
>



-- 


Saludos.
Miguel Ángel