Posted to user@spark.apache.org by Jared Rodriguez <jr...@kitedesk.com> on 2014/04/28 12:59:30 UTC

NoSuchMethodError from Spark Java

I am seeing the following exception from a very basic test project when it
runs on Spark in local mode.

java.lang.NoSuchMethodError:
org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;

The project is built with Java 1.6, Scala 2.10.3, and Spark 0.9.1.

The error occurs on mapped.reduce() below.

The code is quite simple:

JavaSparkContext sc = new JavaSparkContext("local[4]", "My Test App");

List<String> rd = buildData();

JavaRDD<String> data = sc.parallelize(rd);


JavaPairRDD<String, List<String>> mapped = data.map(
    new PairFunction<String, String, List<String>>() {
      @Override
      public Tuple2<String, List<String>> call(String value) throws Exception {
        // randomly assign matches values between 1 and 4
        return new Tuple2<String, List<String>>(value, matches);
      }
    });

mapped.reduce(
    new Function2<Tuple2<String, List<String>>, Tuple2<String, List<String>>,
        Tuple2<String, List<String>>>() {
      @Override
      public Tuple2<String, List<String>> call(Tuple2<String, List<String>> t1,
          Tuple2<String, List<String>> t2) throws Exception {
        System.err.println("REDUCING " + t1 + " with " + t2);
        return t1;
      }
    });



-- 
Jared Rodriguez

Re: NoSuchMethodError from Spark Java

Posted by wxhsdp <wx...@gmail.com>.
Hi Patrick,

I checked out https://github.com/apache/spark/ this morning and built
spark/trunk with ./sbt/sbt assembly.

Is that Spark 1.0?

If so, how can I update my sbt file? The latest version in
http://repo1.maven.org/maven2/org/apache/spark/
is 0.9.1.

Thank you for your help.



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5094.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: NoSuchMethodError from Spark Java

Posted by Andras Nemeth <an...@lynxanalytics.com>.
On 30 Apr 2014 06:59, "Patrick Wendell" <pw...@gmail.com> wrote:
>
> The signature of this function was changed in Spark 1.0... is there
> any chance that somehow you are actually running against a newer
> version of Spark?
>
> On Tue, Apr 29, 2014 at 8:58 PM, wxhsdp <wx...@gmail.com> wrote:
> > I ran into the same problem when I updated to Spark 0.9.1
> > (svn checkout https://github.com/apache/spark/)
> >
> > Exception in thread "main" java.lang.NoSuchMethodError:
> > org.apache.spark.SparkContext$.jarOfClass(Ljava/lang/Class;)Lscala/collection/Seq;
> >         at org.apache.spark.examples.GroupByTest$.main(GroupByTest.scala:38)
> >         at org.apache.spark.examples.GroupByTest.main(GroupByTest.scala)
> >
> > build.sbt:
> > name := "GroupByTest"
> >
> > version := "1.0"
> >
> > scalaVersion := "2.10.4"
> >
> > libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
> >
> > resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
> >
> > Is there something I need to modify?
> >
> >
> >
> >
> > --
> > View this message in context:
> > http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5076.html
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: NoSuchMethodError from Spark Java

Posted by Marcelo Vanzin <va...@cloudera.com>.
Hi,

One thing you can do is set the Spark version your project depends on
to "1.0.0-SNAPSHOT" (make sure it matches the version of Spark you're
building); then, before building your project, run "sbt publishLocal"
in the Spark tree.
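
Marcelo's suggestion amounts to a build.sbt along these lines (a sketch:
"1.0.0-SNAPSHOT" is an assumption and must match whatever version the
Spark checkout actually publishes, and "sbt publishLocal" must have been
run in the Spark tree first):

```scala
// build.sbt -- depend on the locally published Spark snapshot.
// sbt consults the local Ivy repository by default, so the artifact
// published by `sbt publishLocal` in the Spark tree resolves from there.
name := "GroupByTest"

version := "1.0"

scalaVersion := "2.10.4"

// Assumed version string; use the one your Spark build reports.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0-SNAPSHOT"
```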

On Wed, Apr 30, 2014 at 12:11 AM, wxhsdp <wx...@gmail.com> wrote:
> I fixed it.
>
> I made my sbt project depend on
> spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
> and it works.
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5096.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.



-- 
Marcelo

Re: NoSuchMethodError from Spark Java

Posted by wxhsdp <wx...@gmail.com>.
I fixed it.

I made my sbt project depend on
spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
and it works.
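
Depending on the assembly jar directly can be expressed in sbt as an
unmanaged dependency; a minimal sketch (the jar path is the one quoted
above and will differ per build and Hadoop version):

```scala
// build.sbt -- point sbt at the Spark assembly jar built from trunk.
// The path is taken from the message above; adjust it to your tree.
unmanagedJars in Compile += file(
  "spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar")
```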



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5096.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: NoSuchMethodError from Spark Java

Posted by Patrick Wendell <pw...@gmail.com>.
The signature of this function was changed in Spark 1.0... is there
any chance that somehow you are actually running against a newer
version of Spark?
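
One way to check which version is actually on the runtime classpath is to
ask the JVM where it loaded a Spark class from. This is a generic
diagnostic, not from the thread; the `WhichJar` class and its `locate`
helper are hypothetical names, and in the real project you would pass
`org.apache.spark.api.java.JavaPairRDD.class` instead of the placeholder:

```java
// Prints the jar (or directory) a class was loaded from, which exposes
// version mismatches hiding behind a NoSuchMethodError.
public class WhichJar {
    static String locate(Class<?> cls) {
        java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
        // Classes from the bootstrap classloader have no CodeSource.
        return src == null ? "(bootstrap classpath)" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // In the failing project, pass org.apache.spark.api.java.JavaPairRDD.class here.
        System.out.println(locate(WhichJar.class));
    }
}
```

Running this with the application's exact classpath shows whether the jar
the code was compiled against and the jar it runs against are the same.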

On Tue, Apr 29, 2014 at 8:58 PM, wxhsdp <wx...@gmail.com> wrote:
> I ran into the same problem when I updated to Spark 0.9.1
> (svn checkout https://github.com/apache/spark/)
>
> Exception in thread "main" java.lang.NoSuchMethodError:
> org.apache.spark.SparkContext$.jarOfClass(Ljava/lang/Class;)Lscala/collection/Seq;
>         at org.apache.spark.examples.GroupByTest$.main(GroupByTest.scala:38)
>         at org.apache.spark.examples.GroupByTest.main(GroupByTest.scala)
>
> build.sbt:
> name := "GroupByTest"
>
> version := "1.0"
>
> scalaVersion := "2.10.4"
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
>
> resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>
> Is there something I need to modify?
>
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5076.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: NoSuchMethodError from Spark Java

Posted by wxhsdp <wx...@gmail.com>.
I ran into the same problem when I updated to Spark 0.9.1
(svn checkout https://github.com/apache/spark/)

Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.SparkContext$.jarOfClass(Ljava/lang/Class;)Lscala/collection/Seq;
	at org.apache.spark.examples.GroupByTest$.main(GroupByTest.scala:38)
	at org.apache.spark.examples.GroupByTest.main(GroupByTest.scala)

build.sbt:
name := "GroupByTest"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

Is there something I need to modify?




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5076.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.