Posted to issues@spark.apache.org by "John Snodgrass (JIRA)" <ji...@apache.org> on 2014/05/14 21:41:15 UTC

[jira] [Created] (SPARK-1834) NoSuchMethodError when invoking JavaPairRDD.reduce() in Java

John Snodgrass created SPARK-1834:
-------------------------------------

             Summary: NoSuchMethodError when invoking JavaPairRDD.reduce() in Java
                 Key: SPARK-1834
                 URL: https://issues.apache.org/jira/browse/SPARK-1834
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 0.9.1
         Environment: Redhat Linux, Java 7, Hadoop 2.2, Scala 2.10.4
            Reporter: John Snodgrass


I get a java.lang.NoSuchMethodError when invoking JavaPairRDD.reduce(). Here is the partial stack trace:

Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:39)
        at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;
        at JavaPairRDDReduceTest.main(JavaPairRDDReduceTest.java:49)    ...
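
Since the error surfaces through reflection while the driver is starting up, one quick check is to dump the reduce() signatures that are actually visible on JavaPairRDD at runtime, on both the machine I compile on and on the cluster. This is just a diagnostic sketch (the class name ReduceSignatureCheck is mine); comparing the output on both sides should show whether the Tuple2-returning variant from the stack trace is really on the cluster's classpath:

import java.lang.reflect.Method;

import org.apache.spark.api.java.JavaPairRDD;

public class ReduceSignatureCheck {
  public static void main(String[] args) {
    // List every reduce overload the JVM can see on JavaPairRDD.
    // A NoSuchMethodError means the variant the caller was compiled
    // against is missing from this list at runtime.
    for (Method m : JavaPairRDD.class.getMethods()) {
      if (m.getName().equals("reduce")) {
        System.out.println(m);
      }
    }
  }
}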

I'm using Spark 0.9.1. I checked to ensure that I'm compiling with the same version of Spark as I am running on the cluster. The reduce() method works fine with JavaRDD, just not with JavaPairRDD. Here is a code snippet that exhibits the problem: 

      // Imports needed by this snippet: java.util.ArrayList, scala.Tuple2,
      // org.apache.spark.api.java.{JavaRDD, JavaPairRDD},
      // org.apache.spark.api.java.function.{PairFunction, Function2}

      ArrayList<Integer> array = new ArrayList<>();
      for (int i = 0; i < 10; ++i) {
        array.add(i);
      }
      JavaRDD<Integer> rdd = javaSparkContext.parallelize(array);

      // In the 0.9.x Java API, map() with a PairFunction yields a JavaPairRDD.
      JavaPairRDD<String, Integer> testRDD = rdd.map(new PairFunction<Integer, String, Integer>() {
        @Override
        public Tuple2<String, Integer> call(Integer t) throws Exception {
          return new Tuple2<>("" + t, t);
        }
      }).cache();

      // This reduce() call is the one that fails at runtime
      // (JavaPairRDDReduceTest.java:49 in the stack trace above).
      testRDD.reduce(new Function2<Tuple2<String, Integer>, Tuple2<String, Integer>, Tuple2<String, Integer>>() {
        @Override
        public Tuple2<String, Integer> call(Tuple2<String, Integer> arg0, Tuple2<String, Integer> arg1) throws Exception {
          return new Tuple2<>(arg0._1 + arg1._1, arg0._2 * 10 + arg1._2);
        }
      });
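
As a workaround, since reduce() works fine on JavaRDD, the same reduction can be expressed over a plain JavaRDD of tuples by building it with an ordinary Function instead of a PairFunction. This is only a sketch of the idea (it additionally needs org.apache.spark.api.java.function.Function imported), but it avoids the JavaPairRDD.reduce() call entirely:

      // Workaround sketch: keep the data as a JavaRDD<Tuple2<String, Integer>>,
      // where reduce() resolves correctly, instead of a JavaPairRDD.
      JavaRDD<Tuple2<String, Integer>> tupleRDD =
          rdd.map(new Function<Integer, Tuple2<String, Integer>>() {
            @Override
            public Tuple2<String, Integer> call(Integer t) throws Exception {
              return new Tuple2<>("" + t, t);
            }
          });

      Tuple2<String, Integer> result =
          tupleRDD.reduce(new Function2<Tuple2<String, Integer>, Tuple2<String, Integer>, Tuple2<String, Integer>>() {
            @Override
            public Tuple2<String, Integer> call(Tuple2<String, Integer> a, Tuple2<String, Integer> b) throws Exception {
              return new Tuple2<>(a._1 + b._1, a._2 * 10 + b._2);
            }
          });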




