Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/10/02 14:46:33 UTC
[jira] [Comment Edited] (SPARK-1834) NoSuchMethodError when invoking JavaPairRDD.reduce() in Java
[ https://issues.apache.org/jira/browse/SPARK-1834?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14156477#comment-14156477 ]
Sean Owen edited comment on SPARK-1834 at 10/2/14 12:46 PM:
------------------------------------------------------------
Weird, I can reproduce this. It compiles fine but fails at runtime. Here's another example that doesn't even use lambdas:
{code}
@Test
public void pairReduce() {
  JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 1, 2, 3, 5, 8, 13));
  JavaPairRDD<Integer, Integer> pairRDD = rdd.mapToPair(
    new PairFunction<Integer, Integer, Integer>() {
      @Override
      public Tuple2<Integer, Integer> call(Integer i) {
        return new Tuple2<Integer, Integer>(i, i + 1);
      }
    });
  // See SPARK-1834
  Tuple2<Integer, Integer> reduced = pairRDD.reduce(
    new Function2<Tuple2<Integer, Integer>, Tuple2<Integer, Integer>, Tuple2<Integer, Integer>>() {
      @Override
      public Tuple2<Integer, Integer> call(Tuple2<Integer, Integer> t1,
                                           Tuple2<Integer, Integer> t2) {
        return new Tuple2<Integer, Integer>(t1._1() + t2._1(), t1._2() + t2._2());
      }
    });
  // 1+1+2+3+5+8+13 = 33; each second element is one larger, so 33+7 = 40
  Assert.assertEquals(33, reduced._1().intValue());
  Assert.assertEquals(40, reduced._2().intValue());
}
{code}
but...
{code}
java.lang.NoSuchMethodError: org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;
{code}
I decompiled the class and it really looks like the method is there with the expected signature:
{code}
public scala.Tuple2<K, V> reduce(org.apache.spark.api.java.function.Function2<scala.Tuple2<K, V>, scala.Tuple2<K, V>, scala.Tuple2<K, V>>);
{code}
Color me pretty confused.
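For what it's worth, a small reflection probe (my own diagnostic sketch, not from the ticket) can show exactly which {{reduce}} overloads are visible on the *runtime* classpath, to compare against the signature the calling class was compiled against. You'd pass {{org.apache.spark.api.java.JavaPairRDD}} on a Spark classpath; {{java.util.stream.Stream}} is only a stand-in default so the sketch runs on its own:

```java
import java.lang.reflect.Method;

// Diagnostic sketch: print every `reduce` method the JVM can actually
// resolve at runtime on the given class. The class name to use for
// SPARK-1834 is org.apache.spark.api.java.JavaPairRDD (from the stack
// trace); java.util.stream.Stream is just a stand-in so this compiles
// and runs without Spark on the classpath.
public class ReduceProbe {
    public static void main(String[] args) throws ClassNotFoundException {
        String className = args.length > 0 ? args[0] : "java.util.stream.Stream";
        Class<?> cls = Class.forName(className);
        for (Method m : cls.getMethods()) {
            if (m.getName().equals("reduce")) {
                // Method.toString prints the signature after erasure, which
                // is the form NoSuchMethodError complains about
                System.out.println(m);
            }
        }
    }
}
```

If the signature printed on the cluster differs from the decompiled one above, or {{reduce}} is missing entirely, that would point to a compile-time vs. runtime version mismatch rather than a source-level problem.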
was (Author: srowen):
Weird, I can reproduce this. I have a new test case for {{JavaAPISuite}} and am investigating. It compiles fine but fails at runtime. I sense Scala shenanigans.
> NoSuchMethodError when invoking JavaPairRDD.reduce() in Java
> ------------------------------------------------------------
>
> Key: SPARK-1834
> URL: https://issues.apache.org/jira/browse/SPARK-1834
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 0.9.1
> Environment: Redhat Linux, Java 7, Hadoop 2.2, Scala 2.10.4
> Reporter: John Snodgrass
>
> I get a java.lang.NoSuchMethodError when invoking JavaPairRDD.reduce(). Here is the partial stack trace:
> Exception in thread "main" java.lang.reflect.InvocationTargetException
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:601)
>   at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:39)
>   at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
> Caused by: java.lang.NoSuchMethodError: org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;
>   at JavaPairRDDReduceTest.main(JavaPairRDDReduceTest.java:49) ...
> I'm using Spark 0.9.1. I checked to ensure that I'm compiling with the same version of Spark as I am running on the cluster. The reduce() method works fine with JavaRDD, just not with JavaPairRDD. Here is a code snippet that exhibits the problem:
> ArrayList<Integer> array = new ArrayList<>();
> for (int i = 0; i < 10; ++i) {
>   array.add(i);
> }
> JavaRDD<Integer> rdd = javaSparkContext.parallelize(array);
> JavaPairRDD<String, Integer> testRDD = rdd.map(new PairFunction<Integer, String, Integer>() {
>   @Override
>   public Tuple2<String, Integer> call(Integer t) throws Exception {
>     return new Tuple2<>("" + t, t);
>   }
> }).cache();
>
> testRDD.reduce(new Function2<Tuple2<String, Integer>, Tuple2<String, Integer>, Tuple2<String, Integer>>() {
>   @Override
>   public Tuple2<String, Integer> call(Tuple2<String, Integer> arg0, Tuple2<String, Integer> arg1) throws Exception {
>     return new Tuple2<>(arg0._1 + arg1._1, arg0._2 * 10 + arg0._2);
>   }
> });
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)