Posted to issues@spark.apache.org by "Jack Hu (JIRA)" <ji...@apache.org> on 2014/08/28 06:22:00 UTC

[jira] [Updated] (SPARK-3274) Spark Streaming Java API reports java.lang.ClassCastException when calling collectAsMap on JavaPairDStream

     [ https://issues.apache.org/jira/browse/SPARK-3274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jack Hu updated SPARK-3274:
---------------------------

    Description: 
Reproduce code:

// Imports required by this snippet:
//   org.apache.spark.api.java.JavaPairRDD
//   org.apache.spark.api.java.function.Function2
//   org.apache.spark.api.java.function.PairFunction
//   org.apache.spark.streaming.Time
//   scala.Tuple2
// scontext is a JavaStreamingContext; the stream reads lines from a local socket.
scontext
		.socketTextStream("localhost", 18888)
		// Pair every input line with the constant key "1".
		.mapToPair(new PairFunction<String, String, String>() {
			public Tuple2<String, String> call(String arg0) throws Exception {
				return new Tuple2<String, String>("1", arg0);
			}
		})
		// Calling collectAsMap() on the JavaPairRDD throws the ClassCastException below.
		.foreachRDD(new Function2<JavaPairRDD<String, String>, Time, Void>() {
			public Void call(JavaPairRDD<String, String> v1, Time v2) throws Exception {
				System.out.println(v2.toString() + ": " + v1.collectAsMap().toString());
				return null;
			}
		});

Exception:
java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [Lscala.Tuple2;
        at org.apache.spark.rdd.PairRDDFunctions.collectAsMap(PairRDDFunctions.scala:447)
        at org.apache.spark.api.java.JavaPairRDD.collectAsMap(JavaPairRDD.scala:464)
        at tuk.usecase.failedcall.FailedCall$1.call(FailedCall.java:90)
        at tuk.usecase.failedcall.FailedCall$1.call(FailedCall.java:88)
        at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:282)
        at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:282)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:41)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.spark.streaming.scheduler.Job.run(Job.scala:32)
        at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobS

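For anyone triaging: the cast that fails is an ordinary Java array cast, independent of Spark's streaming machinery. Below is a minimal, Spark-free sketch (illustrative only; the class name CollectAsMapSketch and all details are hypothetical, not Spark's actual implementation). It reproduces the same kind of ClassCastException with a plain Object[] — String[] stands in for scala.Tuple2[] — and then shows a workaround pattern of building the map from collected pairs instead of calling collectAsMap(), with Map.Entry standing in for Tuple2.

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Spark-free sketch of the reported failure mode and a workaround pattern.
// All names here are illustrative; none of this is Spark code.
public class CollectAsMapSketch {
    public static void main(String[] args) {
        // The reported "[Ljava.lang.Object; cannot be cast to [Lscala.Tuple2;"
        // is an array cast failure: an Object[] cannot be cast to a typed
        // array, even when every element has the right type.
        Object[] collected = new Object[] { "a", "b" };
        boolean castFailed = false;
        try {
            Object typed = (String[]) collected; // same shape of cast, throws
        } catch (ClassCastException e) {
            castFailed = true;
        }
        System.out.println("array cast failed: " + castFailed);

        // Workaround idea: skip collectAsMap() and build the map from the
        // collected pairs one by one (in Spark terms, from collect()).
        List<Map.Entry<String, String>> pairs = new ArrayList<>();
        pairs.add(new AbstractMap.SimpleEntry<>("1", "some line"));
        Map<String, String> asMap = new HashMap<>();
        for (Map.Entry<String, String> p : pairs) {
            asMap.put(p.getKey(), p.getValue());
        }
        System.out.println("map size: " + asMap.size());
    }
}
```

With Spark on the classpath, the same loop over v1.collect() inside the foreachRDD call above would avoid the failing cast.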
  was: (same reproduce code and exception as above, minus an auto-generated "// TODO" comment)

> Spark Streaming Java API reports java.lang.ClassCastException when calling collectAsMap on JavaPairDStream
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-3274
>                 URL: https://issues.apache.org/jira/browse/SPARK-3274
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 1.0.2
>            Reporter: Jack Hu
>
> Reproduce code and exception: same as in the description above.



--
This message was sent by Atlassian JIRA
(v6.2#6252)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org