Posted to user@spark.apache.org by "vishal.verma" <vi...@gmail.com> on 2020/03/02 14:39:37 UTC
Java Spark UDF cast exception
*Facing a casting issue while working with a Spark UDF:*
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import org.apache.spark.sql.api.java.UDF1;

import scala.collection.JavaConverters;
import scala.collection.mutable.WrappedArray;

UDF1<WrappedArray<Map<Double, Integer>>, String> mode1 =
    new UDF1<WrappedArray<Map<Double, Integer>>, String>() {
        @Override
        public String call(WrappedArray<Map<Double, Integer>> maps) throws Exception {
            // asJava() converts only the outer Seq; the elements are still
            // Scala maps at runtime, despite the declared java.util.Map type.
            List<Map<Double, Integer>> lis =
                (List<Map<Double, Integer>>) JavaConverters.seqAsJavaListConverter(maps).asJava();
            Map<Double, Integer> a = lis.stream()
                .flatMap(map -> map.entrySet().stream()) // ClassCastException thrown here
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
            System.out.println(a.get(key)); // `key` is defined elsewhere in the job
            return "";
        }
    };
*error:*
Caused by: java.lang.ClassCastException:
    scala.collection.immutable.Map$Map1 cannot be cast to java.util.Map
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:269)
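The exception occurs because seqAsJavaListConverter converts only the outer WrappedArray; each element is still a scala.collection.immutable.Map, which cannot be cast to java.util.Map. One likely fix (an assumption, not tested against the poster's job) is to declare the UDF input as WrappedArray<scala.collection.Map<Double, Integer>> and convert each inner map with JavaConverters.mapAsJavaMapConverter(...).asJava() before streaming. Once every element actually is a java.util.Map, the merge pipeline itself is sound. A minimal, Spark-free sketch of that merge step with plain java.util maps:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class MergeMaps {
    // Same pipeline as in the UDF: flatten the entry sets of several maps
    // and collect them into a single map. As in the original code,
    // Collectors.toMap would throw on duplicate keys.
    static Map<Double, Integer> merge(List<Map<Double, Integer>> lis) {
        return lis.stream()
                .flatMap(map -> map.entrySet().stream())
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
    }

    public static void main(String[] args) {
        Map<Double, Integer> merged = merge(List.of(
                Map.of(1.0, 10, 2.0, 20),
                Map.of(3.0, 30)));
        System.out.println(merged.get(2.0)); // prints 20
    }
}
```

The class name MergeMaps and the sample values are illustrative only; inside the UDF the input list would come from the converted Scala collection rather than List.of.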
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/