Posted to reviews@spark.apache.org by ch0ice <gi...@git.apache.org> on 2018/09/06 10:32:22 UTC
[GitHub] spark issue #181: Use the Executor's ClassLoader in sc.objectFile().
Github user ch0ice commented on the issue:
https://github.com/apache/spark/pull/181
This problem arose again for me. I reproduced it while converting bytes fetched from Redis back into a protobuf message: the following code fails during deserialization, at the call Utils.deserialize(value, Utils.getContextOrSparkClassLoader()):
Map<byte[], byte[]> hResult = jedisUtil.hgetAll((REDIS_LASTPOINT_PREFIX + serviceId));
hResult.forEach((key, value) -> {
    LocationPackage.LocationMessage lastLocationPackage =
        Utils.deserialize(value, Utils.getContextOrSparkClassLoader());
    if (lastLocationPackage.getLocation().getLocationTime() >= sTime
            && lastLocationPackage.getLocation().getLocationTime() <= eTime) {
        String gpsRedisKey = REDIS_GPSLIST_PREFIX + serviceId + "_"
            + lastLocationPackage.getLocation().getTerminalId();
        redisKeys.add(gpsRedisKey);
    }
});
ERROR:
java.lang.RuntimeException: Unable to find proto buffer class
at com.google.protobuf.GeneratedMessageLite$SerializedForm.readResolve(GeneratedMessageLite.java:775)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1104)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1810)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.util.Utils$.deserialize(Utils.scala:161)
at org.apache.spark.util.Utils.deserialize(Utils.scala)
at com.location.platform.analysis.util.AnalysisUtil.lambda$getRedisData$0(AnalysisUtil.java:310)
at com.location.platform.analysis.util.AnalysisUtil$$Lambda$17/489334328.accept(Unknown Source)
at java.util.Map.forEach(Map.java:630)
at com.location.platform.analysis.util.AnalysisUtil.getRedisData(AnalysisUtil.java:307)
at com.location.platform.analysis.util.AnalysisUtil.getMongoAndRedisDataset(AnalysisUtil.java:372)
at com.location.platform.analysis.bi.TodayOdometerAnalysis.lambda$main$0(TodayOdometerAnalysis.java:96)
at com.location.platform.analysis.bi.TodayOdometerAnalysis$$Lambda$1/1413730361.accept(Unknown Source)
at java.util.ArrayList.forEach(ArrayList.java:1249)
at com.location.platform.analysis.bi.TodayOdometerAnalysis.main(TodayOdometerAnalysis.java:76)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.location.platform.common.protobuf.LocationPackage$LocationMessage
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.google.protobuf.GeneratedMessageLite$SerializedForm.readResolve(GeneratedMessageLite.java:768)
... 28 more
The problem only occurs when running on the cluster; local mode works fine.
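For anyone hitting the same ClassNotFoundException, here is a minimal sketch of the underlying technique: deserializing with an explicit ClassLoader by overriding ObjectInputStream.resolveClass, rather than letting the stream fall back to whichever loader it picks by default (on a cluster that loader may not see the application jars). The helper names (serialize, deserializeWith) are my own for illustration, not a Spark or protobuf API:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.ObjectStreamClass;
import java.io.Serializable;

public class ClassLoaderDeserializer {

    // Serialize any object to a byte array using plain java.io serialization.
    public static byte[] serialize(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.toByteArray();
    }

    // Deserialize, resolving every class through the supplied ClassLoader.
    public static Object deserializeWith(byte[] bytes, ClassLoader loader)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes)) {
            @Override
            protected Class<?> resolveClass(ObjectStreamClass desc)
                    throws IOException, ClassNotFoundException {
                try {
                    // Resolve via the caller-supplied loader first.
                    return Class.forName(desc.getName(), false, loader);
                } catch (ClassNotFoundException e) {
                    // Fall back to the default lookup (handles primitives etc.).
                    return super.resolveClass(desc);
                }
            }
        }) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] bytes = serialize("hello");
        // The thread context ClassLoader is typically the one that can see
        // the application jars on an executor.
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        System.out.println(deserializeWith(bytes, cl));
    }
}
```

Note this only helps for classes resolved through ObjectInputStream itself; in the trace above the failure is inside protobuf's GeneratedMessageLite.readResolve, which calls Class.forName on its own, so the surrounding code must run with a context ClassLoader that actually contains the application jars.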
---
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org