Posted to user@spark.apache.org by 周千昊 <qh...@apache.org> on 2015/08/13 11:57:25 UTC

serialization issue

Hi,
    I am using Spark 1.4 and have run into an issue.
    I am trying to use the aggregate function:
    JavaRDD<String> rdd = /* some rdd */;
    HashMap<Long, TypeA> zeroValue = new HashMap<>();
    // add the initial key-value pair to zeroValue
    rdd.aggregate(zeroValue,
                   new Function2<HashMap<Long, TypeA>,
                        String,
                        HashMap<Long, TypeA>>() { /* seqOp implementation */ },
                   new Function2<HashMap<Long, TypeA>,
                        HashMap<Long, TypeA>,
                        HashMap<Long, TypeA>>() { /* combOp implementation */ });

    Here is the stack trace when I run the application:

Caused by: java.lang.ClassNotFoundException: TypeA
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:66)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at java.util.HashMap.readObject(HashMap.java:1180)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:89)
at org.apache.spark.util.Utils$.clone(Utils.scala:1458)
at org.apache.spark.rdd.RDD$$anonfun$aggregate$1.apply(RDD.scala:1049)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
at org.apache.spark.rdd.RDD.aggregate(RDD.scala:1047)
at org.apache.spark.api.java.JavaRDDLike$class.aggregate(JavaRDDLike.scala:413)
at org.apache.spark.api.java.AbstractJavaRDDLike.aggregate(JavaRDDLike.scala:47)
     *However, I have checked that TypeA is in the jar file, which is on the classpath.*
     *And when I use an empty HashMap as the zeroValue, the exception goes away.*
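
    For reference, here is a minimal sketch of the empty-zeroValue variant
that works for me; the key 0L, the TypeA constructor, and the merge logic
are placeholders:

import java.util.HashMap;
import org.apache.spark.api.java.function.Function2;

HashMap<Long, TypeA> zeroValue = new HashMap<>();  // empty: no TypeA instance to clone
HashMap<Long, TypeA> result = rdd.aggregate(
    zeroValue,
    new Function2<HashMap<Long, TypeA>, String, HashMap<Long, TypeA>>() {
        @Override
        public HashMap<Long, TypeA> call(HashMap<Long, TypeA> acc, String line) {
            if (!acc.containsKey(0L)) {
                acc.put(0L, new TypeA());  // seed the initial entry here instead of in zeroValue
            }
            // ... update acc from line ...
            return acc;
        }
    },
    new Function2<HashMap<Long, TypeA>, HashMap<Long, TypeA>, HashMap<Long, TypeA>>() {
        @Override
        public HashMap<Long, TypeA> call(HashMap<Long, TypeA> left,
                                         HashMap<Long, TypeA> right) {
            left.putAll(right);  // placeholder merge; real code would combine the values
            return left;
        }
    });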

Re: serialization issue

Posted by Anish Haldiya <an...@sigmoidanalytics.com>.
When submitting the job, you can use the --jars and --driver-class-path
options to add the jar. Apart from that, if you are running the job as a
standalone application, you can use sc.addJar, which will ship the jar
to all of the executors.
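
The stack trace shows the failure inside Utils.clone /
JavaSerializerInstance.deserialize: Spark serializes and clones the
zeroValue before running the aggregate, so TypeA must be resolvable by
the classloader Spark uses for deserialization, not just present in a
jar somewhere on disk. A minimal sketch of both approaches (the paths
and class name below are placeholders):

# at submit time
spark-submit --class com.example.MyApp \
  --jars /path/to/app-deps.jar \
  --driver-class-path /path/to/app-deps.jar \
  my-app.jar

// or programmatically, in a standalone application
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf().setAppName("aggregate-example");
JavaSparkContext sc = new JavaSparkContext(conf);
sc.addJar("/path/to/app-deps.jar");  // ships the jar to every executor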

Regards,

Anish



---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org