Posted to dev@spark.apache.org by Sea <26...@qq.com> on 2015/08/13 12:43:28 UTC

Re: please help with ClassNotFoundException

Are you using 1.4.0?  If yes, use 1.4.1




------------------ Original Message ------------------
From: "周千昊" <qh...@apache.org>
Sent: Thursday, August 13, 2015, 6:04 PM
To: "dev" <de...@spark.apache.org>
Subject: please help with ClassNotFoundException



Hi,
    I am using Spark 1.4 and have run into an issue.
    I am trying to use the aggregate function:
    JavaRDD<String> rdd = /* some RDD of strings */;
    HashMap<Long, TypeA> zeroValue = new HashMap<>();
    // add initial key-value pair for zeroValue
    rdd.aggregate(zeroValue,
                  // seqOp: fold one String element into the accumulator
                  new Function2<HashMap<Long, TypeA>, String,
                                HashMap<Long, TypeA>>() { /* implementation */ },
                  // combOp: merge two partial accumulators; its type must be (U, U) -> U
                  new Function2<HashMap<Long, TypeA>, HashMap<Long, TypeA>,
                                HashMap<Long, TypeA>>() { /* implementation */ });


    Here is the stack trace when I run the application:


Caused by: java.lang.ClassNotFoundException: TypeA
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:274)
	at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:66)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
	at java.util.HashMap.readObject(HashMap.java:1180)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:89)
	at org.apache.spark.util.Utils$.clone(Utils.scala:1458)
	at org.apache.spark.rdd.RDD$$anonfun$aggregate$1.apply(RDD.scala:1049)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
	at org.apache.spark.rdd.RDD.aggregate(RDD.scala:1047)
	at org.apache.spark.api.java.JavaRDDLike$class.aggregate(JavaRDDLike.scala:413)
	at org.apache.spark.api.java.AbstractJavaRDDLike.aggregate(JavaRDDLike.scala:47)

     However, I have checked that TypeA is in the jar file, which is on the classpath.
    Note that the trace fails inside Utils.clone / JavaSerializerInstance.deserialize, i.e. while Spark is cloning the zeroValue on the driver, before any task runs.
    When I use an empty HashMap as the zeroValue, the exception goes away.
    Has anyone met the same problem, or can anyone help me with it?
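
    (A minimal sketch of the workaround implied above, assuming rdd and
TypeA are the ones from the original post: keep the zeroValue empty so
Spark has no user-defined object to clone on the driver, and seed the
initial entry inside seqOp instead. The key 0L and the merge logic are
illustrative assumptions, not taken from this thread; note that every
partition seeds its own copy, so combOp must tolerate the duplicate
initial entry.)

    import java.util.HashMap;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.function.Function2;

    // seqOp: lazily seed the per-partition accumulator, then fold in one element
    Function2<HashMap<Long, TypeA>, String, HashMap<Long, TypeA>> seqOp =
        new Function2<HashMap<Long, TypeA>, String, HashMap<Long, TypeA>>() {
            @Override
            public HashMap<Long, TypeA> call(HashMap<Long, TypeA> acc, String s) {
                if (acc.isEmpty()) {
                    acc.put(0L, new TypeA());  // hypothetical initial entry
                }
                // ... fold s into acc here ...
                return acc;
            }
        };

    // combOp: merge two partial accumulators coming from different partitions
    Function2<HashMap<Long, TypeA>, HashMap<Long, TypeA>, HashMap<Long, TypeA>> combOp =
        new Function2<HashMap<Long, TypeA>, HashMap<Long, TypeA>, HashMap<Long, TypeA>>() {
            @Override
            public HashMap<Long, TypeA> call(HashMap<Long, TypeA> a, HashMap<Long, TypeA> b) {
                a.putAll(b);  // placeholder merge; real code would combine the values
                return a;
            }
        };

    // empty zeroValue: nothing of type TypeA gets serialized on the driver
    HashMap<Long, TypeA> result =
        rdd.aggregate(new HashMap<Long, TypeA>(), seqOp, combOp);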

Re: please help with ClassNotFoundException

Posted by 周千昊 <z....@gmail.com>.
Hi Sea,
     I have updated Spark to 1.4.1; however, the problem still exists. Any
ideas?
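
    (For what it is worth, the stack trace shows the bare class name
"TypeA" failing to resolve inside Spark's JavaDeserializationStream,
which per the trace goes through Class.forName. A hypothetical
driver-side check, not from this thread, is to compare the classloaders
involved before calling aggregate:)

    // Hypothetical diagnostic: can the context classloader see TypeA?
    ClassLoader ctx = Thread.currentThread().getContextClassLoader();
    System.out.println("context classloader: " + ctx);
    System.out.println("TypeA was loaded by: " + TypeA.class.getClassLoader());
    try {
        // mimic the lookup the deserializer performs (Class.forName in the trace)
        Class.forName("TypeA", false, ctx);
        System.out.println("context classloader resolves TypeA");
    } catch (ClassNotFoundException e) {
        System.out.println("context classloader cannot resolve TypeA: " + e);
    }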

Sea <26...@qq.com> wrote on Friday, August 14, 2015 at 12:36 AM:

> Yes, I guess so. I have seen this bug before.
>
>
> ------------------ Original Message ------------------
> *From:* "周千昊" <z....@gmail.com>
> *Sent:* Thursday, August 13, 2015, 9:30 PM
> *To:* "Sea" <26...@qq.com>; "dev@spark.apache.org" <dev@spark.apache.org>
> *Subject:* Re: please help with ClassNotFoundException
>
> Hi Sea,
>     Is it the same issue as
> https://issues.apache.org/jira/browse/SPARK-8368
>
> Sea <26...@qq.com> wrote on Thursday, August 13, 2015 at 6:52 PM:
>
>> Are you using 1.4.0?  If yes, use 1.4.1
>>
-- 
Best Regards
ZhouQianhao

Re: please help with ClassNotFoundException

Posted by Sea <26...@qq.com>.
Yes, I guess so. I have seen this bug before.




------------------ Original Message ------------------
From: "周千昊" <z....@gmail.com>
Sent: Thursday, August 13, 2015, 9:30 PM
To: "Sea" <26...@qq.com>; "dev@spark.apache.org" <de...@spark.apache.org>
Subject: Re: please help with ClassNotFoundException



Hi Sea,
    Is it the same issue as https://issues.apache.org/jira/browse/SPARK-8368 ?


Sea <26...@qq.com> wrote on Thursday, August 13, 2015 at 6:52 PM:

Are you using 1.4.0?  If yes, use 1.4.1





Re: please help with ClassNotFoundException

Posted by 周千昊 <z....@gmail.com>.
Hi Sea,
    Is it the same issue as https://issues.apache.org/jira/browse/SPARK-8368 ?

Sea <26...@qq.com> wrote on Thursday, August 13, 2015 at 6:52 PM:

> Are you using 1.4.0?  If yes, use 1.4.1
>
-- 
Best Regards
ZhouQianhao