Posted to issues@spark.apache.org by "Donghai Yu (JIRA)" <ji...@apache.org> on 2016/06/20 10:44:05 UTC

[jira] [Commented] (SPARK-16065) "java.lang.ClassNotFoundException" thrown when running spark-submit

    [ https://issues.apache.org/jira/browse/SPARK-16065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15339319#comment-15339319 ] 

Donghai Yu commented on SPARK-16065:
------------------------------------

Yes, the classes are in that jar, and instantiating the class directly with new EagerRecord() succeeds.
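
The stack trace below shows the lookup failing inside sun.misc.Launcher$AppClassLoader, while jars passed through "--jars" are normally only visible to the classloader that spark-submit installs as the thread context classloader. That would explain why constructing the class directly works but ObjectInputStream.readObject does not. A minimal check sketch, assuming the fully qualified name com.EagerRecord from the stack trace; both calls are hypothetical diagnostics, not part of the original test program:

    // Loader that spark-submit typically sets for user code (expected to see --jars).
    val contextLoader = Thread.currentThread().getContextClassLoader
    // System/application loader, the one appearing in the failing frames below.
    val appLoader = ClassLoader.getSystemClassLoader
    // Expected to succeed if record.jar was distributed via --jars.
    println(Class.forName("com.EagerRecord", false, contextLoader))
    // Expected to fail with ClassNotFoundException, matching the error below.
    println(Class.forName("com.EagerRecord", false, appLoader))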

> "java.lang.ClassNotFoundException" thrown when running spark-submit
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-16065
>                 URL: https://issues.apache.org/jira/browse/SPARK-16065
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.6.0
>         Environment: Spark 1.6
> jdk1.7.0_80
>            Reporter: Donghai Yu
>
> The case works when we do not run it through spark-submit. It fails when we use spark-submit and pass the jar that contains the Record class via "--jars".
> As a simple test, I wrote the small program below in main. The object is written to the file successfully, but the exception is thrown on readObject.
>     import java.io._
>
>     try {
>       var is: ObjectInputStream = null
>       try {
>         // Build a one-element list of EagerRecord (the class shipped in record.jar).
>         var recordList: List[EagerRecord] = Nil
>         val record = new EagerRecord()
>         recordList = record :: recordList
>         // Write the record set to a file; this part succeeds.
>         val out: ObjectOutputStream = new ObjectOutputStream(new BufferedOutputStream(new FileOutputStream("//home/hadoop/1")))
>         out.writeObject(new SerializableRecordSet(recordList, ""))
>         out.close()
>         // Read it back; readObject throws the ClassNotFoundException shown below.
>         is = new ObjectInputStream(new BufferedInputStream(new FileInputStream("//home/hadoop/1")))
>         is.readObject()
>       } finally {
>         if (is != null) is.close()
>       }
>     } catch {
>       case ex: Exception => {
>         println(ex)
>         throw ex
>       }
>     } finally {
>       //...
>     }
> // Error message:
> $spark-submit --master yarn-client --num-executors 5 --executor-memory 6G --class test --jars record.jar test.jar
> java.lang.ClassNotFoundException: com.EagerRecord
> Exception in thread "main" java.lang.ClassNotFoundException: com.EagerRecord
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:278)
> 	at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:625)
> 	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
> 	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> 	at test$.main(test.scala:22)
> 	at test.main(test.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
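
Since the failing frames above run in sun.misc.Launcher$AppClassLoader, a workaround that is often applied to this kind of deserialization failure is to resolve classes against the thread context classloader, which is the loader spark-submit normally populates with the "--jars" entries. The sketch below assumes that loader does see record.jar; the class name ContextObjectInputStream is made up for illustration, and this is not a fix confirmed for this ticket:

    import java.io.{InputStream, ObjectInputStream, ObjectStreamClass}

    // Resolve deserialized classes against the thread context classloader,
    // falling back to the default resolution if that loader cannot find them.
    class ContextObjectInputStream(in: InputStream) extends ObjectInputStream(in) {
      override protected def resolveClass(desc: ObjectStreamClass): Class[_] = {
        try {
          Class.forName(desc.getName, false, Thread.currentThread().getContextClassLoader)
        } catch {
          case _: ClassNotFoundException => super.resolveClass(desc)
        }
      }
    }

Replacing the plain new ObjectInputStream(...) in the test program with this class should let readObject find com.EagerRecord, provided record.jar really is on the context classloader.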


