Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/07/02 10:15:04 UTC

[jira] [Resolved] (SPARK-8729) Spark app unable to instantiate the classes using the reflection

     [ https://issues.apache.org/jira/browse/SPARK-8729?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-8729.
------------------------------
    Resolution: Not A Problem

> Spark app unable to instantiate the classes using the reflection
> ----------------------------------------------------------------
>
>                 Key: SPARK-8729
>                 URL: https://issues.apache.org/jira/browse/SPARK-8729
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager
>    Affects Versions: 1.3.0
>            Reporter: Murthy Chelankuri
>            Priority: Critical
>
> Spark 1.3.0 is unable to instantiate classes via reflection (using Class.forName). It reports a class-not-found error even though the class is present in the submitted jars.
> The following is the exception I am getting from the executors:
> java.lang.ClassNotFoundException: com.abc.mq.msg.ObjectEncoder
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:264)
>     at kafka.utils.Utils$.createObject(Utils.scala:438)
>     at kafka.producer.Producer.<init>(Producer.scala:61)
> The application works fine, without any issues, on version 1.2.0.
> I am planning to upgrade to 1.3.0 and found that it does not work there.
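
A likely reason this was resolved as "Not A Problem": a plain Class.forName(name) call (which the stack trace shows kafka.utils.Utils.createObject making) resolves the class through the caller's defining classloader, whereas Spark executors load user jars in a separate classloader that is typically installed as the thread context classloader. The sketch below illustrates the two lookup paths; it is a minimal, self-contained demonstration using java.lang.String as a stand-in for the reporter's com.abc.mq.msg.ObjectEncoder, not Spark's actual executor code.

```java
// Sketch: Class.forName(name) vs. loading through the thread context
// classloader. In a Spark executor, user-jar classes may only be visible
// to the context classloader, so the one-argument Class.forName can throw
// ClassNotFoundException while the context-classloader lookup succeeds.
public class ContextClassLoaderDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in class name; the reporter's real class was
        // com.abc.mq.msg.ObjectEncoder, which we cannot load here.
        String name = "java.lang.String";

        // What a library's reflective factory typically does:
        Class<?> viaForName = Class.forName(name);

        // The classloader-aware alternative, which picks up classes
        // registered on the current thread's context classloader:
        Class<?> viaContext = Class.forName(
                name, true, Thread.currentThread().getContextClassLoader());

        // For a bootstrap class both paths resolve to the same Class object.
        System.out.println(viaForName == viaContext);
    }
}
```

In an executor, wrapping such reflective loads to use Thread.currentThread().getContextClassLoader() (or configuring the third-party library to do so) is the usual way to make user-jar classes resolvable.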



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org