Posted to issues@spark.apache.org by "Michael McCarthy (JIRA)" <ji...@apache.org> on 2016/09/23 21:49:20 UTC

[jira] [Comment Edited] (SPARK-10713) SPARK_DIST_CLASSPATH ignored on Mesos executors

    [ https://issues.apache.org/jira/browse/SPARK-10713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15517638#comment-15517638 ] 

Michael McCarthy edited comment on SPARK-10713 at 9/23/16 9:48 PM:
-------------------------------------------------------------------

I'm also seeing this issue. Executors get lost with the following error:

{noformat}
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more
{noformat}

{{SPARK_DIST_CLASSPATH}} contains the Hadoop libraries, which bundle this slf4j dependency, yet the executors clearly aren't seeing it. I'm using spark-1.6.1-bin-without-hadoop and Mesos 0.25.
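
For context, {{SPARK_DIST_CLASSPATH}} is populated the way the "Hadoop free" build docs (https://spark.apache.org/docs/latest/hadoop-provided.html) suggest; the snippet below is a paraphrase of that setup, not my exact config:

{noformat}
# conf/spark-env.sh -- derive the classpath from the local Hadoop install.
# If the 'hadoop' binary is on PATH:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
# Or with an explicit path to the 'hadoop' binary (path is a placeholder):
export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)
{noformat}

On the driver this works fine; the classpath just never makes it to the Mesos executors.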

Does anyone know a workaround?
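
The only idea I have so far is to bypass the environment variable and hand the executors the classpath directly via {{spark.executor.extraClassPath}}. This is untested, and it assumes Hadoop is installed at the same location on every Mesos agent as on the submit host:

{noformat}
# Untested sketch: pass the Hadoop classpath to executors explicitly,
# since SPARK_DIST_CLASSPATH is ignored there. The $(...) expansion
# happens on the submit host, so the same jar paths must exist on
# every Mesos agent.
spark-submit \
  --master mesos://... \
  --conf spark.executor.extraClassPath="$(hadoop classpath)" \
  ...
{noformat}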


> SPARK_DIST_CLASSPATH ignored on Mesos executors
> -----------------------------------------------
>
>                 Key: SPARK-10713
>                 URL: https://issues.apache.org/jira/browse/SPARK-10713
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, Mesos
>    Affects Versions: 1.5.0
>            Reporter: Dara Adib
>            Priority: Minor
>
> If I set the environment variable SPARK_DIST_CLASSPATH, the jars are included on the driver, but not on Mesos executors. Docs: https://spark.apache.org/docs/latest/hadoop-provided.html
> I see SPARK_DIST_CLASSPATH mentioned in these files:
> launcher/src/main/java/org/apache/spark/launcher/AbstractCommandBuilder.java
> project/SparkBuild.scala
> yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
> But not the Mesos executor (or should it be included by the launcher library?):
> spark/core/src/main/scala/org/apache/spark/executor/Executor.scala


