Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2015/10/06 19:03:26 UTC

[jira] [Resolved] (SPARK-10944) org/slf4j/Logger is not provided in spark-1.5.1-bin-without-hadoop/lib/spark-assembly-1.5.1-hadoop2.2.0.jar

     [ https://issues.apache.org/jira/browse/SPARK-10944?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-10944.
------------------------------------
    Resolution: Not A Problem

Please read the documentation: https://spark.apache.org/docs/latest/hadoop-provided.html

(BTW I did not find a direct link to those docs, had to google them... might be something to look at.)

The Hadoop version in the jar name does not mean that Hadoop classes are included; it means that Spark was compiled against that specific version of Hadoop.
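In short, the "without hadoop" package is the Hadoop-free build: you are expected to point Spark at your own Hadoop jars (which normally also pull in slf4j) via SPARK_DIST_CLASSPATH. A minimal sketch of what that page describes, assuming a hadoop binary is installed (paths below are placeholders):

{code}
### in conf/spark-env.sh ###

# If the 'hadoop' binary is on your PATH
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

# With an explicit path to the 'hadoop' binary
export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)

# Passing a Hadoop configuration directory
export SPARK_DIST_CLASSPATH=$(hadoop --config /path/to/configs classpath)
{code}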

> org/slf4j/Logger is not provided in spark-1.5.1-bin-without-hadoop/lib/spark-assembly-1.5.1-hadoop2.2.0.jar
> -----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10944
>                 URL: https://issues.apache.org/jira/browse/SPARK-10944
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.5.1
>         Environment: Mac OS/Java 8/Spark 1.5.1 without hadoop
>            Reporter: Pranas Baliuka
>              Labels: easyfix, patch
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> Attempting to run a Spark cluster on a Mac OS machine fails.
> Invocation:
> {code}
> # cd $SPARK_HOME
> Imin:spark-1.5.1-bin-without-hadoop pranas$ ./sbin/start-master.sh
> {code}
> Output:
> {code}
> starting org.apache.spark.deploy.master.Master, logging to /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../logs/spark-pranas-org.apache.spark.deploy.master.Master-1-Imin.local.out
> failed to launch org.apache.spark.deploy.master.Master:
>   	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   	... 7 more
> full log in /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../logs/spark-pranas-org.apache.spark.deploy.master.Master-1-Imin.local.out
> {code}
> Log:
> {code}
> # Options read when launching programs locally with
> # ./bin/run-example or ./bin/spark-submit
> Spark Command: /Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/bin/java -cp /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../conf/:/Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/lib/spark-assembly-1.5.1-hadoop2.2.0.jar -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip Imin.local --port 7077 --webui-port 8080
> ========================================
> Error: A JNI error has occurred, please check your installation and try again
> Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
>         at java.lang.Class.getDeclaredMethods0(Native Method)
>         at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
>         at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
>         at java.lang.Class.getMethod0(Class.java:3018)
>         at java.lang.Class.getMethod(Class.java:1784)
>         at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
>         at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
> Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> {code}
> Proposed short-term fix:
> Bundle all required 3rd-party libs into the uber-jar and/or fix the start-up script to include the required 3rd-party libs.
> Long-term quality improvement proposal: introduce integration tests that check the distribution before releasing.
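One way to double-check this locally is to look for the slf4j classes in the assembly jar and then retry the master once SPARK_DIST_CLASSPATH is set (jar name taken from the report above):

{code}
# Prints nothing for the -without-hadoop build; a Hadoop-bundled
# assembly lists org/slf4j/Logger.class and friends.
jar tf lib/spark-assembly-1.5.1-hadoop2.2.0.jar | grep 'org/slf4j'

# After setting SPARK_DIST_CLASSPATH in conf/spark-env.sh,
# the master should start cleanly:
./sbin/start-master.sh
{code}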



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org