Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/08/18 14:59:45 UTC
[jira] [Updated] (SPARK-10057) Fail to load class
org.slf4j.impl.StaticLoggerBinder
[ https://issues.apache.org/jira/browse/SPARK-10057?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-10057:
------------------------------
Description:
Some log messages are dropped because the class "org.slf4j.impl.StaticLoggerBinder" cannot be loaded:
{code}
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
{code}
Component/s: Spark Core
I can't reproduce this in an app or in the shell. It's a classpath problem. How do you make this occur?
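Since the NOP fallback means SLF4J could not find any binding on the runtime classpath, a quick way to debug this is to probe for org.slf4j.impl.StaticLoggerBinder directly. Below is a minimal sketch of such a check; the class name Slf4jBindingCheck and the helper isClassPresent are hypothetical and not part of Spark or SLF4J:

```java
// Hypothetical classpath diagnostic: SLF4J 1.x locates its logging backend
// by loading org.slf4j.impl.StaticLoggerBinder, so probing for that class
// shows whether a binding jar (e.g. slf4j-log4j12) is on the classpath.
public class Slf4jBindingCheck {
    /** Returns true if the named class can be loaded from the current classpath. */
    static boolean isClassPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        if (isClassPresent("org.slf4j.impl.StaticLoggerBinder")) {
            System.out.println("An SLF4J binding is on the classpath.");
        } else {
            System.out.println("No SLF4J binding found; SLF4J falls back to the NOP logger.");
        }
    }
}
```

Running this with the same classpath as the failing application (for example via spark-submit's driver classpath) should show whether the binding jar is actually visible at runtime.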
> Fail to load class org.slf4j.impl.StaticLoggerBinder
> ----------------------------------------------------
>
> Key: SPARK-10057
> URL: https://issues.apache.org/jira/browse/SPARK-10057
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.5.0
> Reporter: Davies Liu
>
> Some log messages are dropped because the class "org.slf4j.impl.StaticLoggerBinder" cannot be loaded:
> {code}
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org