Posted to issues@spark.apache.org by "Florent Pellerin (JIRA)" <ji...@apache.org> on 2014/11/12 19:47:36 UTC

[jira] [Created] (SPARK-4371) Spark crashes with JBoss Logging 3.6.1

Florent Pellerin created SPARK-4371:
---------------------------------------

             Summary: Spark crashes with JBoss Logging 3.6.1
                 Key: SPARK-4371
                 URL: https://issues.apache.org/jira/browse/SPARK-4371
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.1.0
            Reporter: Florent Pellerin


When using JBoss Logging, which itself pulls in slf4j 1.6.1, Spark crashes: SLF4JBridgeHandler.removeHandlersForRootLogger() was only added in slf4j 1.6.5, but spark/Logging.scala calls it reflectively at line 147:

bridgeClass.getMethod("removeHandlersForRootLogger").invoke(null)

Because the method does not exist in slf4j 1.6.1, the reflective lookup fails and the static initializer of Logging aborts:
java.lang.ExceptionInInitializerError: null
        at java.lang.Class.getMethod(Class.java:1670)
        at org.apache.spark.Logging$.<init>(Logging.scala:147)
        at org.apache.spark.Logging$.<clinit>(Logging.scala)
        at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:104)
        at org.apache.spark.Logging$class.log(Logging.scala:51)
        at org.apache.spark.SecurityManager.log(SecurityManager.scala:143)
        at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
        at org.apache.spark.SecurityManager.logInfo(SecurityManager.scala:143)
        at org.apache.spark.SecurityManager.setViewAcls(SecurityManager.scala:208)
        at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:167)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:151)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)

I suggest Spark should at least silently swallow the exception instead of failing to initialize, since the handler cleanup is not essential when the method is unavailable.
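
A minimal sketch of what swallowing the exception could look like (only the getMethod line above is quoted from Logging.scala; the surrounding Class.forName lookup and catch clauses are assumptions for illustration, not the actual Spark code):

    // Sketch: guard the reflective call so slf4j versions older than 1.6.5,
    // which lack removeHandlersForRootLogger, do not abort class initialization.
    try {
      val bridgeClass = Class.forName("org.slf4j.bridge.SLF4JBridgeHandler")
      // On slf4j 1.6.1 getMethod throws NoSuchMethodException, which the
      // static initializer currently surfaces as ExceptionInInitializerError.
      bridgeClass.getMethod("removeHandlersForRootLogger").invoke(null)
    } catch {
      case _: ClassNotFoundException => // bridge not on the classpath: nothing to clean up
      case _: NoSuchMethodException  => // method absent in this slf4j version: skip the cleanup
    }

With a guard like this, users stuck on an older slf4j would simply keep the default JUL root handlers instead of seeing SparkContext construction fail.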


