Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2022/10/12 17:56:00 UTC

[jira] [Resolved] (SPARK-40489) Spark 3.3.0 breaks with SLF4J 2.x

     [ https://issues.apache.org/jira/browse/SPARK-40489?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen resolved SPARK-40489.
----------------------------------
    Resolution: Duplicate

Spark is on SLF4J 2.x now, so I think SPARK-40511 supersedes this.

> Spark 3.3.0 breaks with SLF4J 2.x
> ---------------------------------
>
>                 Key: SPARK-40489
>                 URL: https://issues.apache.org/jira/browse/SPARK-40489
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.3.0
>            Reporter: Garret Wilson
>            Priority: Major
>
> Spark breaks fundamentally with SLF4J 2.x because its logging initialization uses {{StaticLoggerBinder}}.
> SLF4J is the logging facade that is meant to shield the application from the implementation, whether that is Log4J, Logback, or something else. Historically SLF4J 1.x used a fragile approach to binding: a {{StaticLoggerBinder}} (a global static singleton supplied by the implementation) rather than the standard Java {{ServiceLoader}} mechanism.
> SLF4J 2.x, which has been in development for years, has now been released, and it finally switches to the {{ServiceLoader}} mechanism. As [described in the FAQ|https://www.slf4j.org/faq.html#changesInVersion200], the client API remains compatible; an application just needs a current Log4J/Logback implementation that registers itself as a service provider.
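> For illustration only (this is not Spark code, just a sketch of the mechanism), an application can list the SLF4J 2.x providers that {{ServiceLoader}} discovers on the classpath:
> {code}
> import java.util.ServiceLoader
> import org.slf4j.spi.SLF4JServiceProvider
> import scala.jdk.CollectionConverters._
>
> // SLF4J 2.x locates its backend through the standard ServiceLoader
> // mechanism instead of the hard-coded org.slf4j.impl.StaticLoggerBinder.
> val providers = ServiceLoader.load(classOf[SLF4JServiceProvider]).asScala.toList
> providers.foreach(p => println(p.getClass.getName))
> {code}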
> *Above all the application must _not_ use the low-level {{StaticLoggerBinder}} method, because it has been removed!*
> Unfortunately [{{org.apache.spark.internal.Logging}}|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/internal/Logging.scala] uses {{StaticLoggerBinder}} and breaks any environment using SLF4J 2.x. For example, in my application I have pulled in the SLF4J 2.x API and the Logback 1.4.x libraries (I'm not even using Log4J), and Spark fails just trying to get a Spark session:
> {noformat}
> Caused by: java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
>         at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j2(Logging.scala:232)
>         at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:129)
>         at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:115)
>         at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:109)
>         at org.apache.spark.SparkContext.initializeLogIfNecessary(SparkContext.scala:84)
>         at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:106)
>         at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
>         at org.apache.spark.SparkContext.initializeLogIfNecessary(SparkContext.scala:84)
>         at org.apache.spark.internal.Logging.log(Logging.scala:53)
>         at org.apache.spark.internal.Logging.log$(Logging.scala:51)
>         at org.apache.spark.SparkContext.log(SparkContext.scala:84)
>         at org.apache.spark.internal.Logging.logInfo(Logging.scala:61)
>         at org.apache.spark.internal.Logging.logInfo$(Logging.scala:60)
>         at org.apache.spark.SparkContext.logInfo(SparkContext.scala:84)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:195)
>         at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
>         at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
>         at scala.Option.getOrElse(Option.scala:201)
>         at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
> {noformat}
> This is because Spark plays low-level tricks to find out whether the bound logging implementation is Log4J, and relies on {{StaticLoggerBinder}} to do it:
> {code}
>   private def isLog4j2(): Boolean = {
>     // This distinguishes the log4j 1.2 binding, currently
>     // org.slf4j.impl.Log4jLoggerFactory, from the log4j 2.0 binding, currently
>     // org.apache.logging.slf4j.Log4jLoggerFactory
>     val binderClass = StaticLoggerBinder.getSingleton.getLoggerFactoryClassStr
>     "org.apache.logging.slf4j.Log4jLoggerFactory".equals(binderClass)
>   }
> {code}
> Whatever the wisdom of Spark relying on Log4J-specific functionality, Spark should not be using {{StaticLoggerBinder}} to do that detection. There are other approaches. (The code itself suggests one: {{LogManager.getRootLogger.asInstanceOf[Log4jLogger]}}; you could check whether the root logger actually is a {{Log4jLogger}}. There may be even better options.)
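> For illustration, here is a sketch (untested, my own wording) of a detection that works on both SLF4J 1.x and 2.x by asking the SLF4J API which factory is bound, instead of touching {{StaticLoggerBinder}}:
> {code}
>   // Sketch: identify the Log4J 2 binding from the bound ILoggerFactory's
>   // class name; LoggerFactory.getILoggerFactory exists in SLF4J 1.x and 2.x.
>   private def isLog4j2(): Boolean = {
>     val factoryClass = org.slf4j.LoggerFactory.getILoggerFactory.getClass.getName
>     factoryClass == "org.apache.logging.slf4j.Log4jLoggerFactory"
>   }
> {code}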
> The other big problem is relying on the Log4J classes themselves. By relying on those classes, you force me to bring in Log4J as a dependency, whose latest versions register themselves with the service loader mechanism, producing conflicting SLF4J providers.
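> To illustrate what users are forced to do today (a hypothetical sbt fragment; the artifact names are my assumption about Spark's transitive dependencies), the workaround is to exclude Spark's Log4J provider by hand:
> {code}
> // Hypothetical sbt fragment: exclude the transitive Log4J-to-SLF4J provider
> // so that only one SLF4J backend (here Logback) remains on the classpath.
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0" excludeAll (
>   ExclusionRule("org.apache.logging.log4j", "log4j-slf4j-impl")
> )
> {code}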
> It is paramount that you:
> * Remove all reliance on {{StaticLoggerBinder}}. If you absolutely must use it, please check for it using reflection (see the sketch after this list)!
> * Remove all static references to the Log4J classes. (In an ideal world you wouldn't even be doing Log4J-specific things anyway.) If you absolutely must do Log4J-specific things, access the classes via reflection; don't link them statically in the code.
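> A minimal sketch of such a reflection check (my wording, not patch code):
> {code}
>   // Probe for StaticLoggerBinder without a compile-time reference, so its
>   // absence on an SLF4J 2.x classpath cannot cause NoClassDefFoundError.
>   private def staticLoggerBinderPresent: Boolean =
>     try {
>       Class.forName("org.slf4j.impl.StaticLoggerBinder")
>       true
>     } catch {
>       case _: ClassNotFoundException => false
>     }
> {code}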
> The current situation completely (and unnecessarily) breaks the use of SLF4J 2.x.


