Posted to notifications@logging.apache.org by "Jitin Dominic (Jira)" <ji...@apache.org> on 2022/02/16 10:27:00 UTC
[jira] [Resolved] (LOG4J2-3403) java.lang.ClassNotFoundException: org.slf4j.bridge.SLF4JBridgeHandler
[ https://issues.apache.org/jira/browse/LOG4J2-3403?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jitin Dominic resolved LOG4J2-3403.
-----------------------------------
Resolution: Fixed
> java.lang.ClassNotFoundException: org.slf4j.bridge.SLF4JBridgeHandler
> ---------------------------------------------------------------------
>
> Key: LOG4J2-3403
> URL: https://issues.apache.org/jira/browse/LOG4J2-3403
> Project: Log4j 2
> Issue Type: Question
> Components: SLF4J Bridge
> Affects Versions: 2.17.1
> Reporter: Jitin Dominic
> Priority: Major
>
> I'm using the latest (v2.17.1) Log4j 2 dependencies in my Grails 2.5.4 application. The following dependencies are used:
> * log4j-api
> * log4j-core
> * log4j-1.2-api
> * log4j-slf4j-impl
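>
> For reference, in a Grails 2.x {{BuildConfig.groovy}} these would be declared roughly as follows (a sketch; the exact scope and version pinning are assumptions, only the artifact names are taken from the list above):
> {code:java}
> dependencies {
>     compile 'org.apache.logging.log4j:log4j-api:2.17.1'
>     compile 'org.apache.logging.log4j:log4j-core:2.17.1'
>     compile 'org.apache.logging.log4j:log4j-1.2-api:2.17.1'
>     compile 'org.apache.logging.log4j:log4j-slf4j-impl:2.17.1'
> }
> {code}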
>
> We also have spark dependencies in our application for generating parquet files:
> {code:java}
> compile("org.apache.spark:spark-core_2.12:3.1.2") {
>     exclude group: 'org.slf4j'
> }
> compile("org.apache.spark:spark-sql_2.12:3.1.2") {
>     exclude group: 'org.slf4j'
> }
> compile("org.apache.spark:spark-catalyst_2.12:3.1.2") {
>     exclude group: 'org.slf4j'
> }
> {code}
>
> When we try to generate a parquet file, we get the following exception:
> {code:java}
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.spark.sql.execution.datasources.DataSource.providingInstance(DataSource.scala:112)
> at org.apache.spark.sql.execution.datasources.DataSource.planForWriting(DataSource.scala:567)
> at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
> at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
> at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
> at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:874)
> at org.apache.spark.sql.DataFrameWriter$parquet.call(Unknown Source)
> .
> .
> .
> Caused by: java.lang.NoClassDefFoundError: org/slf4j/bridge/SLF4JBridgeHandler
> at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat.<init>(ParquetFileFormat.scala:63)
> ... 102 more
> Caused by: java.lang.ClassNotFoundException: org.slf4j.bridge.SLF4JBridgeHandler
> at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
> ... 103 more
> {code}
>
> This happens because we have excluded _org.slf4j_ from the _spark_ dependencies, as shown above.
> When we don't exclude _org.slf4j_ from the _spark_ dependencies, we keep getting warnings about multiple {_}SLF4J{_} bindings:
>
> {code:java}
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:<LOCAL_PATH>/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:<LOCAL_PATH>/lib/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] {code}
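>
> One workaround consistent with both symptoms would be to exclude only the conflicting binding rather than the whole _org.slf4j_ group, so that _org.slf4j:jul-to-slf4j_ (the artifact that provides {{SLF4JBridgeHandler}}) stays on the classpath. A sketch; the module name is taken from the multiple-bindings warning above:
> {code:java}
> compile("org.apache.spark:spark-core_2.12:3.1.2") {
>     // Exclude only slf4j-log4j12, the binding that clashes with
>     // log4j-slf4j-impl, instead of the whole org.slf4j group.
>     // jul-to-slf4j (which contains SLF4JBridgeHandler) is kept.
>     exclude group: 'org.slf4j', module: 'slf4j-log4j12'
> }
> {code}
> The same exclude would apply to the spark-sql and spark-catalyst dependencies.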
>
> Can someone confirm whether there's an issue with the configuration, or is this a bug?
>
>
--
This message was sent by Atlassian Jira
(v8.20.1#820001)