Posted to notifications@logging.apache.org by "Jitin Dominic (Jira)" <ji...@apache.org> on 2022/02/15 12:15:00 UTC

[jira] [Updated] (LOG4J2-3403) java.lang.ClassNotFoundException: org.slf4j.bridge.SLF4JBridgeHandler

     [ https://issues.apache.org/jira/browse/LOG4J2-3403?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jitin Dominic updated LOG4J2-3403:
----------------------------------
    Description: 
I'm using the latest (v2.17.1) Log4j 2 dependencies in my Grails 2.5.4 application. The following dependencies are used:
 * log4j-api
 * log4j-core
 * log4j-1.2-api
 * log4j-slf4j-impl

 

We also have Spark dependencies in our application for generating Parquet files:
{code:java}
compile ("org.apache.spark:spark-core_2.12:3.1.2") {
    exclude group: 'org.slf4j'
}

compile ("org.apache.spark:spark-sql_2.12:3.1.2") {
    exclude group: 'org.slf4j'
}

compile ("org.apache.spark:spark-catalyst_2.12:3.1.2") {
    exclude group: 'org.slf4j'
}
{code}
 

When we try to generate a Parquet file, we get the following exception:
{code:java}
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.execution.datasources.DataSource.providingInstance(DataSource.scala:112)
    at org.apache.spark.sql.execution.datasources.DataSource.planForWriting(DataSource.scala:567)
    at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
    at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
    at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:874)
    at org.apache.spark.sql.DataFrameWriter$parquet.call(Unknown Source)
.
.
.
Caused by: java.lang.NoClassDefFoundError: org/slf4j/bridge/SLF4JBridgeHandler
    at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat.<init>(ParquetFileFormat.scala:63)
    ... 102 more
Caused by: java.lang.ClassNotFoundException: org.slf4j.bridge.SLF4JBridgeHandler
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ... 103 more
 {code}
 

This is happening because we have excluded _org.slf4j_ from the Spark dependencies, as shown above.

When we include {_}org.slf4j{_} instead, we keep getting warnings about multiple {_}slf4j{_} bindings on the classpath.
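
If the build really uses Gradle-style exclusions (as the {{exclude group:}} syntax above suggests), one possible workaround, sketched here on the assumption that the missing class ships in the {{jul-to-slf4j}} artifact, would be to exclude only the conflicting binding instead of the whole group:

{code:groovy}
// Sketch only: exclude just the Log4j 1.x binding that clashes with
// log4j-slf4j-impl, instead of the entire org.slf4j group. slf4j-api and
// jul-to-slf4j (which contains org.slf4j.bridge.SLF4JBridgeHandler)
// then remain on the classpath.
compile ("org.apache.spark:spark-core_2.12:3.1.2") {
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
}

// Alternatively, keep the broad exclusion and re-add the bridge explicitly
// (the version here is a guess; it should match the slf4j-api on the classpath):
compile 'org.slf4j:jul-to-slf4j:1.7.30'
{code}

The same narrowed exclusion would apply to the spark-sql and spark-catalyst entries.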

 

Can someone confirm whether there's an issue with our configuration, or whether this is a bug?

 

 


> java.lang.ClassNotFoundException: org.slf4j.bridge.SLF4JBridgeHandler
> ---------------------------------------------------------------------
>
>                 Key: LOG4J2-3403
>                 URL: https://issues.apache.org/jira/browse/LOG4J2-3403
>             Project: Log4j 2
>          Issue Type: Question
>            Reporter: Jitin Dominic
>            Priority: Major



--
This message was sent by Atlassian Jira
(v8.20.1#820001)