Posted to issues@storm.apache.org by "Ethan Li (JIRA)" <ji...@apache.org> on 2018/04/05 18:58:00 UTC

[jira] [Closed] (STORM-2838) Replace log4j-over-slf4j with log4j-1.2-api

     [ https://issues.apache.org/jira/browse/STORM-2838?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ethan Li closed STORM-2838.
---------------------------
    Resolution: Won't Fix

> Replace log4j-over-slf4j with log4j-1.2-api
> -------------------------------------------
>
>                 Key: STORM-2838
>                 URL: https://issues.apache.org/jira/browse/STORM-2838
>             Project: Apache Storm
>          Issue Type: Improvement
>            Reporter: Ethan Li
>            Assignee: Ethan Li
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> I tried to set up HdfsBlobStore, and an exception showed up when I launched Nimbus.
> {code:java}
> Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting StackOverflowError.
> {code}
> Found an explanation here: https://www.slf4j.org/codes.html#log4jDelegationLoop
> This happens because Storm and Hadoop use different logging systems:
> {code:java}
> Storm:  log4j-over-slf4j --> slf4j --> log4j2, or slf4j --> log4j2
> Hadoop: slf4j --> log4j1.2, or log4j1.2
> (note: --> means redirecting)
> {code}
> When we add the Hadoop common lib classpath to Nimbus, log4j-over-slf4j.jar and slf4j-log4j12.jar end up on the classpath together.
> One way to let Storm work with Hadoop is to replace log4j-over-slf4j in Storm with log4j-1.2-api, as described below.
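> As a rough sketch of what that swap could look like in a Maven pom.xml (the module it lands in and the {{log4j.version}} property are assumptions, not the actual Storm build):
> {code:xml}
> <!-- Drop the bridge that redirects log4j 1.2 calls into slf4j -->
> <!--
> <dependency>
>     <groupId>org.slf4j</groupId>
>     <artifactId>log4j-over-slf4j</artifactId>
> </dependency>
> -->
> <!-- Add the Log4j 2 bridge that implements the log4j 1.2 API,
>      so Hadoop's direct log4j 1.2 calls go to log4j2 instead of
>      looping back through slf4j -->
> <dependency>
>     <groupId>org.apache.logging.log4j</groupId>
>     <artifactId>log4j-1.2-api</artifactId>
>     <version>${log4j.version}</version> <!-- hypothetical property -->
> </dependency>
> {code}
> With log4j-over-slf4j gone, slf4j-log4j12 from the Hadoop classpath no longer forms the delegation loop described at the SLF4J link above.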



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)