Posted to dev@storm.apache.org by Ethanlm <gi...@git.apache.org> on 2017/11/30 17:09:55 UTC

[GitHub] storm pull request #2441: [STORM-2838] Replace log4j-over-slf4j with log4j-1...

GitHub user Ethanlm opened a pull request:

    https://github.com/apache/storm/pull/2441

    [STORM-2838] Replace log4j-over-slf4j with log4j-1.2-api

    https://issues.apache.org/jira/browse/STORM-2838
    
    To let Storm work with HDFS, we need to replace log4j-over-slf4j with log4j-1.2-api. Otherwise we get the following error:
    ```
    Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting StackOverflowError.
    ```
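    For reference, the swap amounts to roughly this pom.xml change (a sketch; the coordinates are the usual log4j 2 ones and the version property is illustrative). With this bridge in place, calls against the log4j 1.2 API are served by log4j 2 itself instead of being routed to SLF4J:

    ```xml
    <!-- Bridge calls made against the log4j 1.2 API onto log4j 2.
         Replaces the log4j-over-slf4j dependency; version is illustrative. -->
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-1.2-api</artifactId>
        <version>${log4j.version}</version>
    </dependency>
    ```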


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/Ethanlm/storm STORM-2838

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/storm/pull/2441.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2441
    
----
commit 40ff4cb96676a8b454a4874b546dfdab7d39f08d
Author: Ethan Li <et...@gmail.com>
Date:   2017-11-30T16:48:45Z

    [STORM-2838] Replace log4j-over-slf4j with log4j-1.2-api

----


---

[GitHub] storm issue #2441: [STORM-2838] Replace log4j-over-slf4j with log4j-1.2-api

Posted by revans2 <gi...@git.apache.org>.
Github user revans2 commented on the issue:

    https://github.com/apache/storm/pull/2441
  
    @srdo I agree that we don't want slf4j-log4j12 on the classpath wherever possible. The use case we have is adding Hadoop to our daemon classpath, with all of the configs and everything else needed to really make it work properly from an existing Hadoop install. The simplest way to make that happen is to call `hadoop classpath`, but with that there is no simple way to remove anything from the classpath, because wildcards are supported.
    
    If you really don't want to switch the jars we use, we can find an alternative approach, but it is not going to be nearly as simple a solution.


---

[GitHub] storm issue #2441: [STORM-2838] Replace log4j-over-slf4j with log4j-1.2-api

Posted by Ethanlm <gi...@git.apache.org>.
Github user Ethanlm commented on the issue:

    https://github.com/apache/storm/pull/2441
  
    @srdo  Thanks for the review.
    We decided to set the configs to point only to the existing Hadoop config files (no jars from the existing Hadoop install) to avoid this issue for now. Closing this until further issues appear.


---

[GitHub] storm issue #2441: [STORM-2838] Replace log4j-over-slf4j with log4j-1.2-api

Posted by Ethanlm <gi...@git.apache.org>.
Github user Ethanlm commented on the issue:

    https://github.com/apache/storm/pull/2441
  
    Thanks @revans2 for the reply


---

[GitHub] storm pull request #2441: [STORM-2838] Replace log4j-over-slf4j with log4j-1...

Posted by Ethanlm <gi...@git.apache.org>.
Github user Ethanlm closed the pull request at:

    https://github.com/apache/storm/pull/2441


---

[GitHub] storm issue #2441: [STORM-2838] Replace log4j-over-slf4j with log4j-1.2-api

Posted by srdo <gi...@git.apache.org>.
Github user srdo commented on the issue:

    https://github.com/apache/storm/pull/2441
  
    I think this is the wrong approach. We should instead exclude slf4j-log4j12 from whichever dependency is pulling it in. We want all logging to go through log4j 2 to our worker log; making this change would mean that all logging done to the log4j12 API is lost unless the user adds a log4j12 configuration file to Storm.
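    That exclusion would look roughly like this in a pom.xml (a sketch; hadoop-client is used as an illustrative offender, and any dependency pulling in slf4j-log4j12 would get the same treatment):

    ```xml
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
        <exclusions>
            <!-- Keep the log4j 1 SLF4J binding off the classpath so it cannot
                 clash with Storm's log4j 2 binding (log4j-slf4j-impl). -->
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    ```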


---

[GitHub] storm issue #2441: [STORM-2838] Replace log4j-over-slf4j with log4j-1.2-api

Posted by srdo <gi...@git.apache.org>.
Github user srdo commented on the issue:

    https://github.com/apache/storm/pull/2441
  
    Thanks for explaining @revans2. 
    
    Please correct me if I get any of this wrong, I'm not very familiar with Hadoop: 
    
    The issue is that you want the daemons to use HdfsBlobStore, which requires you to put storm-hdfs on the daemon classpath and additionally add the output of `hadoop classpath` to the daemon classpath in order to load the Hadoop jars and configuration files. For topology code you'd just exclude slf4j-log4j12 through the usual Maven exclusion mechanism, but for the daemons you can't easily remove slf4j-log4j12 because you'd have to filter `hadoop classpath` somehow.
    
    I don't like putting log4j-1.2-api on the classpath as a fix for this because we'll lose whatever is logged to the log4j 1 interface. There is also a second related issue that is not fixed by adding log4j-1.2-api; we'll have multiple SLF4J bindings on the daemon classpath. Since Hadoop adds slf4j-log4j12 (log4j 1) and Storm ships with log4j-slf4j-impl (log4j 2), we can't be sure which logging implementation SLF4J will bind to (and where the daemon log will go). Per the SLF4J documentation regarding multiple bindings: 
    
    > The warning emitted by SLF4J is just that, a warning. Even when multiple bindings are present, SLF4J will pick one logging framework/implementation and bind with it. The way SLF4J picks a binding is determined by the JVM and for all practical purposes should be considered random. As of version 1.6.6, SLF4J will name the framework/implementation class it is actually bound to.
    
    It seems someone else encountered this issue at https://issues.apache.org/jira/browse/HADOOP-13344. It might make sense to lobby a bit to get that feature added. If we're going to add log4j-1.2-api to the daemon classpath, I'd like it to be only temporary until the linked issue makes it in, and then log4j-1.2-api would probably need to be only on the daemon classpath and not in any of the external modules or on the worker classpath, to avoid interfering with topologies that use log4j-over-slf4j.
    
    An alternative temporary fix would be a script or program that expands the Hadoop classpath wildcards, finds slf4j-log4j12*, and removes that jar. That way we could provide the functionality of https://issues.apache.org/jira/browse/HADOOP-13344 ourselves until that feature makes it in.
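    A minimal sketch of that idea, assuming a POSIX shell; the function and variable names are hypothetical and not part of Storm or Hadoop:

    ```shell
    # Take a ':'-separated classpath (possibly containing wildcard entries,
    # as produced by `hadoop classpath`), expand the wildcards, and drop any
    # slf4j-log4j12 jar from the result.
    filter_classpath() {
        _out=""
        # The unquoted expansion below lets the shell glob-expand any
        # wildcard entries into concrete jar paths.
        for _jar in $(printf '%s' "$1" | tr ':' ' '); do
            case "$_jar" in
                *slf4j-log4j12*) ;;                  # skip the log4j 1 binding
                *) _out="${_out:+$_out:}$_jar" ;;
            esac
        done
        printf '%s\n' "$_out"
    }

    # Illustrative usage when building the daemon classpath:
    #   CLASSPATH="$STORM_CLASSPATH:$(filter_classpath "$(hadoop classpath)")"
    ```

    This keeps the rest of `hadoop classpath` intact, so the Hadoop jars and config directories still load as before.
    
    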


---