Posted to issues@spark.apache.org by "Brent (Jira)" <ji...@apache.org> on 2022/04/14 23:54:00 UTC

[jira] [Commented] (SPARK-37814) Migrating from log4j 1 to log4j 2

    [ https://issues.apache.org/jira/browse/SPARK-37814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17522589#comment-17522589 ] 

Brent commented on SPARK-37814:
-------------------------------

[~kabhwan] [~dongjoon] I happened to notice your conversation about checking what Hadoop does with regard to maintenance versions, and I was looking at their GitHub and Jira a little while ago. They did indeed move to Reload4j for their 3.3.x, 3.2.x, and 2.10.x release lines (while I believe they're moving to Logback for 3.4.x and beyond).

For reference, here is the Jira:  https://issues.apache.org/jira/browse/HADOOP-18088

And here are the pull requests:
 * Hadoop 2.10.2: https://github.com/apache/hadoop/pull/4151
 * Hadoop 3.2.4: https://github.com/apache/hadoop/pull/4084
 * Hadoop 3.3.4: https://github.com/apache/hadoop/pull/4052

If you think this is a good path forward for the Spark project, I'd be happy to make a Jira or GitHub issue for it if no one has yet.
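
For context, the Hadoop change is essentially a build-level dependency swap, since Reload4j keeps the log4j 1.2.x package names. A rough sketch of what that could look like in a Maven POM (the zookeeper exclusion and the reload4j version below are only illustrative, not taken from the Hadoop PRs):

    <!-- Exclude log4j 1.2.x wherever it arrives transitively (zookeeper is just an example) -->
    <dependency>
      <groupId>org.apache.zookeeper</groupId>
      <artifactId>zookeeper</artifactId>
      <exclusions>
        <exclusion>
          <groupId>log4j</groupId>
          <artifactId>log4j</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

    <!-- Reload4j is a drop-in replacement for log4j 1.2.x, keeping the org.apache.log4j packages -->
    <dependency>
      <groupId>ch.qos.reload4j</groupId>
      <artifactId>reload4j</artifactId>
      <version>1.2.22</version> <!-- illustrative version -->
    </dependency>

Modules that use the SLF4J binding would presumably swap slf4j-log4j12 for slf4j-reload4j in the same way.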

> Migrating from log4j 1 to log4j 2
> ---------------------------------
>
>                 Key: SPARK-37814
>                 URL: https://issues.apache.org/jira/browse/SPARK-37814
>             Project: Spark
>          Issue Type: Umbrella
>          Components: Build
>    Affects Versions: 3.3.0
>            Reporter: L. C. Hsieh
>            Assignee: L. C. Hsieh
>            Priority: Major
>              Labels: releasenotes
>             Fix For: 3.3.0
>
>
> This is umbrella ticket for all tasks related to migrating to log4j2.


