Posted to dev@spark.apache.org by "lior.c" <li...@taboola.com> on 2015/03/11 16:40:03 UTC

Using Log4j2 in spark executors

Hi,

I'd like to allow using Log4j 2 in executor code.
Since Spark currently depends on log4j 1.2, I would like to support building
Spark with Log4j 2 instead of log4j 1.2.
To accomplish that, I suggest creating a new profile for Log4j 2 in
spark-parent.
The default profile (log4j12) would include the log4j and slf4j-log4j12
dependencies with the default scope (I would remove those dependencies from
the sub-modules of spark-parent).
The log4j2 profile would instead include the same dependencies with scope
provided (so that the shade plugin does not pull those jars in as transitive
dependencies of other jars that depend on log4j 1.2), and would additionally
include the dependencies that Log4j 2 requires.
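To make the idea concrete, the two profiles in spark-parent's pom.xml might
look roughly like the sketch below. This is only an illustration of the
proposal, not the actual patch: the version numbers and the exact list of
Log4j 2 modules (log4j-core plus the log4j-slf4j-impl binding) are my
assumptions.

```xml
<!-- Sketch of the proposed profiles in spark-parent's pom.xml.
     Versions and the Log4j 2 module list are illustrative only. -->
<profiles>
  <!-- Default profile: classic log4j 1.2 via the SLF4J binding. -->
  <profile>
    <id>log4j12</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <dependencies>
      <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
      </dependency>
      <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.10</version>
      </dependency>
    </dependencies>
  </profile>
  <!-- Alternative profile: mark the log4j 1.2 artifacts as provided so
       the shade plugin leaves them out of the assembly, and add Log4j 2
       together with its SLF4J binding. -->
  <profile>
    <id>log4j2</id>
    <dependencies>
      <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
        <scope>provided</scope>
      </dependency>
      <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.10</version>
        <scope>provided</scope>
      </dependency>
      <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.2</version>
      </dependency>
      <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-slf4j-impl</artifactId>
        <version>2.2</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```

With something like this in place, a Log4j 2 build would presumably be
selected the usual Maven way, e.g. "mvn -Plog4j2 package".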

I have already tested this and it seems to work properly, and I would like
to offer it as a pull request.
What do you think of this solution?

Lior



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Using-Log4j2-in-spark-executors-tp11009.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
