Posted to user@spark.apache.org by mlemay <ml...@gmail.com> on 2015/08/06 22:12:21 UTC

log4j custom appender ClassNotFoundException with spark 1.4.1

Hi, I'm trying to use a custom log4j appender declared in my log4j.properties.
It worked perfectly under Spark 1.3.1, but it is now broken under 1.4.1.

The appender is shaded/bundled in my fat-jar.

Note: I've seen that Spark 1.3.1 uses a different class loader.
See my Stack Overflow post:
http://stackoverflow.com/questions/31856532/spark-unable-to-load-custom-log4j-properties-from-fat-jar-resources
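For reference, here is roughly what the relevant part of my log4j.properties looks like (the appender name, class, and console settings match the log output below; any appender-specific options are omitted):

```properties
# Root logger: console plus the custom appender that fails to load
log4j.rootLogger=WARN, console, redis

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Custom appender class, shaded into the application fat-jar
log4j.appender.redis=com.ryantenney.log4j.FailoverRedisAppender
# (appender-specific options omitted)
```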


log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader@4e25154f.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@4e25154f class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader@4e25154f.
log4j: Using URL [file:/C:/Apps/Spark/spark-1.4.1-bin-hadoop2.6/conf/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/C:/Apps/Spark/spark-1.4.1-bin-hadoop2.6/conf/log4j.properties
log4j: Parsing for [root] with value=[WARN, console, redis].
log4j: Level token is [WARN].
log4j: Category root set to WARN
log4j: Parsing appender named "console".
log4j: Parsing layout options for "console".
log4j: Setting property [conversionPattern] to [%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n].
log4j: End of parsing for "console".
log4j: Setting property [target] to [System.err].
log4j: Parsed "console" options.
log4j: Parsing appender named "redis".
log4j:ERROR Could not instantiate class [com.ryantenney.log4j.FailoverRedisAppender].
java.lang.ClassNotFoundException: com.ryantenney.log4j.FailoverRedisAppender
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
        at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
        at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
        at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:785)
        at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
        at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
        at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
        at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
        at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:275)
        at org.apache.hadoop.util.ShutdownHookManager.<clinit>(ShutdownHookManager.java:44)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$18.apply(Utils.scala:2262)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$18.apply(Utils.scala:2262)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.spark.util.SparkShutdownHookManager.install(Utils.scala:2262)
        at org.apache.spark.util.Utils$.<init>(Utils.scala:88)
        at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
        at org.apache.spark.deploy.SparkSubmitArguments.handleUnknown(SparkSubmitArguments.scala:432)
        at org.apache.spark.launcher.SparkSubmitOptionParser.parse(SparkSubmitOptionParser.java:174)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:91)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:107)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
log4j:ERROR Could not instantiate appender named "redis".





--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/log4j-custom-appender-ClassNotFoundException-with-spark-1-4-1-tp24159.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: log4j custom appender ClassNotFoundException with spark 1.4.1

Posted by mlemay <ml...@gmail.com>.
The offending commit is:

[SPARK-6014] [core] Revamp Spark shutdown hooks, fix shutdown races.
https://github.com/apache/spark/commit/e72c16e30d85cdc394d318b5551698885cfda9b8







Re: log4j custom appender ClassNotFoundException with spark 1.4.1

Posted by mlemay <ml...@gmail.com>.
One possible workaround is to spark-submit with --driver-class-path and list
all transitive dependencies.  This is fragile and error-prone.
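
For example (jar names and paths here are placeholders, not the appender's actual dependency list; on Windows the separator is `;` instead of `:`):

```shell
# Fragile workaround: put the appender jar and every one of its
# transitive dependencies on the driver class path by hand
spark-submit \
  --driver-class-path "log4j-redis-appender.jar:dep1.jar:dep2.jar" \
  --class com.example.Main \
  my-fat-jar.jar
```

Every time a dependency of the appender changes, this list has to be updated by hand, which is why it is fragile.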

Non-working alternatives (all of these are processed in SparkSubmit.scala
AFTER the argument parser is initialized):

spark-submit --packages ...
spark-submit --jars ...
spark-defaults.conf (spark.driver.extraJavaOptions, spark.jars,
spark.driver.extraClassPath, ...)


Re: log4j custom appender ClassNotFoundException with spark 1.4.1

Posted by mlemay <ml...@gmail.com>.
That starts to smell...

When analyzing SparkSubmit.scala, we can see that one of the first things it
does is parse arguments. This uses the Utils object and triggers
initialization of its member variables.  One such variable is
ShutdownHookManager (which didn't exist in Spark 1.3), and its initialization
in turn triggers the log4j initialization.

setContextClassLoader is called only a few steps after argument parsing, in
submit > doRunMain > runMain.

That pretty much sums it up:
spark.util.Utils has a new static dependency on log4j that triggers its
initialization before the call to
setContextClassLoader(MutableURLClassLoader)
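
The ordering problem can be illustrated with a toy model (class and method names below are hypothetical, not Spark's or log4j's actual code): the "log manager" captures the context class loader in its static initializer, so if it is first touched before the fat-jar loader is installed, the fat-jar loader is never seen by it.

```java
import java.net.URL;
import java.net.URLClassLoader;

public class InitOrderDemo {
    // Like org.apache.log4j.LogManager, appender resolution happens during
    // static initialization: whatever context class loader is installed at
    // first touch is the one used to look up appender classes.
    static class FakeLogManager {
        static final ClassLoader CAPTURED =
            Thread.currentThread().getContextClassLoader();
    }

    // Returns true only if the "log manager" saw the fat-jar loader.
    static boolean log4jSawFatJarLoader() {
        // 1. Argument parsing touches Utils, whose static init touches the
        //    log manager, fixing the class loader it will use:
        ClassLoader seenByLog4j = FakeLogManager.CAPTURED;
        // 2. Only afterwards does runMain install the fat-jar class loader
        //    (MutableURLClassLoader in Spark) on the thread context:
        ClassLoader fatJarLoader =
            new URLClassLoader(new URL[0], InitOrderDemo.class.getClassLoader());
        Thread.currentThread().setContextClassLoader(fatJarLoader);
        return seenByLog4j == fatJarLoader;
    }

    public static void main(String[] args) {
        System.out.println(log4jSawFatJarLoader()); // prints false: too late
    }
}
```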

Anyone has a workaround to make this work in 1.4.1? 






Re: log4j custom appender ClassNotFoundException with spark 1.4.1

Posted by mlemay <ml...@gmail.com>.
Looking at the callstack and the diffs between 1.3.1 and 1.4.1-rc4, I see
something that could be relevant to the issue.

1) The callstack shows that the log4j manager gets initialized using the
default Java context class loader. This context class loader should probably
be Spark's MutableURLClassLoader, but it's not.  We can assume that
currentThread.setContextClassLoader has not been called yet.

2) Still in the callstack, we can see that ShutdownHookManager is the class
whose initialization triggers the log4j initialization.

3) Looking at the diffs between 1.3 and 1.4, we can see that
ShutdownHookManager is a new class.

With this information, is it possible that ShutdownHookManager makes log4j
initialize too early?  By that, I mean before Spark gets the chance to set
its MutableURLClassLoader on the thread context?

Let me know if this does not make sense.

Mike



