Posted to user@spark.apache.org by Michael Chang <mi...@tellapart.com> on 2014/06/05 03:09:47 UTC

Using log4j.xml

Has anyone tried using a log4j.xml instead of a log4j.properties with
Spark 0.9.1?  I'm trying to run Spark Streaming on YARN, and I've set the
environment variable SPARK_LOG4J_CONF to a log4j.xml file instead of a
log4j.properties file, but Spark seems to fall back to its default
log4j.properties:
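
For reference, the configuration I'm pointing SPARK_LOG4J_CONF at is a
standard log4j 1.2 XML file along these lines (a minimal sketch; the
appender, pattern, and level here are just illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">

  <!-- Console appender; pattern mirrors Spark's default log format -->
  <appender name="console" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern"
             value="%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n"/>
    </layout>
  </appender>

  <!-- Root logger at INFO, writing to the console appender above -->
  <root>
    <priority value="INFO"/>
    <appender-ref ref="console"/>
  </root>

</log4j:configuration>
```

The file validates fine locally (log4j's DOMConfigurator picks up `.xml`
files), so the question is really whether Spark on YARN ships and applies
it the way it does a log4j.properties.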

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/mnt/var/hadoop/1/yarn/local/filecache/12/spark-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger
(org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
more info.
14/06/05 00:36:04 INFO ApplicationMaster: Using Spark's default log4j
profile: org/apache/spark/log4j-defaults.properties

Thanks,
Mike