Posted to issues@spark.apache.org by "Josh Bacon (JIRA)" <ji...@apache.org> on 2017/03/13 16:26:41 UTC

[jira] [Commented] (SPARK-16784) Configurable log4j settings

    [ https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15907766#comment-15907766 ] 

Josh Bacon commented on SPARK-16784:
------------------------------------

From what I've seen, this limitation also applies to the Spark Standalone cluster manager.

There doesn't appear to be a way to supply custom log4j files to the driver and executor JVMs on a per-application/per-submission basis. Spark appears to provision configuration files (via the --files option) only after the driver/executor JVMs have already started. A workaround is to include the log4j files on your application's classpath by bundling them into the uber jar (e.g. under /src/main/resources/) and then appending the following spark-submit option:

--driver-java-options '-Dlog4j.configuration=jar:file:your-application-uber.jar!/your-custom-driver-log4j.properties -Dlog4j.debug'
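
For context, a complete submission using this workaround might look like the sketch below (the master URL, main class, and jar name are illustrative placeholders, not from my actual setup):

    spark-submit \
      --master spark://your-master-host:7077 \
      --deploy-mode cluster \
      --class com.example.YourApp \
      --driver-java-options '-Dlog4j.configuration=jar:file:your-application-uber.jar!/your-custom-driver-log4j.properties -Dlog4j.debug' \
      your-application-uber.jar

The jar:file: URL is resolved relative to the driver's working directory, which is why this only works once the jar has been provisioned there.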

Unfortunately, this does not appear to work for the executor log4j configuration, because the executor JVM appears to start before your-application-uber.jar is provisioned. In the driver's case, provisioning takes place before the driver JVM starts, so you can reference the uber jar by its path relative to the driver's working directory.
 
THIS DOESN'T WORK:
--conf 'spark.executor.extraJavaOptions=-Dlog4j.configuration=jar:file:your-application-uber.jar!/your-custom-executor-log4j.properties -Dlog4j.debug'
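
For reference, here's a minimal sketch of what one of the bundled properties files (e.g. your-custom-driver-log4j.properties under /src/main/resources/) could contain; the appender layout and logger levels are just illustrative, using standard log4j 1.2 syntax:

    # Send everything at INFO and above to stderr
    log4j.rootLogger=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
    # Example per-application override: quiet Spark's internals
    log4j.logger.org.apache.spark=WARN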

I'm not familiar with the internals, but if this warrants a new JIRA ticket, let me know and I'll create one with a proper description!
Thanks



> Configurable log4j settings
> ---------------------------
>
>                 Key: SPARK-16784
>                 URL: https://issues.apache.org/jira/browse/SPARK-16784
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 2.0.0, 2.1.0
>            Reporter: Michael Gummelt
>
> I often want to change the logging configuration for a single Spark job. This is easy in client mode: I just modify log4j.properties. It's difficult in cluster mode, because I need to modify the log4j.properties in the distribution in which the driver runs. I'd like a way of setting this dynamically, such as via a Java system property. Some brief searching showed that log4j doesn't seem to accept such a property, but I'd like to open this idea up for further comment. Maybe we can find a solution.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org