Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/05/01 16:52:02 UTC

[jira] [Commented] (SPARK-25065) Driver and executors pick the wrong logging configuration file.

    [ https://issues.apache.org/jira/browse/SPARK-25065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17097501#comment-17097501 ] 

Apache Spark commented on SPARK-25065:
--------------------------------------

User 'ScrapCodes' has created a pull request for this issue:
https://github.com/apache/spark/pull/27735

> Driver and executors pick the wrong logging configuration file.
> ---------------------------------------------------------------
>
>                 Key: SPARK-25065
>                 URL: https://issues.apache.org/jira/browse/SPARK-25065
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.4.4, 3.0.0
>            Reporter: Prashant Sharma
>            Priority: Major
>
> Currently, when running in Kubernetes mode, Spark sets the necessary configuration properties by creating a spark.properties file and mounting a conf directory into the container.
> The shipped Dockerfile deliberately does not copy conf into the image, and that is well understood. However, a user may want to place a custom logging configuration file in the image's conf directory.
> Copying it into Spark's conf directory of the resulting image is not enough, because that directory is replaced when Kubernetes mounts the conf volume.
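> A common workaround (a sketch, not part of this report; the path below is illustrative) is to bake the file into a location outside the mounted conf directory, so the conf-volume mount cannot shadow it, and point log4j at it explicitly:
> {code}
> # Dockerfile fragment: copy the custom log4j config OUTSIDE $SPARK_HOME/conf
> COPY log4j.properties /opt/spark/custom/log4j.properties
> {code}
> and then reference it via the log4j 1.x system property:
> {code}
> --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/opt/spark/custom/log4j.properties
> --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/opt/spark/custom/log4j.properties
> {code}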
>  
> To reproduce, add {code}-Dlog4j.debug{code} to {code:java}spark.(executor|driver).extraJavaOptions{code}. With debug output enabled, it became clear that the provided log4j configuration file is not picked up; instead, the driver process loads the one bundled in the kubernetes client jar.
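> A reproduction sketch (assuming a standard spark-submit against a Kubernetes master; the API server URL and image name are placeholders):
> {code}
> spark-submit \
>   --master k8s://https://<k8s-apiserver>:6443 \
>   --deploy-mode cluster \
>   --conf spark.kubernetes.container.image=<spark-image> \
>   --conf spark.driver.extraJavaOptions=-Dlog4j.debug \
>   --conf spark.executor.extraJavaOptions=-Dlog4j.debug \
>   ...
> {code}
> With {code}-Dlog4j.debug{code} set, log4j prints which configuration file it actually loaded, making the mis-pick visible in the driver log.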
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org