Posted to issues@spark.apache.org by "Prashant Sharma (JIRA)" <ji...@apache.org> on 2018/08/09 07:41:00 UTC

[jira] [Updated] (SPARK-25065) Provide a way to add a custom logging configuration file.

     [ https://issues.apache.org/jira/browse/SPARK-25065?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Prashant Sharma updated SPARK-25065:
------------------------------------
    Description: 
Currently, when running in Kubernetes mode, Spark sets the necessary configuration properties by creating a spark.properties file and mounting a conf directory.

The shipped Dockerfile intentionally does not copy conf into the image, and that is well understood. However, a user may want to include a custom logging configuration file in the image's conf directory.

To achieve this, it is not enough to copy the file into the Spark conf directory of the resulting image, because that directory is reset when Kubernetes mounts the conf volume.
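
As an illustration, one workaround is to reference a logging configuration file baked into the image at a location outside the mounted conf directory. The sketch below is a minimal, hypothetical example (the path /opt/spark/log4j.properties and the use of extraJavaOptions are assumptions, not part of this issue), not the change being requested here:

    import org.apache.spark.SparkConf

    // Hypothetical workaround sketch: point the driver and executor JVMs at a
    // log4j file that lives outside the conf dir, since the conf dir itself is
    // replaced by the mounted conf volume. The path below is an assumption.
    val conf = new SparkConf()
      .set("spark.driver.extraJavaOptions",
        "-Dlog4j.configuration=file:///opt/spark/log4j.properties")
      .set("spark.executor.extraJavaOptions",
        "-Dlog4j.configuration=file:///opt/spark/log4j.properties")

This improvement asks for a supported way to add such a file so that this kind of indirection is not needed.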

 

To reproduce, add -Dlog4j.debug to 

  was:
Currently, when running in Kubernetes mode, Spark sets the necessary configuration properties by creating a spark.properties file and mounting a conf directory.

The shipped Dockerfile intentionally does not copy conf into the image, and that is well understood. However, a user may want to include a custom logging configuration file in the image's conf directory.

To achieve this, it is not enough to copy the file into the Spark conf directory of the resulting image, because that directory is reset when Kubernetes mounts the conf volume.


> Provide a way to add a custom logging configuration file.
> ---------------------------------------------------------
>
>                 Key: SPARK-25065
>                 URL: https://issues.apache.org/jira/browse/SPARK-25065
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.3.1
>            Reporter: Prashant Sharma
>            Priority: Major
>
> Currently, when running in Kubernetes mode, Spark sets the necessary configuration properties by creating a spark.properties file and mounting a conf directory.
> The shipped Dockerfile intentionally does not copy conf into the image, and that is well understood. However, a user may want to include a custom logging configuration file in the image's conf directory.
> To achieve this, it is not enough to copy the file into the Spark conf directory of the resulting image, because that directory is reset when Kubernetes mounts the conf volume.
>  
> To reproduce, add -Dlog4j.debug to 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
