Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/01/24 20:02:00 UTC

[jira] [Assigned] (SPARK-30626) [K8S] Spark driver pod doesn't have SPARK_APPLICATION_ID env

     [ https://issues.apache.org/jira/browse/SPARK-30626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-30626:
-------------------------------------

    Assignee: Jiaxin Shan

> [K8S] Spark driver pod doesn't have SPARK_APPLICATION_ID env
> ------------------------------------------------------------
>
>                 Key: SPARK-30626
>                 URL: https://issues.apache.org/jira/browse/SPARK-30626
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.4.4, 3.0.0
>            Reporter: Jiaxin Shan
>            Assignee: Jiaxin Shan
>            Priority: Minor
>
> This should be a minor improvement.
> The use case: we want to look up the SPARK_APPLICATION_ID environment variable in the driver pod, create a per-application folder, and redirect the driver logs into that folder (a sketch of this usage follows the environment listing below). Executor pods already have this variable, and we would like the same change for the driver.
>  
> {code:none}
> Limits:
>   cpu:     1024m
>   memory:  896Mi
> Requests:
>   cpu:     1
>   memory:  896Mi
> Environment:
>   SPARK_DRIVER_BIND_ADDRESS:  (v1:status.podIP)
>   SPARK_LOCAL_DIRS:           /var/data/spark-9c315655-aba4-47fb-821c-30268d02af7e
>   SPARK_CONF_DIR:             /opt/spark/conf
> {code}
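>
> For context, a minimal sketch (the log path and variable names here are made up for illustration) of the driver-side setup this environment variable would enable:
> {code:scala}
> import java.nio.file.{Files, Paths}
>
> // Look up the application id from the pod environment. Executor pods already
> // expose SPARK_APPLICATION_ID; the driver pod currently does not.
> val appId = sys.env.getOrElse("SPARK_APPLICATION_ID", "unknown-app")
>
> // Create a per-application folder that a file appender (e.g. log4j) can be
> // pointed at, so driver logs end up grouped by application.
> val appLogDir = Paths.get("/var/log/spark", appId)  // illustrative path
> Files.createDirectories(appLogDir)
> {code}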
>  
> [https://github.com/apache/spark/blob/afe70b3b5321439318a456c7d19b7074171a286a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/BasicDriverFeatureStep.scala#L73-L79]
> We need SPARK_APPLICATION_ID available inside the driver pod to organize the logs by application.
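>
> A rough sketch of the kind of change this asks for in BasicDriverFeatureStep, mirroring what BasicExecutorFeatureStep already does for executors (the surrounding builder chain and the {{pod}} value are assumed from the linked code; this is not necessarily the exact patch that lands):
> {code:scala}
> import io.fabric8.kubernetes.api.model.ContainerBuilder
>
> // ENV_APPLICATION_ID is the existing "SPARK_APPLICATION_ID" constant from
> // Constants.scala; conf is the KubernetesDriverConf held by this step.
> val driverContainer = new ContainerBuilder(pod.container)
>   .addNewEnv()
>     .withName(ENV_APPLICATION_ID)  // "SPARK_APPLICATION_ID"
>     .withValue(conf.appId)         // the same id executors already receive
>     .endEnv()
>   // ... existing env vars, ports and resource requests stay unchanged ...
>   .build()
> {code}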


