Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2019/10/06 14:43:00 UTC

[jira] [Assigned] (SPARK-29233) Add regex expression checks for executorEnv in K8S mode

     [ https://issues.apache.org/jira/browse/SPARK-29233?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen reassigned SPARK-29233:
------------------------------------

    Assignee: merrily01

> Add regex expression checks for executorEnv in K8S mode
> -------------------------------------------------------
>
>                 Key: SPARK-29233
>                 URL: https://issues.apache.org/jira/browse/SPARK-29233
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 3.0.0
>            Reporter: merrily01
>            Assignee: merrily01
>            Priority: Minor
>         Attachments: error.jpeg, podEnv.jpeg
>
>
> In k8s mode, there are some naming regular expression requirements and checks for the key of pod environment variables, such as:
>  * In k8s v1.10.5, pod env keys must match the regular expression: [-._a-zA-Z][-._a-zA-Z0-9]*
>  * In k8s v1.6, pod env keys must match the regular expression: [A-Za-z_][A-Za-z0-9_]*
> Therefore, in Spark on k8s mode, Spark should apply the corresponding regex checks when building the executor env, using the stricter validation rule to filter out illegal keys that do not satisfy the k8s env naming requirements. Otherwise, the pod cannot be created and the tasks hang.
>  
> To solve the problem above, a regex validation of executorEnv keys is added.
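The filtering described above can be sketched as follows. This is a hypothetical illustration, not the actual Spark patch; the class and method names (ExecutorEnvValidator, isValidEnvKey, filterInvalid) are invented for this example, and it uses the stricter of the two rules quoted above (the k8s v1.6 pattern).

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Pattern;

// Hypothetical sketch of a stricter executor-env key check.
public class ExecutorEnvValidator {
    // The stricter env-name rule quoted above (k8s v1.6): [A-Za-z_][A-Za-z0-9_]*
    private static final Pattern ENV_KEY = Pattern.compile("[A-Za-z_][A-Za-z0-9_]*");

    // True if the key is a legal pod env var name under the stricter rule.
    static boolean isValidEnvKey(String key) {
        return ENV_KEY.matcher(key).matches();
    }

    // Drop entries whose keys would make k8s reject the pod spec.
    static Map<String, String> filterInvalid(Map<String, String> env) {
        Map<String, String> out = new HashMap<>();
        for (Map.Entry<String, String> e : env.entrySet()) {
            if (isValidEnvKey(e.getKey())) {
                out.put(e.getKey(), e.getValue());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(isValidEnvKey("SPARK_HOME"));             // true
        System.out.println(isValidEnvKey("1BAD"));                   // false: starts with a digit
        System.out.println(isValidEnvKey("spark.executor.memory"));  // false: dots not allowed
    }
}
```

Without such a filter, a single illegal key (for example a dotted Spark config name passed through as an env var) causes the apiserver to reject the executor pod, which matches the hang described in the issue.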



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org