Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/07/01 08:44:00 UTC
[jira] [Assigned] (SPARK-35969) Make the pod prefix more readable and tallied with K8S DNS Label Names
[ https://issues.apache.org/jira/browse/SPARK-35969?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-35969:
------------------------------------
Assignee: Apache Spark
> Make the pod prefix more readable and tallied with K8S DNS Label Names
> ----------------------------------------------------------------------
>
> Key: SPARK-35969
> URL: https://issues.apache.org/jira/browse/SPARK-35969
> Project: Spark
> Issue Type: Improvement
> Components: Kubernetes
> Affects Versions: 3.2.0
> Reporter: Kent Yao
> Assignee: Apache Spark
> Priority: Major
>
> By default, the executor pod prefix is generated from the app name. Characters matching [^a-z0-9\\-] are handled inconsistently: '.' and all whitespace are converted to '-', while every other such character is stripped to an empty string. In particular, characters like '_' and '|' are commonly used as word separators in many languages, so stripping them fuses words together.
> According to the K8S DNS Label Names rules, see [https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#dns-label-names], we can instead convert all special characters to '-'.
>
> {code:scala}
> scala> "time.is%the¥most$valuable_——————thing,it's about time.".replaceAll("[^a-z0-9\\-]", "-").replaceAll("-+", "-")
> res9: String = time-is-the-most-valuable-thing-it-s-about-time-
> scala> "time.is%the¥most$valuable_——————thing,it's about time.".replaceAll("\\s+", "-").replaceAll("\\.", "-").replaceAll("[^a-z0-9\\-]", "").replaceAll("-+", "-")
> res10: String = time-isthemostvaluablethingits-about-time-
> {code}
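> The proposed normalization can be sketched as a single helper on the JVM (shown in plain Java here; the class and method names are illustrative, not Spark's actual API). It lowercases the input, maps every character outside [a-z0-9-] to '-', collapses runs of '-', and trims leading/trailing '-', since a DNS label must start and end with an alphanumeric character (note that res9 above ends with a trailing '-'):
>
> {code:java}
> public class PodNamePrefix {
>     // Sketch of the proposed normalization (illustrative names, not Spark's API):
>     // lowercase, map every char outside [a-z0-9-] to '-', collapse '-' runs,
>     // and trim leading/trailing '-' so the result is a valid DNS label.
>     static String toDnsLabel(String appName) {
>         return appName.toLowerCase()
>             .replaceAll("[^a-z0-9\\-]", "-")
>             .replaceAll("-+", "-")
>             .replaceAll("^-|-$", "");
>     }
>
>     public static void main(String[] args) {
>         // prints "time-is-the-most-valuable-thing"
>         System.out.println(toDnsLabel("time.is%the most$valuable_thing"));
>     }
> }
> {code}
>
> A full implementation would also need to truncate the result to the 63-character DNS label limit.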
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org