Posted to issues@spark.apache.org by "John (Jira)" <ji...@apache.org> on 2021/02/24 02:16:00 UTC

[jira] [Created] (SPARK-34513) Kubernetes Spark Driver Pod Name Length Limitation

John created SPARK-34513:
----------------------------

             Summary: Kubernetes Spark Driver Pod Name Length Limitation
                 Key: SPARK-34513
                 URL: https://issues.apache.org/jira/browse/SPARK-34513
             Project: Spark
          Issue Type: Bug
          Components: Kubernetes
    Affects Versions: 3.0.1, 3.0.0
            Reporter: John


Hi,

We are using Spark on Kubernetes via Airflow. Airflow appends a unique id to our spark-driver pod name, using the Kubernetes subdomain convention ('.').

This results in rather long pod names.

We noticed an issue when the total pod name (base name + Airflow-appended uuid) exceeds 63 characters. Kubernetes itself allows pod names of up to 253 characters; however, Spark seems to have a problem with driver pod names longer than 63 characters.

In our case the driver pod name is exactly 65 characters long, but Spark omits the last 2 characters in its error message, so I assume Spark is internally losing those two characters. Shortening our driver pod name to 63 characters fixed the issue.

Here you can see the actual pod name (row 1) and the pod name from the Spark error log (row 2):
ab-aaaaaa-bbbbbbbb-cccccc-dddddd.3s092032c69f4639adff835a826e0120
ab-aaaaaa-bbbbbbbb-cccccc-dddddd.3s092032c69f4639adff835a826e01
[2021-02-20 00:30:06,289] {pod_launcher.py:136} INFO - Exception in thread "main" org.apache.spark.SparkException: No pod was found named Some(ab-aaaaaa-bbbbbbbb-cccccc-dddddd.3s092032c69f4639adff835a826e01) in the cluster in the namespace airflow-ns (this was supposed to be the driver pod.).
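
The mismatch above is consistent with the pod name being cut at the 63-character DNS label limit rather than the 253-character limit for full pod names. A minimal Python sketch of the assumed behavior (the helper name is hypothetical, not Spark's actual code):

```python
# Kubernetes DNS labels are limited to 63 characters, while full pod names
# (DNS subdomains) may be up to 253. The error log suggests Spark applies
# the 63-character limit to the driver pod name.
DNS_LABEL_MAX = 63

def truncate_like_spark(pod_name: str) -> str:
    """Truncate a pod name to the DNS label limit (assumed Spark behavior)."""
    return pod_name[:DNS_LABEL_MAX]

full_name = "ab-aaaaaa-bbbbbbbb-cccccc-dddddd.3s092032c69f4639adff835a826e0120"
print(len(full_name))                  # 65 -- two characters over the limit
print(truncate_like_spark(full_name))  # matches the name in the Spark error log
```

Running this reproduces exactly the truncated name from the error message, losing the trailing "20".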
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
