Posted to issues@spark.apache.org by "Anmol Khurana (JIRA)" <ji...@apache.org> on 2018/11/01 23:42:00 UTC

[jira] [Created] (SPARK-25922) [K8] Spark Driver/Executor "spark-app-selector" label mismatch

Anmol Khurana created SPARK-25922:
-------------------------------------

             Summary: [K8] Spark Driver/Executor "spark-app-selector" label mismatch
                 Key: SPARK-25922
                 URL: https://issues.apache.org/jira/browse/SPARK-25922
             Project: Spark
          Issue Type: Bug
          Components: Kubernetes
    Affects Versions: 2.4.0
         Environment: Spark 2.4.0 RC4
            Reporter: Anmol Khurana


Hi,

 

I have been testing Spark 2.4.0 RC4 on Kubernetes to run Python Spark applications and am running into an issue where the application-id label on the driver and executor pods does not match. I am using [https://github.com/GoogleCloudPlatform/spark-on-k8s-operator] to run these applications.

I see a spark.app.id of the form spark-* as the "spark-app-selector" label ([https://github.com/apache/spark/blob/f6cc354d83c2c9a757f9b507aadd4dbdc5825cca/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Constants.scala#L22]) on the driver pod, as well as on the Kubernetes config-map that spark-submit creates for the driver. My guess is this id comes from [https://github.com/apache/spark/blob/f6cc354d83c2c9a757f9b507aadd4dbdc5825cca/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/submit/KubernetesClientApplication.scala#L211].
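For reference, this appears to be roughly how the submission client generates that id (a paraphrased sketch of the code at the link above, not the verbatim source):

{code:scala}
// Paraphrased sketch of the submission-side id generation in
// KubernetesClientApplication.scala: a UUID-based id is created once per
// spark-submit invocation and stamped on the driver pod and config-map as
// the "spark-app-selector" label.
import java.util.UUID

val kubernetesAppId = s"spark-${UUID.randomUUID().toString.replaceAll("-", "")}"
// e.g. "spark-4f0c2e9a..." -- the value I see on the driver pod
{code}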

But when the driver actually comes up ([https://github.com/apache/spark/blob/master/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/entrypoint.sh#L99]) and launches executors, the "spark-app-selector" label on the executor pods, as well as the spark.app.id config visible to user code on the driver, is of the form spark-application-* (probably from [https://github.com/apache/spark/blob/b19a28dea098c7d6188f8540429c50f42952d678/core/src/main/scala/org/apache/spark/SparkContext.scala#L511] and [https://github.com/apache/spark/blob/bfb74394a5513134ea1da9fcf4a1783b77dd64e4/core/src/main/scala/org/apache/spark/scheduler/SchedulerBackend.scala#L26]).
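As far as I can tell, the second id comes from the default scheduler-backend implementation; a paraphrased sketch of the linked code:

{code:scala}
// Paraphrased sketch of SchedulerBackend.scala: the default applicationId()
// is time-based, so it can never equal the UUID-based id that spark-submit
// stamped on the driver pod.
private[spark] trait SchedulerBackend {
  private val appId = "spark-application-" + System.currentTimeMillis

  // SparkContext picks this up (via the TaskScheduler) and sets spark.app.id
  // from it, which is what the executor pods are then labeled with.
  def applicationId(): String = appId
}
{code}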

We were consuming the "spark-app-selector" label on the driver pod to get the application id and look up the app in the Spark History Server (among other use cases), but due to this mismatch that logic no longer works. It worked fine with the Spark 2.2 fork for Kubernetes that I was using earlier.
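To illustrate, our consumer does roughly the following (a simplified sketch using the fabric8 kubernetes-client; the namespace and pod name are placeholders, not real values from our setup):

{code:scala}
// Simplified sketch of how we read the label off the driver pod.
import io.fabric8.kubernetes.client.DefaultKubernetesClient

val client = new DefaultKubernetesClient()
val labeledAppId = client.pods()
  .inNamespace("spark-jobs")   // placeholder namespace
  .withName("my-app-driver")   // placeholder driver pod name
  .get()
  .getMetadata
  .getLabels
  .get("spark-app-selector")
// In 2.4.0 RC4 this returns the "spark-*" id, while the History Server
// only knows the "spark-application-*" id, so the lookup fails.
{code}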

Is this expected behavior, and if so, what is the correct way to fetch the applicationId from outside the application?
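For now, the only id that reliably matches the History Server entry seems to be the one the application reports about itself at runtime, e.g.:

{code:scala}
// Workaround we are experimenting with: have the job report its own runtime
// application id (the "spark-application-*" value the History Server uses),
// e.g. by logging it for an external system to scrape.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()
println(s"runtime spark.app.id: ${spark.sparkContext.applicationId}")
{code}

But that only works from inside the job itself, which defeats the purpose of having the label on the pod.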

 

Let me know if I can provide any more details or if I am doing something wrong.

 


