Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/02/15 18:10:00 UTC

[jira] [Resolved] (SPARK-25922) [K8] Spark Driver/Executor "spark-app-selector" label mismatch

     [ https://issues.apache.org/jira/browse/SPARK-25922?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-25922.
------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.4.1

Issue resolved by pull request 23779
[https://github.com/apache/spark/pull/23779]
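
The details of the fix are in the pull request above; as a rough illustration only (a hypothetical sketch, not the contents of PR 23779), the mismatch described in the report goes away if the driver-side scheduler backend reports the submission-time spark.app.id when one is set, instead of always generating a fresh time-based ID:

{code:scala}
// Hypothetical sketch, not taken from the PR: prefer the spark.app.id that the
// submission client already stamped on the driver pod, and fall back to a
// time-based ID only when no app ID was set.
import org.apache.spark.SparkConf

def resolvedAppId(conf: SparkConf): String =
  conf.getOption("spark.app.id")
    .getOrElse("spark-application-" + System.currentTimeMillis)
{code}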

> [K8] Spark Driver/Executor "spark-app-selector" label mismatch
> --------------------------------------------------------------
>
>                 Key: SPARK-25922
>                 URL: https://issues.apache.org/jira/browse/SPARK-25922
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.4.0
>         Environment: Spark 2.4.0 RC4
>            Reporter: Anmol Khurana
>            Priority: Major
>             Fix For: 2.4.1
>
>
> Hi,
> I have been testing Spark 2.4.0 RC4 on Kubernetes to run Python Spark applications and am running into an issue where the "spark-app-selector" application ID label on the driver and executors does not match. I am using [https://github.com/GoogleCloudPlatform/spark-on-k8s-operator] to run these applications.
> I see a spark.app.id of the form spark-* as the "spark-app-selector" label on the driver, as well as in the Kubernetes config map which gets created for the driver via spark-submit. My guess is that this comes from [https://github.com/apache/spark/blob/f6cc354d83c2c9a757f9b507aadd4dbdc5825cca/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/submit/KubernetesClientApplication.scala#L211].
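> For illustration (a minimal sketch, not the linked source itself), the submission-time ID has this UUID-based shape and is what ends up on the driver pod label:
> {code:scala}
> // Illustrative sketch only: a submission-time app ID of the form "spark-<uuid>",
> // attached to the driver pod as the "spark-app-selector" label.
> import java.util.UUID
>
> val kubernetesAppId = "spark-" + UUID.randomUUID().toString.replaceAll("-", "")
> val driverLabels = Map(
>   "spark-app-selector" -> kubernetesAppId, // e.g. spark-b78bb10feebf4e2d98c11d7b6320e18f
>   "spark-role" -> "driver")
> {code}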
> But when the driver actually comes up and brings up executors, I see that the "spark-app-selector" label on the executors, as well as the spark.app.id config within the user code on the driver, is of the form spark-application-* (probably from [https://github.com/apache/spark/blob/b19a28dea098c7d6188f8540429c50f42952d678/core/src/main/scala/org/apache/spark/SparkContext.scala#L511] and [https://github.com/apache/spark/blob/bfb74394a5513134ea1da9fcf4a1783b77dd64e4/core/src/main/scala/org/apache/spark/scheduler/SchedulerBackend.scala#L26]).
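> For comparison (again a minimal sketch, not the linked source itself), the ID the running driver hands to its executors is time-based, which is why it can never equal the UUID-based label above:
> {code:scala}
> // Illustrative sketch only: a time-based default app ID of the form
> // "spark-application-<millis>", which is what shows up on the executor pods.
> val appId = "spark-application-" + System.currentTimeMillis
> // e.g. spark-application-1541121829445
> {code}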
> We were consuming this "spark-app-selector" label on the driver pod to get the application ID and use it to look up the app in the Spark History Server (among other use cases), but due to this mismatch that logic no longer works. This worked fine in the Spark 2.2 fork for Kubernetes which I was using earlier. Is this expected behavior, and if so, what is the correct way to fetch the applicationId from outside the application?
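> For what it's worth, reading the ID from inside the application does not help here either; a small sketch of the check (assuming a standard SparkSession, not code from the report) shows the driver reports the scheduler-side ID rather than the pod label:
> {code:scala}
> // Sketch of the observation above: inside the driver, the reported application ID
> // is the scheduler-side spark-application-* value, not the spark-* value stamped
> // on the driver pod's "spark-app-selector" label.
> import org.apache.spark.sql.SparkSession
>
> val spark = SparkSession.builder().appName("pyfiles").getOrCreate()
> println(spark.sparkContext.applicationId) // e.g. spark-application-1541121829445
> {code}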
> Let me know if I can provide any more details or if I am doing something wrong. Here is an example run with different *spark-app-selector* labels on the driver and executor:
>  
> {code:none}
> Name: pyfiles-driver
> Namespace: default
> Priority: 0
> PriorityClassName: <none>
> Start Time: Thu, 01 Nov 2018 18:19:46 -0700
> Labels: spark-app-selector=spark-b78bb10feebf4e2d98c11d7b6320e18f
>  spark-role=driver
>  sparkoperator.k8s.io/app-name=pyfiles
>  sparkoperator.k8s.io/launched-by-spark-operator=true
>  version=2.4.0
> Status: Running
>
> Name: pyfiles-1541121585642-exec-1
> Namespace: default
> Priority: 0
> PriorityClassName: <none>
> Start Time: Thu, 01 Nov 2018 18:24:02 -0700
> Labels: spark-app-selector=spark-application-1541121829445
>  spark-exec-id=1
>  spark-role=executor
>  sparkoperator.k8s.io/app-name=pyfiles
>  sparkoperator.k8s.io/launched-by-spark-operator=true
>  version=2.4.0
> Status: Pending
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org