Posted to user@spark.apache.org by manish gupta <to...@gmail.com> on 2019/10/23 13:49:10 UTC

Spark executor pods not getting killed after task completion

Hi

I am trying to run spark-submit on Kubernetes. I am able to achieve the
desired result in that the driver and executors get launched as per the
given configuration, and my job runs successfully.

*But even after the job completes, the Spark driver pod stays in Running
state and none of the executor pods get killed, whereas when I run a
simple SparkPi application to test with the same image, the executors get
killed and the driver shows the status as Completed.*

Can someone please guide me on this issue?

Regards
Manish Gupta

Re: Spark executor pods not getting killed after task completion

Posted by manishgupta88 <to...@gmail.com>.
The issue got resolved after closing the SparkContext. On Kubernetes the
driver pod only moves to Completed, and the executor pods are only torn
down, once the SparkContext is stopped.
https://stackoverflow.com/questions/57964848/spark-job-in-kubernetes-stuck-in-running-state
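
For anyone hitting the same thing, a minimal sketch of the fix (the job
structure and names here are illustrative, not the actual job):

    import org.apache.spark.sql.SparkSession

    object MyJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("my-job")
          .getOrCreate()

        try {
          // ... the actual work of the job goes here ...
          spark.range(100).count()
        } finally {
          // Stopping the session stops the underlying SparkContext,
          // which lets the driver pod reach Completed and the executor
          // pods get deleted by Kubernetes.
          spark.stop()
        }
      }
    }

Putting the stop() in a finally block makes sure the context is closed
even if the job fails partway through.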



