Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2021/04/09 04:55:00 UTC

[jira] [Updated] (SPARK-34674) Spark app on k8s doesn't terminate without call to sparkContext.stop() method

     [ https://issues.apache.org/jira/browse/SPARK-34674?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-34674:
----------------------------------
    Fix Version/s:     (was: 3.1.2)
                       (was: 3.2.0)

> Spark app on k8s doesn't terminate without call to sparkContext.stop() method
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-34674
>                 URL: https://issues.apache.org/jira/browse/SPARK-34674
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.1.1
>            Reporter: Sergey Kotlov
>            Assignee: Sergey Kotlov
>            Priority: Major
>
> Hello!
>  I have run into a problem: if I don't call sparkContext.stop() explicitly, the Spark driver process doesn't terminate even after its main method has completed. This behaviour differs from Spark on YARN, where stopping the SparkContext manually is not required. (A sketch of the explicit-stop workaround is at the end of this report.)
>  It looks like the problem is caused by non-daemon threads, which prevent the driver JVM process from exiting.
>  At least two such non-daemon threads remain if I don't call sparkContext.stop() (a sketch for listing them follows the thread dump):
> {code:java}
> Thread[OkHttp kubernetes.default.svc,5,main]
> Thread[OkHttp kubernetes.default.svc Writer,5,main]
> {code}
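> For illustration, a minimal diagnostic sketch (assumed, not from the original run) showing how the remaining non-daemon threads can be listed at the end of main():
> {code:scala}
> // Hypothetical diagnostic: print every live non-daemon thread.
> // Any thread listed here can keep the driver JVM from exiting.
> Thread.getAllStackTraces.keySet.forEach { t =>
>   if (t.isAlive && !t.isDaemon) println(t)
> }
> {code}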
> Could you please tell me whether it is possible to solve this problem?
> The Docker image from the official spark-3.1.1 hadoop3.2 release is used.
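> For reference, a minimal sketch of the explicit-stop workaround (the object name, app name, and job body are placeholders, assuming a standard SparkSession application):
> {code:scala}
> import org.apache.spark.sql.SparkSession
>
> object ExplicitStopApp {
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder().appName("explicit-stop-app").getOrCreate()
>     try {
>       spark.range(10).count() // application logic goes here
>     } finally {
>       // Without this call, the non-daemon OkHttp threads above keep the driver JVM alive on k8s.
>       spark.stop()
>     }
>   }
> }
> {code}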



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org