Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2021/03/18 21:40:00 UTC

[jira] [Comment Edited] (SPARK-34674) Spark app on k8s doesn't terminate without call to sparkContext.stop() method

    [ https://issues.apache.org/jira/browse/SPARK-34674?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17304472#comment-17304472 ] 

Dongjoon Hyun edited comment on SPARK-34674 at 3/18/21, 9:39 PM:
-----------------------------------------------------------------

I am reopening this because, according to [~Kotlov], the affected version is not the same. The following is a copy of my comment on the other JIRA.

1. Did your example work with any earlier Spark version? If so, what is the latest version where it worked?
2. BTW, `sparkContext.stop()` or `spark.stop()` should be called by the application. I don't think your use case is a legitimate Spark example, although it might be a behavior change across Spark versions.


was (Author: dongjoon):
I am reopening this because the affected version is not the same. The following is a copy of my comment on the other JIRA.

1. Did your example work with any earlier Spark version? If so, what is the latest version where it worked?
2. BTW, `sparkContext.stop()` or `spark.stop()` should be called by the application. I don't think your use case is a legitimate Spark example, although it might be a behavior change across Spark versions.

> Spark app on k8s doesn't terminate without call to sparkContext.stop() method
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-34674
>                 URL: https://issues.apache.org/jira/browse/SPARK-34674
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.1.1
>            Reporter: Sergey
>            Priority: Major
>
> Hello!
>  I have run into a problem: if I don't call the sparkContext.stop() method explicitly, the Spark driver process doesn't terminate even after its main method has completed. This behaviour differs from Spark on YARN, where manually stopping the SparkContext is not required.
>  It looks like the problem is the use of non-daemon threads, which prevent the driver JVM process from terminating.
>  I see at least two non-daemon threads if I don't call sparkContext.stop():
> {code:java}
> Thread[OkHttp kubernetes.default.svc,5,main]
> Thread[OkHttp kubernetes.default.svc Writer,5,main]
> {code}
> Could you please tell me whether it is possible to solve this problem?
> The Docker image from the official spark-3.1.1 hadoop3.2 release is used.
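
The two OkHttp threads in the quoted dump are non-daemon, and a JVM only exits once every non-daemon thread has finished; calling sparkContext.stop() resolves the hang because it shuts those client threads down. A minimal plain-JVM sketch of that rule (no Spark involved; the thread names here are illustrative only, not the actual OkHttp thread names):

```java
public class NonDaemonDemo {
    // Spawn a short-lived worker; the daemon flag decides whether it can
    // block JVM exit after main() returns.
    static Thread spawn(String name, boolean daemon) {
        Thread t = new Thread(() -> {
            try { Thread.sleep(200); } catch (InterruptedException ignored) {}
        }, name);
        t.setDaemon(daemon);
        t.start();
        return t;
    }

    public static void main(String[] args) {
        // Java threads are non-daemon by default, like the OkHttp threads
        // in the dump above.
        Thread blocking = spawn("okhttp-like-worker", false);
        Thread harmless = spawn("background-worker", true);
        System.out.println(blocking.getName() + " daemon=" + blocking.isDaemon());
        System.out.println(harmless.getName() + " daemon=" + harmless.isDaemon());
        // When main() returns, the JVM waits for "okhttp-like-worker" to
        // finish but would exit immediately if only "background-worker"
        // were left running. A long-lived non-daemon thread therefore keeps
        // the driver process alive indefinitely, which is what an explicit
        // sparkContext.stop() prevents.
    }
}
```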



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org