Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 03:59:24 UTC

[jira] [Updated] (SPARK-18977) Heavy udf is not stopped by cancelJobGroup

     [ https://issues.apache.org/jira/browse/SPARK-18977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-18977:
---------------------------------
    Labels: bulk-closed  (was: )

> Heavy udf is not stopped by cancelJobGroup
> ------------------------------------------
>
>                 Key: SPARK-18977
>                 URL: https://issues.apache.org/jira/browse/SPARK-18977
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.6.2
>            Reporter: Vitaly Gerasimov
>            Priority: Major
>              Labels: bulk-closed
>
> Let's say we have a heavy UDF that takes a long time to process. When I run a job in a job group that executes this UDF and then call cancelJobGroup(), the job continues processing.
> {code}
> # ./spark-shell
> > import scala.concurrent.Future
> > import scala.concurrent.ExecutionContext.Implicits.global
> > sc.setJobGroup("test-group", "udf-test")
> > sqlContext.udf.register("sleep", (times: Int) => { (1 to times).toList.foreach{ _ =>  print("sleep..."); Thread.sleep(10000) }; 1L })
> > Future { Thread.sleep(50000); sc.cancelJobGroup("test-group") }
> > sqlContext.sql("SELECT sleep(10)").collect()
> {code}
> It returns:
> {code}
> sleep...sleep...sleep...sleep...sleep...org.apache.spark.SparkException: Job 0 cancelled part of cancelled job group test-group
> > sleep...sleep...sleep...sleep...sleep...16/12/22 14:36:44 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): TaskKilled (killed intentionally)
> {code}
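> My assumption (not confirmed against the Spark source) is that this happens because setJobGroup defaults to interruptOnCancel = false: cancelJobGroup marks the tasks as killed, but the task thread itself is never interrupted, so a UDF blocked in Thread.sleep keeps running. A minimal sketch of the same repro with interruption enabled:
> {code}
> // Same repro as above, but with interruptOnCancel = true, which asks Spark to
> // call Thread.interrupt() on the running task threads when the group is
> // cancelled. The blocking Thread.sleep inside the UDF should then throw
> // InterruptedException and the task should stop promptly.
> import scala.concurrent.Future
> import scala.concurrent.ExecutionContext.Implicits.global
> sc.setJobGroup("test-group", "udf-test", interruptOnCancel = true)
> sqlContext.udf.register("sleep", (times: Int) => {
>   (1 to times).foreach { _ => print("sleep..."); Thread.sleep(10000) }
>   1L
> })
> Future { Thread.sleep(50000); sc.cancelJobGroup("test-group") }
> sqlContext.sql("SELECT sleep(10)").collect()
> {code}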
> This seems unexpected to me, but if I am missing something and it works as intended, feel free to close the issue.
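> For a UDF that computes in a tight loop instead of sleeping, interruption alone may not help, since nothing in the loop ever checks the interrupt flag. A second hedged sketch, polling the task's kill flag between iterations (the "sleepCoop" name is mine, for illustration):
> {code}
> import org.apache.spark.TaskContext
> // Cooperative variant: check the per-task interrupted flag between units of
> // work and bail out once the task has been killed. isInterrupted() is set
> // when the task is killed, even without interruptOnCancel.
> sqlContext.udf.register("sleepCoop", (times: Int) => {
>   (1 to times).foreach { _ =>
>     if (TaskContext.get().isInterrupted()) {
>       throw new RuntimeException("task was killed, stopping UDF")
>     }
>     print("sleep..."); Thread.sleep(10000)
>   }
>   1L
> })
> {code}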


