Posted to issues@spark.apache.org by "Xiaoming W (Jira)" <ji...@apache.org> on 2020/12/08 02:50:00 UTC

[jira] [Created] (SPARK-33699) Spark Web UI kills the application but the process still runs in the background

Xiaoming W created SPARK-33699:
----------------------------------

             Summary: Spark Web UI kills the application but the process still runs in the background
                 Key: SPARK-33699
                 URL: https://issues.apache.org/jira/browse/SPARK-33699
             Project: Spark
          Issue Type: Bug
          Components: Spark Core, Web UI
    Affects Versions: 2.4.3
         Environment: spark-2.4.3

CentOS 7.0

JDK 1.8.0_201
            Reporter: Xiaoming W


When I kill an application on the Web UI (which I submitted in standalone-client mode), it appears to be killed; but when I run the 'jps' command I can still see the application running in the background. Here is demo code to reproduce the problem.
{code:scala}
val rdd = sparkSession.sparkContext.parallelize(List(1, 2, 3, 4)).repartition(4)

// section 1:
// do something for a long time inside the executors
val rdd2 = rdd.map { x =>
  for (i <- 1 to 300) {
    // busy-wait to simulate long-running work
    for (j <- 1 to 999999999) {}

    if (i % 10 == 0) {
      println(i + " rdd map process running!")
    }
  }
  x * 2
}
rdd2.take(10).foreach(println)

// section 2:
// do something for a long time in the driver
for (i <- 1 to 500) {
  // busy-wait to simulate long-running work
  for (j <- 1 to 999999999) {}

  if (i % 10 == 0) {
    println(i + " main process running!")
  }
}
{code}
And:
 # If I kill the application on the Web UI while section 1 (the rdd.map) is running on the executors, it is stopped cleanly;
 # If I kill the application on the Web UI while section 2 is running in the driver, the process keeps running in the background.

So, is this a bug in Spark, and how can it be solved?
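Editor's note, not from the original report: in standalone-client mode the driver runs inside the submitting JVM, so a Web UI kill tears down the executors but does not forcibly terminate the driver process; a driver-side loop that never checks whether the SparkContext has been stopped will keep the JVM alive. One hedged workaround is to make long driver loops cooperative, e.g. by periodically polling `sparkSession.sparkContext.isStopped` and exiting early. A minimal Spark-free sketch of the pattern (the `CooperativeStop` object and its `stopped` flag are hypothetical stand-ins for the real `isStopped` check):

```scala
import java.util.concurrent.atomic.AtomicBoolean

object CooperativeStop {
  // Stand-in for SparkContext.isStopped: a flag another thread can flip
  val stopped = new AtomicBoolean(false)

  // Driver-style loop that checks the flag each iteration and bails out
  // early instead of running to completion; returns iterations completed
  def busyLoop(iterations: Int): Int = {
    var completed = 0
    var i = 1
    while (i <= iterations && !stopped.get()) {
      // ... long-running work would go here ...
      completed += 1
      i += 1
    }
    completed
  }
}
```

With this pattern, a kill that sets the flag (as stopping the SparkContext would) lets the loop exit promptly, so the driver JVM can shut down rather than lingering in `jps`.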

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org