Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:21:31 UTC
[jira] [Updated] (SPARK-8663) Driver will hang if a job is
submitted during the SparkContext stop interval
[ https://issues.apache.org/jira/browse/SPARK-8663?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-8663:
--------------------------------
Labels: bulk-closed (was: )
> Driver will hang if a job is submitted during the SparkContext stop interval
> -----------------------------------------------------------------------------
>
> Key: SPARK-8663
> URL: https://issues.apache.org/jira/browse/SPARK-8663
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.2.2, 1.3.0
> Environment: SUSE Linux Enterprise Server 11 SP3 (x86_64)
> Reporter: yuemeng
> Priority: Major
> Labels: bulk-closed
>
> The driver process will hang if a job is submitted during the sc.stop() interval, i.e. the window between the start of SparkContext.stop() and its completion.
> The probability of this situation is very small, but if it occurs, the driver process will never exit.
> Reproduce steps:
> 1) Modify the source code so that SparkContext's stop() method sleeps for 2s
> (in my case, I made DAGScheduler's stop method sleep 2s).
> 2) Submit an application with code like:
> import org.apache.spark.{SparkConf, SparkContext}
>
> object DriverThreadTest {
>   def main(args: Array[String]) {
>     val sconf = new SparkConf().setAppName("TestJobWaitor")
>     val sc = new SparkContext(sconf)
>     Thread.sleep(5000)
>     // Thread that keeps submitting jobs.
>     val t = new Thread {
>       override def run() {
>         while (true) {
>           try {
>             val rdd = sc.parallelize(1 to 1000)
>             var i = 0
>             println("calcfunc start")
>             while (i < 10) {
>               i += 1
>               rdd.count
>             }
>             println("calcfunc end")
>           } catch {
>             case e: Exception =>
>               e.printStackTrace()
>           }
>         }
>       }
>     }
>
>     t.start()
>
>     // Thread that stops the SparkContext while jobs are still being submitted.
>     val t2 = new Thread {
>       override def run() {
>         Thread.sleep(2000)
>         println("stop sc thread")
>         sc.stop()
>         println("sc already stopped")
>       }
>     }
>     t2.start()
>   }
> }
> The driver will never exit.
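> The underlying race can be modeled without Spark at all: a caller submits work to a scheduler-like service and blocks on its completion, while another thread stops the service in between, so the completion signal never arrives. The sketch below (Java rather than Scala, all names hypothetical and not Spark APIs) shows two common mitigations under those assumptions: fail fast on a stopped flag at submit time, and use a bounded wait instead of an indefinite one.
>
> ```java
> import java.util.concurrent.CountDownLatch;
> import java.util.concurrent.ExecutorService;
> import java.util.concurrent.Executors;
> import java.util.concurrent.TimeUnit;
>
> // Toy stand-in for a scheduler: submit() returns a latch that is counted
> // down when the job finishes. Without the stopped-flag check, a job
> // submitted during/after stop() would leave the latch at 1 forever.
> class ToyScheduler {
>     private volatile boolean stopped = false;
>     private final ExecutorService pool = Executors.newSingleThreadExecutor();
>
>     CountDownLatch submit(Runnable job) {
>         CountDownLatch done = new CountDownLatch(1);
>         if (stopped) {          // fail fast: refuse work once stop() has begun
>             done.countDown();
>             return done;
>         }
>         pool.execute(() -> { job.run(); done.countDown(); });
>         return done;
>     }
>
>     void stop() {
>         stopped = true;
>         pool.shutdownNow();
>     }
> }
>
> public class StopRaceSketch {
>     public static void main(String[] args) throws Exception {
>         ToyScheduler sched = new ToyScheduler();
>         sched.stop();                                  // stop wins the race
>         CountDownLatch done = sched.submit(() -> { }); // job arrives during stop
>         // Bounded wait: returns false on timeout instead of hanging forever.
>         boolean finished = done.await(1, TimeUnit.SECONDS);
>         System.out.println("finished=" + finished);    // prints finished=true
>     }
> }
> ```
>
> With the fail-fast check removed, the same submit-after-stop ordering would leave await() blocked indefinitely, which is the caller-side analogue of the hung rdd.count in the reproducer above.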
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org