Posted to issues@spark.apache.org by "liuhongqiang (JIRA)" <ji...@apache.org> on 2016/11/07 06:32:58 UTC

[jira] [Created] (SPARK-18297) Failure if SparkContext runs in a new Thread in yarn-cluster mode

liuhongqiang created SPARK-18297:
------------------------------------

             Summary: Failure if SparkContext runs in a new Thread in yarn-cluster mode
                 Key: SPARK-18297
                 URL: https://issues.apache.org/jira/browse/SPARK-18297
             Project: Spark
          Issue Type: Bug
          Components: YARN
            Reporter: liuhongqiang


program:

import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import com.google.common.collect.Lists;

public static void main(String[] args) {
    // The Spark job runs on the executor's background thread;
    // main() returns immediately after scheduling it.
    Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(new Runnable() {
        @Override
        public void run() {
            SparkConf conf = new SparkConf();
            conf.setAppName("SparkDemo");
            JavaSparkContext sparkContext = new JavaSparkContext(conf);
            JavaRDD<String> array = sparkContext.parallelize(Lists.newArrayList("1", "2", "3", "4"));
            System.out.println(array.count());
        }
    }, 0, 5000, TimeUnit.MILLISECONDS);
}


log:

16/11/02 11:16:47 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
16/11/02 11:16:47 INFO yarn.ApplicationMaster: Waiting for spark context initialization
16/11/02 11:16:47 INFO yarn.ApplicationMaster: Waiting for spark context initialization ... 
16/11/02 11:16:47 INFO yarn.ApplicationMaster: Final app status: SUCCEEDED, exitCode: 0

problem:

mainMethod.invoke(null, userArgs.toArray)
finish(FinalApplicationStatus.SUCCEEDED, ApplicationMaster.EXIT_SUCCESS)
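
The same sequence can be reproduced without Spark. Below is a minimal, self-contained Java sketch (all class and method names here are hypothetical, not Spark code): a toy "application master" invokes a user main() via reflection on a separate thread and reports success the moment that method returns, before the user's scheduled task has ever run:

import java.lang.reflect.Method;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public final class MiniAmDemo {
    // Stands in for the user class: schedules work and returns immediately.
    public static void userMain(String[] args) {
        Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(
                () -> System.out.println("user task runs"),
                5000, 5000, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) throws Exception {
        Method mainMethod = MiniAmDemo.class.getMethod("userMain", String[].class);
        Thread userThread = new Thread(() -> {
            try {
                mainMethod.invoke(null, (Object) new String[0]);
            } catch (ReflectiveOperationException e) {
                throw new RuntimeException(e);
            }
            // Reached as soon as userMain() returns -- mirrors
            // finish(FinalApplicationStatus.SUCCEEDED, EXIT_SUCCESS):
            System.out.println("Final app status: SUCCEEDED, exitCode: 0");
        });
        userThread.start();
        userThread.join();
    }
}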

The user's main method has already returned, but the sub-thread it spawned may not have finished (here, the SparkContext has not even been created yet), so the ApplicationMaster should not invoke finish() and shut down the driver thread at that point.
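
Until this is fixed, one possible workaround (a sketch under the assumption that the job should repeat every 5 seconds, not an official fix) is to keep main() from returning while background work is pending, for example by blocking on a latch:

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

import com.google.common.collect.Lists;

public final class SparkDemo {
    public static void main(String[] args) throws InterruptedException {
        Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(() -> {
            SparkConf conf = new SparkConf().setAppName("SparkDemo");
            JavaSparkContext sparkContext = new JavaSparkContext(conf);
            System.out.println(
                    sparkContext.parallelize(Lists.newArrayList("1", "2", "3", "4")).count());
            // Stop the context so the next iteration can create a fresh one.
            sparkContext.stop();
        }, 0, 5000, TimeUnit.MILLISECONDS);

        // Keep the user-class thread alive so the ApplicationMaster does not
        // call finish() while the scheduled job is still running.
        new CountDownLatch(1).await();
    }
}

Repeatedly creating and stopping a SparkContext in the same JVM is questionable on its own (only one active context is allowed at a time); the point here is only that the driver thread must outlive the background work.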


