Posted to issues@spark.apache.org by "Artem Aliev (JIRA)" <ji...@apache.org> on 2017/01/25 12:47:26 UTC

[jira] [Closed] (SPARK-19362) master UI kill link stops spark context but leaves it active

     [ https://issues.apache.org/jira/browse/SPARK-19362?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Artem Aliev closed SPARK-19362.
-------------------------------
    Resolution: Duplicate

> master UI kill link stops spark context but leaves it active
> ------------------------------------------------------------
>
>                 Key: SPARK-19362
>                 URL: https://issues.apache.org/jira/browse/SPARK-19362
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.2
>            Reporter: Artem Aliev
>
> SparkGremlinComputer recreates its internal SparkContext in case of error or if it was killed; see https://issues.apache.org/jira/browse/TINKERPOP-1271
> However, if the Spark application is killed from the Master UI via the (kill) link, the SparkContext is not recreated.
> It is marked as stopped but not removed from the active context holder.
> So there is no way to create a new SparkContext: the getOrCreate() method returns the old, stopped context.
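> A minimal sketch of that symptom (assuming a context that was already killed from the Master UI is still registered; the app name here is hypothetical):
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> 
> // Precondition (assumed): the application was killed from the Master UI,
> // so a stopped SparkContext is still held as the active one.
> val sc = SparkContext.getOrCreate(new SparkConf().setAppName("stale"))
> assert(sc.isStopped) // getOrCreate() hands back the old, stopped context
> {code}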
> The root of the problem is:
> sc.stop() is called from the event-loop thread. Stopping the event loop interrupts that thread, so stop() interrupts its own execution and never cleans up the active context.
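> A self-contained sketch of this failure mode (a simplified model, not the actual Spark source; all names here are hypothetical):
> {code}
> object SelfInterruptDemo {
>   // Stand-in for SparkContext's active-context holder.
>   @volatile private var activeContext: Option[AnyRef] = Some(new AnyRef)
> 
>   def stopLikeSpark(): Unit = {
>     // Stopping the event loop interrupts its thread; when stop() runs
>     // on that same thread, it effectively interrupts itself.
>     Thread.currentThread().interrupt()
>     // The next interruptible call sees the flag and throws, so the
>     // cleanup step below is never reached.
>     Thread.sleep(1)
>     activeContext = None // skipped: the holder keeps the stopped context
>   }
> 
>   def main(args: Array[String]): Unit = {
>     try stopLikeSpark()
>     catch { case _: InterruptedException => println("stop() aborted mid-way") }
>     println(s"still registered as active: ${activeContext.isDefined}")
>   }
> }
> {code}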
> You can use the following example to reproduce the problem:
> 1. Start the test application.
> 2. Kill the application from the Master UI when prompted.
> 3. See the error below.
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> 
> object StopTest {
>   def main(args: Array[String]): Unit = {
>     val sc = test()
>     println("you have 15 sec to stop the context")
>     Thread.sleep(15000)
>     if (sc.isStopped) println("SparkContext was stopped")
>     test()
>   }
> 
>   def test(): SparkContext = {
>     println("Starting context")
>     val conf = new SparkConf().setAppName("Simple Spark Application")
>     // Pass the configured conf; the original snippet built a second,
>     // empty SparkConf here, so the appName was silently dropped.
>     val sc = new SparkContext(conf)
>     sc.parallelize(Seq(1, 2, 3)).count()
>     sc
>   }
> }
> {code}
> Output:
> {code}
> SparkContext was stopped
> Starting context
> Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
> org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
> StopTest$.test(StopTest.scala:14)
> StopTest$.main(StopTest.scala:5)
> StopTest.main(StopTest.scala)
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> java.lang.reflect.Method.invoke(Method.java:497)
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 	at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2223)
> 	at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2219)
> 	at scala.Option.foreach(Option.scala:257)
> 	at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2219)
> 	at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2292)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:86)
> 	at StopTest$.test(StopTest.scala:14)
> 	at StopTest$.main(StopTest.scala:9)
> 	at StopTest.main(StopTest.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
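> As a stopgap, the error message itself points at spark.driver.allowMultipleContexts. The sketch below (a workaround, not a fix for the missing cleanup) sets it so a second context can be constructed while the stale one is still registered:
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> 
> val conf = new SparkConf()
>   .setAppName("Simple Spark Application")
>   // Downgrades the "Only one SparkContext" error to a warning, letting a
>   // fresh context start even though the old one was never unregistered.
>   .set("spark.driver.allowMultipleContexts", "true")
> val sc = new SparkContext(conf)
> {code}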


