Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/05/12 04:57:59 UTC

[jira] [Commented] (SPARK-7553) Add methods to maintain a singleton StreamingContext

    [ https://issues.apache.org/jira/browse/SPARK-7553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14539140#comment-14539140 ] 

Apache Spark commented on SPARK-7553:
-------------------------------------

User 'tdas' has created a pull request for this issue:
https://github.com/apache/spark/pull/6070

> Add methods to maintain a singleton StreamingContext 
> -----------------------------------------------------
>
>                 Key: SPARK-7553
>                 URL: https://issues.apache.org/jira/browse/SPARK-7553
>             Project: Spark
>          Issue Type: New Feature
>          Components: Streaming
>            Reporter: Tathagata Das
>            Assignee: Tathagata Das
>            Priority: Blocker
>
> In a REPL/notebook environment, it's very easy to lose the reference to a StreamingContext by rebinding the variable name. So if you happen to execute the following commands:
> {code}
> val ssc = new StreamingContext(...)   // cmd 1
> ssc.start()                           // cmd 2
> ...
> val ssc = new StreamingContext(...)   // accidentally run cmd 1 again
> {code}
> The value of ssc will be overwritten. Now you can neither start the new context (since only one context can be started at a time), nor stop the previous context (since the reference to it is lost).
> Hence it's best to maintain a singleton reference to the active context, so that the reference to the active context is never lost.
> Since this problem mainly occurs in REPL environments, it's best to add this as Experimental support in the Scala API only, so that it can be used in Scala REPLs and notebooks.
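For illustration, the singleton-reference pattern described above can be sketched in plain Scala. `DemoContext` below is a hypothetical stand-in for StreamingContext (Spark is not assumed on the classpath), and the method names are only loosely modeled on the accessors discussed in the linked pull request; treat this as a sketch of the idea, not the actual Spark API.

```scala
// A stand-in for StreamingContext: the companion object keeps a
// singleton reference to the active instance, so the context can be
// recovered even after the user's variable has been rebound.
class DemoContext(val name: String) {
  // Stopping the context clears the singleton reference.
  def stop(): Unit = DemoContext.clearActive(this)
}

object DemoContext {
  // The singleton reference to the currently active context,
  // guarded by `synchronized` for thread safety.
  private var activeContext: Option[DemoContext] = None

  private def clearActive(ctx: DemoContext): Unit = synchronized {
    if (activeContext.contains(ctx)) activeContext = None
  }

  // Return the active context if one exists; otherwise invoke the
  // creation function, register the new context, and return it.
  def getActiveOrCreate(create: () => DemoContext): DemoContext = synchronized {
    activeContext.getOrElse {
      val ctx = create()
      activeContext = Some(ctx)
      ctx
    }
  }
}
```

With this pattern, accidentally re-running the creation command is harmless: the second call returns the already-registered context instead of constructing (and orphaning) a new one, and the user can always reach the active context to stop it.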



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
