Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/04/24 02:42:39 UTC
[jira] [Updated] (SPARK-6077) Multiple spark streaming tabs on UI when reuse the same sparkcontext
[ https://issues.apache.org/jira/browse/SPARK-6077?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-6077:
-----------------------------
Assignee: zhichao-li
> Multiple spark streaming tabs on UI when reuse the same sparkcontext
> --------------------------------------------------------------------
>
> Key: SPARK-6077
> URL: https://issues.apache.org/jira/browse/SPARK-6077
> Project: Spark
> Issue Type: Bug
> Components: Streaming, Web UI
> Reporter: zhichao-li
> Assignee: zhichao-li
> Priority: Minor
> Fix For: 1.3.1, 1.4.0
>
>
> Currently we create a new streaming tab for each StreamingContext, even if one already exists on the same SparkContext. This results in duplicate StreamingTab instances on the UI, none of which works correctly.
> snapshot: https://www.dropbox.com/s/t4gd6hqyqo0nivz/bad%20multiple%20streamings.png?dl=0
> How to reproduce:
> 1)
> import org.apache.spark.SparkConf
> import org.apache.spark.streaming.{Seconds, StreamingContext}
> import org.apache.spark.storage.StorageLevel
> val ssc = new StreamingContext(sc, Seconds(1))
> val lines = ssc.socketTextStream("localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER)
> val words = lines.flatMap(_.split(" "))
> val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
> wordCounts.print()
> ssc.start()
> .....
> 2)
> ssc.stop(false)  // stop the StreamingContext but keep the underlying SparkContext
> val ssc = new StreamingContext(sc, Seconds(1))
> val lines = ssc.socketTextStream("localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER)
> val words = lines.flatMap(_.split(" "))
> val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
> wordCounts.print()
> ssc.start()
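The essence of a fix for this symptom is to make tab attachment idempotent per SparkContext: a second StreamingContext on the same SparkContext should reuse (or replace) the existing tab rather than register another one. A minimal standalone sketch of that idea follows; the `Ui` class and `attachTabIfAbsent` method are illustrative stand-ins, not Spark's actual internal API.

```scala
// Sketch only: "Ui" and "attachTabIfAbsent" are illustrative names,
// not Spark's real UI internals.
import scala.collection.mutable

class Ui {
  // Tabs keyed by name, in attachment order.
  private val tabs = mutable.LinkedHashMap.empty[String, AnyRef]

  // Attach a tab only if no tab with the same name is present;
  // returns true when the tab was actually added.
  def attachTabIfAbsent(name: String, tab: AnyRef): Boolean =
    if (tabs.contains(name)) false
    else { tabs(name) = tab; true }

  def tabNames: Seq[String] = tabs.keys.toSeq
}

object Demo extends App {
  val ui = new Ui
  println(ui.attachTabIfAbsent("Streaming", new AnyRef)) // true: first context adds the tab
  println(ui.attachTabIfAbsent("Streaming", new AnyRef)) // false: second context is a no-op
  println(ui.tabNames)
}
```

With a guard like this, re-running step 2) would leave exactly one "Streaming" tab on the shared UI instead of accumulating duplicates.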
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org