Posted to dev@zeppelin.apache.org by "Anne Rutten (JIRA)" <ji...@apache.org> on 2017/06/07 14:04:18 UTC
[jira] [Created] (ZEPPELIN-2626) StreamingContext does not get shut down when Zeppelin shuts down
Anne Rutten created ZEPPELIN-2626:
-------------------------------------
Summary: StreamingContext does not get shut down when Zeppelin shuts down
Key: ZEPPELIN-2626
URL: https://issues.apache.org/jira/browse/ZEPPELIN-2626
Project: Zeppelin
Issue Type: Bug
Affects Versions: 0.7.1
Reporter: Anne Rutten
Priority: Minor
If I set up a Structured Streaming query:
{quote}
import spark.implicits._  // needed for .as[(String, String)]

val ds1 = spark
  .readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "hatespeech01")
  .load()

ds1.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
  .as[(String, String)]
  .writeStream
  .format("console")
  .start()
  .awaitTermination()
{quote}
and then terminate Zeppelin (because the blocking call ties up the Zeppelin context and I can't stop the stream), the corresponding SparkContext doesn't get terminated.
If I restart Zeppelin, the SparkContext doesn't get re-initialised, but it isn't reachable anymore either. If I kill the process associated with Spark and then restart Zeppelin, everything works again as expected.
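A possible workaround until this is fixed in Zeppelin itself (a sketch only, using Spark's public StreamingQuery API, with the same ds1 from the snippet above): skip the blocking awaitTermination() call, keep the StreamingQuery handle that start() returns, and stop the query explicitly before shutting Zeppelin down, so the SparkContext can terminate cleanly.

```scala
// Sketch: start the query without blocking the interpreter thread.
// start() returns a StreamingQuery handle and does not block.
val query = ds1
  .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
  .as[(String, String)]
  .writeStream
  .format("console")
  .start()

// Later (e.g. from another Zeppelin paragraph), stop every active
// query and then the session, before shutting Zeppelin down:
spark.streams.active.foreach(_.stop())
spark.stop()
```

With the handle kept, the paragraph finishes instead of blocking, and the stream can be inspected with query.status or query.stop() at any time.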
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)