Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2015/05/20 23:40:59 UTC
[jira] [Commented] (SPARK-7600) Stopping Streaming Context (sometimes) crashes master
[ https://issues.apache.org/jira/browse/SPARK-7600?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14553178#comment-14553178 ]
Josh Rosen commented on SPARK-7600:
-----------------------------------
Is event logging enabled? This could be a duplicate of SPARK-6270.
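(Editor's aside for archive readers: event logging is controlled by the standard `spark.eventLog.enabled` setting, and SPARK-6270 describes the standalone master hanging while replaying event logs after an application finishes. Assuming that diagnosis applies here, the relevant configuration fragment would look like this; treat it as a possible workaround, not a confirmed fix.)

    # conf/spark-defaults.conf
    # Disabling event logging sidesteps the master's log replay on app
    # completion (workaround assumption, per SPARK-6270):
    spark.eventLog.enabled   false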
> Stopping Streaming Context (sometimes) crashes master
> -----------------------------------------------------
>
> Key: SPARK-7600
> URL: https://issues.apache.org/jira/browse/SPARK-7600
> Project: Spark
> Issue Type: Bug
> Components: Streaming
> Affects Versions: 1.3.1
> Reporter: Marius Soutier
>
> In my streaming job (which uses actor streams), I stop the Spark Streaming context via ssc.stop(stopSparkContext = true, stopGracefully = true). Sometimes this leaves the Spark master in a permanent error state in which it displays an error page instead of the UI.
> The following is being logged when trying to access the master UI:
> 15/05/13 15:57:15 WARN jetty.servlet.ServletHandler: /
> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at org.apache.spark.deploy.master.ui.MasterPage.render(MasterPage.scala:47)
> at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
> at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
> at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:69)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
> at org.spark-project.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
> at org.spark-project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
> at org.spark-project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
> at org.spark-project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
> at org.spark-project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
> at org.spark-project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
> at org.spark-project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
> at org.spark-project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
> at org.spark-project.jetty.server.Server.handle(Server.java:370)
> at org.spark-project.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
> at org.spark-project.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971)
> at org.spark-project.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033)
> at org.spark-project.jetty.http.HttpParser.parseNext(HttpParser.java:644)
> at org.spark-project.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
> at org.spark-project.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
> at org.spark-project.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
> at org.spark-project.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
> at org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
> at org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
> at java.lang.Thread.run(Thread.java:745)
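(Editor's aside: for readers unfamiliar with the shutdown pattern described in the report, it looks roughly like the sketch below. This is a minimal illustration, not the reporter's actual code; the app name, batch interval, and omitted actor-stream setup are all assumptions.)

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object GracefulStopSketch {
      def main(args: Array[String]): Unit = {
        // App name and batch interval are placeholders
        val conf = new SparkConf().setAppName("streaming-job")
        val ssc  = new StreamingContext(conf, Seconds(10))
        // ... define actor-based input streams and transformations here ...
        ssc.start()
        // Stop the streaming context AND the underlying SparkContext,
        // waiting for in-flight batches to complete first. This is the
        // call the report says sometimes leaves the master broken.
        ssc.stop(stopSparkContext = true, stopGracefully = true)
      }
    }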
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org