Posted to issues@spark.apache.org by "Chris Bogan (JIRA)" <ji...@apache.org> on 2019/01/16 03:27:00 UTC
[jira] [Updated] (SPARK-26518) UI Application Info Race Condition Can Throw NoSuchElement
[ https://issues.apache.org/jira/browse/SPARK-26518?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Chris Bogan updated SPARK-26518:
--------------------------------
Attachment: 15476091405552344590691778159589.jpg
> UI Application Info Race Condition Can Throw NoSuchElement
> ----------------------------------------------------------
>
> Key: SPARK-26518
> URL: https://issues.apache.org/jira/browse/SPARK-26518
> Project: Spark
> Issue Type: Bug
> Components: Web UI
> Affects Versions: 2.3.0, 2.4.0
> Reporter: Russell Spitzer
> Priority: Trivial
> Attachments: 15476091405552344590691778159589.jpg
>
>
> There is a slight race condition in [AppStatusStore|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/status/AppStatusStore.scala#L39],
> which calls `next` on the returned iterator even when it is empty, which it can be for a short period after the UI is up but before the store is populated.
> {code}
> <head>
> <meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
> <title>Error 500 Server Error</title>
> </head>
> <body><h2>HTTP ERROR 500</h2>
> <p>Problem accessing /jobs/. Reason:
> <pre> Server Error</pre></p><h3>Caused by:</h3><pre>java.util.NoSuchElementException
> at java.util.Collections$EmptyIterator.next(Collections.java:4189)
> at org.apache.spark.util.kvstore.InMemoryStore$InMemoryIterator.next(InMemoryStore.java:281)
> at org.apache.spark.status.AppStatusStore.applicationInfo(AppStatusStore.scala:38)
> at org.apache.spark.ui.jobs.AllJobsPage.render(AllJobsPage.scala:275)
> at org.apache.spark.ui.WebUI$$anonfun$3.apply(WebUI.scala:86)
> at org.apache.spark.ui.WebUI$$anonfun$3.apply(WebUI.scala:86)
> at org.apache.spark.ui.JettyUtils$$anon$3.doGet(JettyUtils.scala:90)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
> at org.spark_project.jetty.servlet.ServletHolder.handle(ServletHolder.java:865)
> at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:535)
> at org.spark_project.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
> at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1317)
> at org.spark_project.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
> at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)
> at org.spark_project.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
> at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1219)
> at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
> at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:724)
> at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:219)
> at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
> at org.spark_project.jetty.server.Server.handle(Server.java:531)
> at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:352)
> at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
> at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:281)
> at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:102)
> at org.spark_project.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
> at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:762)
> at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:680)
> at java.lang.Thread.run(Thread.java:748)
> {code}
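A minimal Scala sketch of the pattern behind the trace above (hypothetical names, not Spark's actual code or fix): calling `next()` on a possibly-empty KVStore iterator throws `NoSuchElementException`, while a `hasNext` guard lets the caller handle the briefly-empty store gracefully.

```scala
// Hypothetical, simplified illustration of the race: during a short
// window after the UI starts, the store's iterator may be empty, so an
// unguarded next() throws java.util.NoSuchElementException.
object AppInfoSketch {
  // Stand-in for reading the first entry from a KVStore view iterator.
  // Mirrors the buggy pattern: throws if the iterator is empty.
  def unsafeFirst[A](it: Iterator[A]): A = it.next()

  // Guarded alternative: report absence instead of throwing, so the
  // page handler can render a placeholder or retry.
  def safeFirst[A](it: Iterator[A]): Option[A] =
    if (it.hasNext) Some(it.next()) else None
}
```

This only illustrates why guarding the empty-iterator window avoids the HTTP 500; the remedy actually applied in Spark may differ.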
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)