Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2018/02/05 22:18:00 UTC

[jira] [Resolved] (SPARK-23330) Spark UI SQL executions page throws NPE

     [ https://issues.apache.org/jira/browse/SPARK-23330?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-23330.
------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0

Issue resolved by pull request 20502
[https://github.com/apache/spark/pull/20502]

> Spark UI SQL executions page throws NPE
> ---------------------------------------
>
>                 Key: SPARK-23330
>                 URL: https://issues.apache.org/jira/browse/SPARK-23330
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 2.3.0
>            Reporter: Jiang Xingbo
>            Assignee: Jiang Xingbo
>            Priority: Blocker
>             Fix For: 2.3.0
>
>
> Spark UI SQL executions page throws the following error and the page crashes:
> {code}
>  HTTP ERROR 500
>  Problem accessing /SQL/. Reason:
> Server Error
>  Caused by:
>  java.lang.NullPointerException
>  at scala.collection.immutable.StringOps$.length$extension(StringOps.scala:47)
>  at scala.collection.immutable.StringOps.length(StringOps.scala:47)
>  at scala.collection.IndexedSeqOptimized$class.isEmpty(IndexedSeqOptimized.scala:27)
>  at scala.collection.immutable.StringOps.isEmpty(StringOps.scala:29)
>  at scala.collection.TraversableOnce$class.nonEmpty(TraversableOnce.scala:111)
>  at scala.collection.immutable.StringOps.nonEmpty(StringOps.scala:29)
>  at org.apache.spark.sql.execution.ui.ExecutionTable.descriptionCell(AllExecutionsPage.scala:182)
>  at org.apache.spark.sql.execution.ui.ExecutionTable.row(AllExecutionsPage.scala:155)
>  at org.apache.spark.sql.execution.ui.ExecutionTable$$anonfun$8.apply(AllExecutionsPage.scala:204)
>  at org.apache.spark.sql.execution.ui.ExecutionTable$$anonfun$8.apply(AllExecutionsPage.scala:204)
>  at org.apache.spark.ui.UIUtils$$anonfun$listingTable$2.apply(UIUtils.scala:339)
>  at org.apache.spark.ui.UIUtils$$anonfun$listingTable$2.apply(UIUtils.scala:339)
>  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
>  at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>  at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>  at org.apache.spark.ui.UIUtils$.listingTable(UIUtils.scala:339)
>  at org.apache.spark.sql.execution.ui.ExecutionTable.toNodeSeq(AllExecutionsPage.scala:203)
>  at org.apache.spark.sql.execution.ui.AllExecutionsPage.render(AllExecutionsPage.scala:67)
>  at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:82)
>  at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:82)
>  at org.apache.spark.ui.JettyUtils$$anon$3.doGet(JettyUtils.scala:90)
>  at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
>  at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
>  at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
>  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:584)
>  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
>  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
>  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
>  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>  at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
>  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
>  at org.eclipse.jetty.server.Server.handle(Server.java:534)
>  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
>  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
>  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
>  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108)
>  at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
>  at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
>  at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
>  at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
>  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
>  at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
>  at java.lang.Thread.run(Thread.java:748)
> {code}
>  The bug seems to have been introduced by [https://github.com/apache/spark/pull/19681/files#diff-a74d84702d8d47d5269e96740a55a3caR63]
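The trace above bottoms out in Scala's implicit `StringOps` conversion: calling `.nonEmpty` on a `String` reference that is null throws a `NullPointerException` inside `StringOps.length`, which is what `descriptionCell` hit when an execution's description was null. A minimal sketch of the failure mode and a defensive variant (hypothetical names, not the actual Spark fix):

```scala
// Sketch of the NPE mechanism seen in descriptionCell, plus a null-safe
// alternative. Names are illustrative, not from the Spark codebase.
object NpeSketch {
  // Mirrors a description field that may legitimately be null.
  // null is silently wrapped by the implicit StringOps conversion,
  // and StringOps.length then dereferences it -> NullPointerException.
  def unsafeNonEmpty(description: String): Boolean =
    description.nonEmpty

  // Defensive variant: lift the possibly-null value into Option first.
  def safeNonEmpty(description: String): Boolean =
    Option(description).exists(_.nonEmpty)

  def main(args: Array[String]): Unit = {
    val threw =
      try { unsafeNonEmpty(null); false }
      catch { case _: NullPointerException => true }
    println(threw)                     // the unsafe call throws
    println(safeNonEmpty(null))        // false, no exception
    println(safeNonEmpty("select 1"))  // true
  }
}
```

`Option(x)` maps null to `None`, so `exists(_.nonEmpty)` never dereferences a null string; an equivalent guard is `description != null && description.nonEmpty`.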



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org