Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/07/23 00:53:04 UTC

[jira] [Assigned] (SPARK-8889) showDagViz will cause java.lang.OutOfMemoryError: Java heap space

     [ https://issues.apache.org/jira/browse/SPARK-8889?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-8889:
-----------------------------------

    Assignee: Apache Spark

> showDagViz will cause java.lang.OutOfMemoryError: Java heap space
> -----------------------------------------------------------------
>
>                 Key: SPARK-8889
>                 URL: https://issues.apache.org/jira/browse/SPARK-8889
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 1.4.0
>         Environment: Spark 1.4.0
> Hadoop 2.2.0
>            Reporter: cen yuhai
>            Assignee: Apache Spark
>             Fix For: 1.4.2
>
>
> HTTP ERROR 500
> Problem accessing /history/app-20150708101140-0018/jobs/job/. Reason:
>     Server Error
> Caused by:
> java.lang.OutOfMemoryError: Java heap space
> 	at java.util.Arrays.copyOf(Arrays.java:2367)
> 	at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
> 	at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114)
> 	at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:415)
> 	at java.lang.StringBuilder.append(StringBuilder.java:132)
> 	at scala.collection.mutable.StringBuilder.append(StringBuilder.scala:207)
> 	at org.apache.spark.ui.scope.RDDOperationGraph$$anonfun$org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph$2.apply(RDDOperationGraph.scala:192)
> 	at org.apache.spark.ui.scope.RDDOperationGraph$$anonfun$org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph$2.apply(RDDOperationGraph.scala:191)
> 	at scala.collection.immutable.Stream.foreach(Stream.scala:547)
> 	at org.apache.spark.ui.scope.RDDOperationGraph$.org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph(RDDOperationGraph.scala:191)
> 	at org.apache.spark.ui.scope.RDDOperationGraph$.makeDotFile(RDDOperationGraph.scala:170)
> 	at org.apache.spark.ui.UIUtils$$anonfun$showDagViz$1.apply(UIUtils.scala:361)
> 	at org.apache.spark.ui.UIUtils$$anonfun$showDagViz$1.apply(UIUtils.scala:357)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> 	at scala.collection.immutable.List.foreach(List.scala:318)
> 	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
> 	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
> 	at org.apache.spark.ui.UIUtils$.showDagViz(UIUtils.scala:357)
> 	at org.apache.spark.ui.UIUtils$.showDagVizForJob(UIUtils.scala:335)
> 	at org.apache.spark.ui.jobs.JobPage.render(JobPage.scala:317)
> 	at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
> 	at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
> 	at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:69)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
> 	at org.spark-project.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
> 	at org.spark-project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
> 	at org.spark-project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
> 	at org.spark-project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
> 	at org.spark-project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
> 	at org.spark-project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
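For context on the trace: `RDDOperationGraph.makeDotSubgraph` renders the job's entire DAG visualization into a single in-memory DOT string, and every `StringBuilder.append` that outgrows the current capacity copies the backing array (`Arrays.copyOf`), which is exactly the frame where the heap runs out above. The sketch below is a hypothetical illustration of that pattern, not the actual Spark code; the class name `DotSketch`, the method `makeDotSubgraphs`, and the stage count are all invented for demonstration.

```java
// Hypothetical sketch of the failure pattern in the trace: the whole DAG
// visualization is assembled into one in-memory DOT string. Each append
// that exceeds the builder's capacity triggers Arrays.copyOf, so a
// sufficiently large job graph exhausts the heap before the string is done.
public class DotSketch {
    // Emits one DOT subgraph per "stage"; numStages is illustrative only.
    static String makeDotSubgraphs(int numStages) {
        StringBuilder sb = new StringBuilder("digraph G {\n");
        for (int i = 0; i < numStages; i++) {
            sb.append("  subgraph cluster_").append(i)
              .append(" { label=\"Stage ").append(i).append("\"; }\n");
        }
        sb.append("}\n");
        return sb.toString();
    }

    public static void main(String[] args) {
        String dot = makeDotSubgraphs(3);
        // The buffer grows linearly with the graph; in the reported job the
        // DAG was large enough that it no longer fit in the heap.
        System.out.println(dot.length());
    }
}
```

A small graph like this is harmless; the point is only that the memory cost is proportional to the size of the rendered DAG, which is why very large jobs hit the error at `makeDotSubgraph` rather than anywhere else in the page render.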



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org