Posted to issues@spark.apache.org by "meiyoula (JIRA)" <ji...@apache.org> on 2015/06/16 10:54:00 UTC

[jira] [Created] (SPARK-8391) showDagViz throws OutOfMemoryError, and the whole jobPage throws ERROR

meiyoula created SPARK-8391:
-------------------------------

             Summary: showDagViz throws OutOfMemoryError, and the whole jobPage throws ERROR
                 Key: SPARK-8391
                 URL: https://issues.apache.org/jira/browse/SPARK-8391
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
            Reporter: meiyoula


When the job is big and has many DAG nodes and edges, showDagViz throws an OutOfMemoryError and the whole jobPage render fails. I think this is unsuitable: a single page element should not be able to bring down the whole page.
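
Something like the guard below could prevent this: skip the visualization when the graph is too large, and fall back to a small notice if rendering still fails. This is only a rough sketch; the class, threshold, and helper names are placeholders, not Spark's actual API.

{code}
import scala.util.control.NonFatal
import scala.xml.NodeSeq

// Hypothetical stand-in for the real RDD operation graph; only the sizes matter here.
case class DagGraph(numNodes: Int, numEdges: Int)

object SafeDagViz {
  // Assumed cutoff, not an actual Spark configuration.
  val maxVizElements = 10000

  def renderSafely(graphs: Seq[DagGraph])(render: Seq[DagGraph] => NodeSeq): NodeSeq = {
    val elements = graphs.map(g => g.numNodes + g.numEdges).sum
    if (elements > maxVizElements) {
      // Graph is too big: show a notice instead of building a huge DOT string.
      <div class="alert">DAG too large to visualize ({elements} elements)</div>
    } else {
      try render(graphs)
      catch {
        // Guard so other rendering failures do not fail the whole page.
        case NonFatal(e) => <div class="alert">DAG visualization failed: {e.getMessage}</div>
      }
    }
  }
}
{code}

Note that NonFatal deliberately does not catch OutOfMemoryError, so the size check is the part that would actually avoid the OOM seen here; the catch only keeps other rendering failures from breaking the page.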

Below is the exception stack trace:
java.lang.OutOfMemoryError: Requested array size exceeds VM limit
        at java.util.Arrays.copyOf(Arrays.java:3332)
        at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:137)
        at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:121)
        at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:421)
        at java.lang.StringBuilder.append(StringBuilder.java:136)
        at scala.collection.mutable.StringBuilder.append(StringBuilder.scala:207)
        at org.apache.spark.ui.scope.RDDOperationGraph$$anonfun$makeDotFile$1.apply(RDDOperationGraph.scala:171)
        at org.apache.spark.ui.scope.RDDOperationGraph$$anonfun$makeDotFile$1.apply(RDDOperationGraph.scala:171)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
        at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
        at org.apache.spark.ui.scope.RDDOperationGraph$.makeDotFile(RDDOperationGraph.scala:171)
        at org.apache.spark.ui.UIUtils$$anonfun$showDagViz$1.apply(UIUtils.scala:389)
        at org.apache.spark.ui.UIUtils$$anonfun$showDagViz$1.apply(UIUtils.scala:385)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at org.apache.spark.ui.UIUtils$.showDagViz(UIUtils.scala:385)
        at org.apache.spark.ui.UIUtils$.showDagVizForJob(UIUtils.scala:363)
        at org.apache.spark.ui.jobs.JobPage.render(JobPage.scala:317)
        at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
        at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
        at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:75)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
        at org.sparkproject.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
        at org.sparkproject.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1496)
        at com.huawei.spark.web.filter.SessionTimeoutFilter.doFilter(SessionTimeoutFilter.java:80)
        at org.sparkproject.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)
        at org.jasig.cas.client.util.HttpServletRequestWrapperFilter.doFilter(HttpServletRequestWrapperFilter.java:75
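
From the trace, the DOT description of the whole graph is accumulated in a single StringBuilder inside makeDotFile (one append per node/edge), so a sufficiently large DAG produces a string bigger than a single JVM array allows. Roughly the failing pattern (placeholder data, not the actual Spark code):

{code}
// One append per edge into a single StringBuilder; with enough edges the
// builder's backing array eventually exceeds the VM's maximum array size.
val edges: Seq[(String, String)] = Seq("stage0" -> "stage1", "stage1" -> "stage2")  // placeholder data
val dot = new StringBuilder("digraph G {\n")
edges.foreach { case (from, to) =>
  dot.append("  \"" + from + "\" -> \"" + to + "\";\n")
}
dot.append("}\n")
println(dot.toString)
{code}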



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org