Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/08/07 08:13:12 UTC

[jira] [Created] (SPARK-2899) Doc generation is not working in new SBT Build

Patrick Wendell created SPARK-2899:
--------------------------------------

             Summary: Doc generation is not working in new SBT Build
                 Key: SPARK-2899
                 URL: https://issues.apache.org/jira/browse/SPARK-2899
             Project: Spark
          Issue Type: Sub-task
          Components: Build
            Reporter: Patrick Wendell
            Assignee: Prashant Sharma


I noticed some errors when building the docs:

{code}
[error] /home/ubuntu/release/spark/core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala:120: type mismatch;
[error]  found   : org.apache.spark.ui.jobs.TaskUIData
[error]  required: org.apache.spark.ui.jobs.UIData.TaskUIData
[error]       stageData.taskData.put(taskInfo.taskId, new TaskUIData(taskInfo))
[error]                                               ^
[error] /home/ubuntu/release/spark/core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala:142: type mismatch;
[error]  found   : org.apache.spark.ui.jobs.ExecutorSummary
[error]  required: org.apache.spark.ui.jobs.UIData.ExecutorSummary
[error]       val execSummary = execSummaryMap.getOrElseUpdate(info.executorId, new ExecutorSummary)
[error]                                                                         ^
[error] /home/ubuntu/release/spark/core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala:171: type mismatch;
[error]  found   : org.apache.spark.ui.jobs.TaskUIData
[error]  required: org.apache.spark.ui.jobs.UIData.TaskUIData
[error]       val taskData = stageData.taskData.getOrElseUpdate(info.taskId, new TaskUIData(info))
{code}
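All three errors follow the same pattern: the compiler resolves the bare name (e.g. {{TaskUIData}}) to a top-level class in {{org.apache.spark.ui.jobs}}, while the call site expects the class nested inside {{UIData}}. A minimal sketch of that situation and the fix (names mirror the error messages; this is an illustration, not Spark's actual sources):

{code}
// Hypothetical sketch: after a refactor, TaskUIData lives inside the
// UIData object, so code still resolving the old top-level name gets a
// "found X, required UIData.X" type mismatch.
object UIData {
  class TaskUIData(val taskId: Long) // new location: nested in UIData
}

object JobProgressListenerSketch {
  // Importing the nested type explicitly makes the bare name resolve
  // to UIData.TaskUIData, which is what the caller requires.
  import UIData.TaskUIData

  def makeTask(id: Long): TaskUIData = new TaskUIData(id)
}
{code}

In the doc build this usually means a stale or duplicate copy of the old top-level classes is still on the scaladoc classpath, so both definitions coexist and the wrong one wins name resolution.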



--
This message was sent by Atlassian JIRA
(v6.2#6252)
