Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/08/01 20:09:44 UTC

[jira] [Resolved] (SPARK-2099) Report TaskMetrics for running tasks

     [ https://issues.apache.org/jira/browse/SPARK-2099?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell resolved SPARK-2099.
------------------------------------

       Resolution: Fixed
    Fix Version/s: 1.1.0

Issue resolved by pull request 1056
[https://github.com/apache/spark/pull/1056]

> Report TaskMetrics for running tasks
> ------------------------------------
>
>                 Key: SPARK-2099
>                 URL: https://issues.apache.org/jira/browse/SPARK-2099
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Sandy Ryza
>            Assignee: Sandy Ryza
>            Priority: Critical
>             Fix For: 1.1.0
>
>
> Spark currently collects a set of helpful task metrics, such as shuffle bytes written and GC time, and displays them on the app web UI.  These are only collected and displayed for tasks that have completed.  This makes them unavailable in the situation where they would perhaps be most useful: determining what's going wrong in currently running tasks.
> Reporting metrics progress for running tasks would probably require adding an executor->driver heartbeat that reports metrics for all tasks currently running on the executor.



--
This message was sent by Atlassian JIRA
(v6.2#6252)