Posted to issues@spark.apache.org by "Andrew Ash (JIRA)" <ji...@apache.org> on 2014/09/05 23:11:28 UTC

[jira] [Commented] (SPARK-2099) Report TaskMetrics for running tasks

    [ https://issues.apache.org/jira/browse/SPARK-2099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14123584#comment-14123584 ] 

Andrew Ash commented on SPARK-2099:
-----------------------------------

I just gave this a run-through, and most of the metrics above are live-updated, but the GC one isn't.  I think that's because it's not included in updateAggregateMetrics here: https://github.com/apache/spark/pull/1056/files#diff-1f32bcb61f51133bd0959a4177a066a5R175

Should I open a new ticket to make GC time live-updated?  That's the metric I was most excited to see live-updating.
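
For context, here is a minimal, self-contained sketch of the delta-based aggregation that would need to cover GC time as well.  All class and member names below (TaskMetricsSnapshot, ExecutorSummary, updateAggregateMetrics) are illustrative stand-ins, not the actual code in that PR:

    // Sketch: aggregate metrics as deltas from the previous snapshot, so
    // repeated updates for a still-running task don't double-count.
    case class TaskMetricsSnapshot(shuffleBytesWritten: Long, jvmGCTime: Long)

    class ExecutorSummary {
      var shuffleWrite: Long = 0L
      var gcTime: Long = 0L
    }

    object AggregateMetricsSketch {
      def updateAggregateMetrics(
          summary: ExecutorSummary,
          current: TaskMetricsSnapshot,
          previous: Option[TaskMetricsSnapshot]): Unit = {
        val old = previous.getOrElse(TaskMetricsSnapshot(0L, 0L))
        // Metrics that already live-update are folded in as deltas...
        summary.shuffleWrite += current.shuffleBytesWritten - old.shuffleBytesWritten
        // ...and GC time would need the same treatment to live-update.
        summary.gcTime += current.jvmGCTime - old.jvmGCTime
      }

      def main(args: Array[String]): Unit = {
        val summary = new ExecutorSummary
        val first = TaskMetricsSnapshot(shuffleBytesWritten = 100, jvmGCTime = 15)
        updateAggregateMetrics(summary, first, None)
        updateAggregateMetrics(summary, TaskMetricsSnapshot(250, 40), Some(first))
        println(s"shuffleWrite=${summary.shuffleWrite} gcTime=${summary.gcTime}")  // 250 40
      }
    }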

> Report TaskMetrics for running tasks
> ------------------------------------
>
>                 Key: SPARK-2099
>                 URL: https://issues.apache.org/jira/browse/SPARK-2099
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Sandy Ryza
>            Assignee: Sandy Ryza
>            Priority: Critical
>             Fix For: 1.1.0
>
>
> Spark currently collects a set of helpful task metrics, like shuffle bytes written and GC time, and displays them on the app web UI.  These are only collected and displayed for tasks that have completed.  This makes them ill-suited to the situation where they would perhaps be most useful - determining what's going wrong in currently running tasks.
> Reporting metrics progress for running tasks would probably require adding an executor->driver heartbeat that reports metrics for all tasks currently running on the executor.
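
A rough sketch of what such an executor->driver heartbeat could look like.  Everything here (Heartbeat, TaskMetricsUpdate, ExecutorHeartbeater, the send callback) is a hypothetical illustration of the idea, not Spark's actual RPC layer:

    import java.util.concurrent.{Executors, TimeUnit}
    import scala.collection.concurrent.TrieMap

    // One per-task metrics snapshot, reported while the task is still running.
    case class TaskMetricsUpdate(taskId: Long, shuffleBytesWritten: Long, jvmGCTime: Long)

    // The periodic message an executor sends to the driver.
    case class Heartbeat(executorId: String, updates: Seq[TaskMetricsUpdate])

    class ExecutorHeartbeater(
        executorId: String,
        send: Heartbeat => Unit,  // transport to the driver (RPC, etc.)
        intervalMs: Long) {

      // Metrics for tasks currently running on this executor; task runners
      // overwrite their entry whenever their metrics change.
      val runningTasks = TrieMap.empty[Long, TaskMetricsUpdate]

      private val scheduler = Executors.newSingleThreadScheduledExecutor()

      def start(): Unit = {
        val beat = new Runnable {
          def run(): Unit = send(Heartbeat(executorId, runningTasks.values.toSeq))
        }
        scheduler.scheduleAtFixedRate(beat, intervalMs, intervalMs, TimeUnit.MILLISECONDS)
      }

      def stop(): Unit = scheduler.shutdown()
    }

On the driver side, each Heartbeat would be folded into the per-stage UI data with delta-based aggregation like the earlier sketch, so the web UI can show metrics for tasks that haven't finished yet.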


