Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/01/19 21:18:39 UTC

[jira] [Commented] (SPARK-12895) Implement TaskMetrics using accumulators

    [ https://issues.apache.org/jira/browse/SPARK-12895?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15107358#comment-15107358 ] 

Apache Spark commented on SPARK-12895:
--------------------------------------

User 'andrewor14' has created a pull request for this issue:
https://github.com/apache/spark/pull/10835

> Implement TaskMetrics using accumulators
> ----------------------------------------
>
>                 Key: SPARK-12895
>                 URL: https://issues.apache.org/jira/browse/SPARK-12895
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>
> We need to do this first before we can stop sending TaskMetrics from the executors to the driver. Once it is done, executors can send only accumulator updates instead of both accumulator updates AND TaskMetrics.
> By the end of this issue, TaskMetrics will be a thin wrapper around accumulators: just syntactic sugar for setting them.
> But first, we need to express everything in TaskMetrics as accumulators.
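The wrapper idea in the description can be sketched roughly as follows. This is an illustrative Scala sketch, not the actual Spark implementation: `LongAccum`, `TaskMetricsSketch`, and the metric names are hypothetical stand-ins for Spark's internal accumulator types and fields.

```scala
// Hypothetical stand-in for an internal long-valued accumulator.
// (Spark's real accumulator classes differ; this only illustrates the shape.)
class LongAccum(val name: String) {
  private var _value: Long = 0L
  def add(delta: Long): Unit = { _value += delta }
  def value: Long = _value
}

// TaskMetrics as syntactic sugar over accumulators: each metric is backed by
// an accumulator, and the setters merely update that accumulator. The driver
// then only needs the accumulator updates to reconstruct the metrics.
class TaskMetricsSketch {
  val bytesRead   = new LongAccum("bytesRead")
  val recordsRead = new LongAccum("recordsRead")

  def incBytesRead(n: Long): Unit   = bytesRead.add(n)
  def incRecordsRead(n: Long): Unit = recordsRead.add(n)

  // What an executor would ship back instead of a full TaskMetrics object.
  def accumUpdates: Map[String, Long] =
    Map(bytesRead.name -> bytesRead.value,
        recordsRead.name -> recordsRead.value)
}
```

Under this sketch, a task would call `incBytesRead`/`incRecordsRead` during execution, and only the `accumUpdates` map would travel from executor to driver.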



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org