Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/01/18 21:57:39 UTC

[jira] [Assigned] (SPARK-12887) Do not expose var's in TaskMetrics

     [ https://issues.apache.org/jira/browse/SPARK-12887?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-12887:
------------------------------------

    Assignee: Apache Spark  (was: Andrew Or)

> Do not expose var's in TaskMetrics
> ----------------------------------
>
>                 Key: SPARK-12887
>                 URL: https://issues.apache.org/jira/browse/SPARK-12887
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>            Reporter: Andrew Or
>            Assignee: Apache Spark
>
> TaskMetrics has a number of vars; some are fully public, others private[spark]. This is poor coding style that makes it easy to accidentally overwrite previously set metrics. This has happened a few times in the past and caused bugs that were difficult to debug.
> Instead, we should use get-or-create semantics, which are easier to reason about. This makes sense for TaskMetrics because these are just aggregated metrics that we want to collect throughout the task, so it does not matter *who* increments them.
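The get-or-create pattern described above can be sketched roughly as follows. This is a minimal illustration, not the actual Spark implementation; the class and method names (TaskMetricsSketch, registerInputMetrics, incBytesRead) are hypothetical.

```scala
// Hypothetical sketch of get-or-create semantics for task metrics.
// Callers can never replace an existing metrics object, only obtain
// the shared instance and increment it.
class InputMetrics {
  private var _bytesRead: Long = 0L
  def bytesRead: Long = _bytesRead
  def incBytesRead(n: Long): Unit = { _bytesRead += n }
}

class TaskMetricsSketch {
  private var _inputMetrics: Option[InputMetrics] = None

  // Get-or-create: idempotent, so repeated callers share the same
  // accumulator instead of clobbering each other's values.
  def registerInputMetrics(): InputMetrics = synchronized {
    _inputMetrics.getOrElse {
      val m = new InputMetrics
      _inputMetrics = Some(m)
      m
    }
  }

  def inputMetrics: Option[InputMetrics] = _inputMetrics
}
```

Because registerInputMetrics always returns the same instance, two code paths that both "create" the metrics end up incrementing one shared counter rather than one silently overwriting the other, which is the class of bug the issue describes.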



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org