Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/01/16 14:34:39 UTC
[jira] [Resolved] (SPARK-1972) Add support for setting and visualizing custom task-related metrics
[ https://issues.apache.org/jira/browse/SPARK-1972?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-1972.
------------------------------
Resolution: Won't Fix
> Add support for setting and visualizing custom task-related metrics
> -------------------------------------------------------------------
>
> Key: SPARK-1972
> URL: https://issues.apache.org/jira/browse/SPARK-1972
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 0.9.1
> Reporter: Kalpit Shah
> Original Estimate: 72h
> Remaining Estimate: 72h
>
> Various RDDs may want to set and track custom metrics for improved monitoring and performance tuning. For example:
> 1. A Task involving a JdbcRDD may want to track some metric related to JDBC execution.
> 2. A Task involving a user-defined RDD may want to track some metric specific to the user's application.
> We currently use TaskMetrics to track task-related metrics, but it provides no way to track custom ones. Introducing a new field in TaskMetrics every time we want to track a custom metric would be cumbersome and ugly. Besides, some of these custom metrics may only make sense for a specific RDD subclass. TaskMetrics should therefore provide a generic way for RDD subclasses to track custom metrics when computing partitions.
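To make the proposal concrete, here is a minimal, self-contained Scala sketch of the kind of generic extension the description asks for: a TaskMetrics-like class exposing a named-counter map instead of a fixed field per metric. All names here (TaskMetricsSketch, incCustomMetric, the "jdbc.rowsFetched" key) are hypothetical illustrations, not actual Spark API:

```scala
import scala.collection.mutable

// Hypothetical sketch of a generic custom-metrics facility that a
// TaskMetrics-like class could expose. RDD subclasses would record
// arbitrary named counters rather than requiring a new field in
// TaskMetrics for every metric.
class TaskMetricsSketch {
  // Named counters, defaulting to zero for metrics not yet recorded.
  private val custom = mutable.Map.empty[String, Long].withDefaultValue(0L)

  // Add `delta` to the counter named `name`.
  def incCustomMetric(name: String, delta: Long): Unit =
    custom(name) = custom(name) + delta

  // Immutable snapshot, e.g. for reporting to the UI after the task ends.
  def customMetrics: Map[String, Long] = custom.toMap
}

object Demo {
  def main(args: Array[String]): Unit = {
    val metrics = new TaskMetricsSketch
    // e.g. a JdbcRDD-like computation recording rows fetched per batch
    metrics.incCustomMetric("jdbc.rowsFetched", 500)
    metrics.incCustomMetric("jdbc.rowsFetched", 250)
    println(metrics.customMetrics("jdbc.rowsFetched")) // prints 750
  }
}
```

A map of named counters keeps the TaskMetrics surface stable while letting RDD subclasses define metrics that only make sense for them, which is the crux of the request.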
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org