Posted to issues@spark.apache.org by "Sreepathi Prasanna (JIRA)" <ji...@apache.org> on 2014/10/10 23:26:34 UTC

[jira] [Commented] (SPARK-3901) Add SocketSink capability for Spark metrics

    [ https://issues.apache.org/jira/browse/SPARK-3901?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14167541#comment-14167541 ] 

Sreepathi Prasanna commented on SPARK-3901:
-------------------------------------------

For this, we need a SocketReporter class in the Coda Hale metrics library, for which I have submitted a pull request:

https://github.com/dropwizard/metrics/pull/685

Once this is reviewed and merged into the Coda Hale metrics library, we can add a SocketSink class in Spark that uses it to send the metrics over a socket.
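
As a rough sketch of what the Spark side could look like once that happens: the constructor below mirrors the existing ConsoleSink/CsvSink signature, but the SocketReporter builder calls and the host/port/period property names are assumptions, since the reporter API in the pull request is not finalized yet.

    package org.apache.spark.metrics.sink

    import java.util.Properties
    import java.util.concurrent.TimeUnit

    import com.codahale.metrics.MetricRegistry
    // SocketReporter is the class proposed in dropwizard/metrics#685; the builder-style
    // API used below is assumed by analogy with the existing reporters and may change.
    import com.codahale.metrics.SocketReporter

    import org.apache.spark.SecurityManager

    private[spark] class SocketSink(
        val property: Properties,
        val registry: MetricRegistry,
        securityMgr: SecurityManager)
      extends Sink {

      // Host, port and reporting period would come from the metrics.properties entries
      // for this sink; the key names are illustrative, not a settled contract.
      private val host = property.getProperty("host", "localhost")
      private val port = property.getProperty("port", "9999").toInt
      private val pollPeriod = property.getProperty("period", "10").toInt

      // Build a reporter that pushes the registry's metrics to the given host and port.
      private val reporter = SocketReporter.forRegistry(registry)
        .convertRatesTo(TimeUnit.SECONDS)
        .convertDurationsTo(TimeUnit.MILLISECONDS)
        .build(host, port)

      override def start(): Unit = reporter.start(pollPeriod, TimeUnit.SECONDS)

      override def stop(): Unit = reporter.stop()
    }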

> Add SocketSink capability for Spark metrics
> -------------------------------------------
>
>                 Key: SPARK-3901
>                 URL: https://issues.apache.org/jira/browse/SPARK-3901
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 1.0.0, 1.1.0
>            Reporter: Sreepathi Prasanna
>            Priority: Minor
>             Fix For: 1.1.1
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> Spark depends on the Coda Hale metrics library to collect metrics. Today we can send metrics to the console, CSV files, and JMX. We use Chukwa as the monitoring framework for our Hadoop services. To extend that framework to collect Spark metrics as well, we need a SocketSink capability, which does not exist in Spark at the moment.
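
For context, the built-in sinks are enabled through conf/metrics.properties, and a socket sink would presumably be wired up the same way. The console entries below use the existing, supported format; the socket entries (class name and host/port keys) are hypothetical until such a sink actually exists.

    # Existing built-in sink, already supported today:
    *.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
    *.sink.console.period=10
    *.sink.console.unit=seconds

    # Hypothetical entries for a future socket sink:
    *.sink.socket.class=org.apache.spark.metrics.sink.SocketSink
    *.sink.socket.host=chukwa-collector.example.com
    *.sink.socket.port=9999
    *.sink.socket.period=10
    *.sink.socket.unit=seconds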



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org