Posted to dev@spark.apache.org by abshkmodi <ab...@gmail.com> on 2015/06/17 08:00:12 UTC

Read/write metrics for jobs which use S3

I mostly use Amazon S3 for reading input data and writing output data in my
Spark jobs. I want to know the number of bytes my job reads from and writes
to S3.

In Hadoop, there are FileSystemCounters for this; is there something similar
in Spark? If there is, can you please guide me on how to use it?

I saw that there are some read/write metrics in TaskMetrics.scala. Is there a
way to get them by specifying a DataReadMethod in TaskMetrics.scala?
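One way to surface those task-level metrics, as a minimal sketch, is a SparkListener that sums bytesRead/bytesWritten over completed tasks. Assumptions: this uses the public SparkListener/SparkListenerTaskEnd API, and it accesses inputMetrics/outputMetrics directly as in more recent Spark versions (in Spark 1.x these fields were wrapped in Option). Note these counters cover all filesystem I/O for the job, not S3 specifically; if the job reads and writes only S3 paths, the totals correspond to S3 traffic.

```scala
import java.util.concurrent.atomic.AtomicLong

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Accumulates total bytes read and written across all tasks of the app.
class IoByteListener extends SparkListener {
  val bytesRead = new AtomicLong(0L)
  val bytesWritten = new AtomicLong(0L)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    // taskMetrics can be null if the task failed before reporting metrics
    if (metrics != null) {
      bytesRead.addAndGet(metrics.inputMetrics.bytesRead)
      bytesWritten.addAndGet(metrics.outputMetrics.bytesWritten)
    }
  }
}

// Hypothetical usage (bucket/paths are placeholders):
// val listener = new IoByteListener
// sc.addSparkListener(listener)
// sc.textFile("s3a://my-bucket/input").saveAsTextFile("s3a://my-bucket/output")
// println(s"read=${listener.bytesRead.get} written=${listener.bytesWritten.get}")
```

The same numbers also appear per-stage in the Spark web UI ("Input" / "Output" columns), so the listener is only needed when you want them programmatically.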



