Posted to common-dev@hadoop.apache.org by arindam choudhury <ar...@gmail.com> on 2012/01/24 16:27:17 UTC
getting hadoop job resource usage statistics
Hi,
How can I get CPU, memory, network and disk usage statistics of a Hadoop job?
Thanks,
Arindam
Re: getting hadoop job resource usage statistics
Posted by Jabir Ahmed <ja...@gmail.com>.
http://$jobtrackerhost:50030/metrics
http://$namenode:50070/metrics
should give you enough metrics. You could also use Ganglia to collect them.
jabir
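[For reference, the daemon metrics pages above can also be fetched programmatically. A minimal sketch follows; the host name is a placeholder for your cluster, and 50030/50070 are the Hadoop 1.x default web ports from the URLs above.]

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class FetchMetrics {
    // Build the metrics URL; 50030 (JobTracker) and 50070 (NameNode) are the
    // 1.x default web ports mentioned above -- adjust for your cluster.
    public static String metricsUrl(String host, int port) {
        return "http://" + host + ":" + port + "/metrics";
    }

    // Dump the plain-text metrics page to stdout.
    public static void dump(String url) throws Exception {
        BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(url).openStream(), "UTF-8"));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        } finally {
            in.close();
        }
    }

    public static void main(String[] args) throws Exception {
        // Pass your JobTracker host as an argument, e.g. "jt.example.com".
        String host = args.length > 0 ? args[0] : "localhost";
        dump(metricsUrl(host, 50030));
    }
}
```

[Many Hadoop versions also let you append a query parameter to the metrics servlet to change the output format; check the MetricsServlet documentation for the version you run.]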
On Tue, Jan 24, 2012 at 10:17 PM, Arun C Murthy <ac...@hortonworks.com> wrote:
> You can currently get CPU & memory stats for each task and aggregated
> stats per job via MapReduce Counters.
>
> Arun
>
> On Jan 24, 2012, at 7:27 AM, arindam choudhury wrote:
>
> > Hi,
> >
> > How can I get CPU, memory, network and disk usage statistics of a hadoop
> > job?
> >
> > Thanks,
> > Arindam
>
>
--
"The best way to make your dreams come true is to wake up." - Paul Valéry
--
Phone: +91 99162 93063
IM/e-mail: jabirahmed@yahoo.com , jabirahmed@gmail.com
Re: getting hadoop job resource usage statistics
Posted by Arun C Murthy <ac...@hortonworks.com>.
You can currently get CPU & memory stats for each task and aggregated stats per job via MapReduce Counters.
Arun
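[The counters Arun mentions include per-task values such as CPU_MILLISECONDS and PHYSICAL_MEMORY_BYTES (names as in the Hadoop 1.x Task.Counter enum). The sketch below only illustrates the per-job aggregation idea; the task attempt IDs and values are invented, and on a real cluster you would read them from the job's Counters object or via `hadoop job -counter <job-id> <group> <counter>`.]

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JobCounterRollup {
    // Sum a per-task counter (e.g. CPU_MILLISECONDS) into a per-job total.
    public static long aggregate(Map<String, Long> perTaskValues) {
        long total = 0;
        for (long v : perTaskValues.values()) {
            total += v;
        }
        return total;
    }

    public static void main(String[] args) {
        // Invented task attempt IDs and CPU times, for illustration only.
        Map<String, Long> cpuMillisByTask = new LinkedHashMap<String, Long>();
        cpuMillisByTask.put("attempt_201201240001_0001_m_000000_0", 4200L);
        cpuMillisByTask.put("attempt_201201240001_0001_m_000001_0", 3900L);
        cpuMillisByTask.put("attempt_201201240001_0001_r_000000_0", 6100L);
        System.out.println("CPU_MILLISECONDS (job total) = "
                + aggregate(cpuMillisByTask));
    }
}
```

[Note that network and disk I/O are not covered by these counters, which is why a cluster-level collector such as Ganglia was suggested elsewhere in this thread.]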
On Jan 24, 2012, at 7:27 AM, arindam choudhury wrote:
> Hi,
>
> How can I get CPU, memory, network and disk usage statistics of a hadoop
> job?
>
> Thanks,
> Arindam