Posted to dev@chukwa.apache.org by "Huang, Jie" <ji...@intel.com> on 2012/07/17 04:16:39 UTC
ShuffleInput is not under the "mapred" context since Hadoop 1.0 metrics2
Hi all,
While upgrading Hadoop to 1.0.x, we found that the shuffleInput metric (for each reduce task) is not under the “mapred” context, per the following implementation in ReduceTask.java. The ShuffleInput metric currently falls under the “default” context, because no context is set explicitly.
Consequently, we cannot get a “Hadoop_mapred_shuffleInput” directory under the repos/*/ folder.
{ // inner class body in ReduceTask.java (Hadoop 1.0.x)
  final MetricsRegistry registry = new MetricsRegistry("shuffleInput");
  final MetricMutableCounterLong inputBytes =
      registry.newCounter("shuffle_input_bytes", "", 0L);
  final MetricMutableCounterInt failedFetches =
      registry.newCounter("shuffle_failed_fetches", "", 0);
  final MetricMutableCounterInt successFetches =
      registry.newCounter("shuffle_success_fetches", "", 0);
  private volatile int threadsBusy = 0;

  @SuppressWarnings("deprecation")
  ShuffleClientInstrumentation(JobConf conf) {
    // Note: no setContext(...) call here, so the registry stays in the
    // "default" context.
    registry.tag("user", "User name", conf.getUser())
        .tag("jobName", "Job name", conf.getJobName())
        .tag("jobId", "Job ID", ReduceTask.this.getJobID().toString())
        .tag("taskId", "Task ID", getTaskID().toString())
        .tag("sessionId", "Session ID", conf.getSessionId());
  }
}
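For context: Chukwa's demux groups metrics records into repos directories by combining the metrics context and the record name, which is why a "default" context produces the wrong directory. A minimal standalone sketch of that naming rule (the Hadoop_&lt;context&gt;_&lt;recordName&gt; pattern is assumed from the behavior described above; this is not Chukwa code):

```java
// Sketch: how the repos directory name is assumed to be derived from a
// metrics record's context and record name.
public class ContextNaming {
    static String reposDir(String context, String recordName) {
        return "Hadoop_" + context + "_" + recordName;
    }

    public static void main(String[] args) {
        // With no explicit context the registry falls back to "default",
        // so the record lands in the wrong directory.
        System.out.println(reposDir("default", "shuffleInput"));
        // With the context set to "mapred", the expected directory appears.
        System.out.println(reposDir("mapred", "shuffleInput"));
    }
}
```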
Thank you && Best Regards,
Grace (Huang Jie)
---------------------------------------------------------------------
SSG PRC Cloud Computing
Intel Asia-Pacific Research & Development Ltd.
No. 880 Zi Xing Road
Shanghai, PRC, 200241
Phone: (86-21) 61166031
RE: ShuffleInput is not under the "mapred" context since Hadoop 1.0 metrics2
Posted by "Huang, Jie" <ji...@intel.com>.
It would be better to make this fix on the Hadoop side, like so:
{ // inner class body in ReduceTask.java (Hadoop 1.0.x)
  final MetricsRegistry registry = new MetricsRegistry("shuffleInput");
  final MetricMutableCounterLong inputBytes =
      registry.newCounter("shuffle_input_bytes", "", 0L);
  final MetricMutableCounterInt failedFetches =
      registry.newCounter("shuffle_failed_fetches", "", 0);
  final MetricMutableCounterInt successFetches =
      registry.newCounter("shuffle_success_fetches", "", 0);
  private volatile int threadsBusy = 0;

  @SuppressWarnings("deprecation")
  ShuffleClientInstrumentation(JobConf conf) {
    registry.setContext("mapred") // <<< fix point
        .tag("user", "User name", conf.getUser())
        .tag("jobName", "Job name", conf.getJobName())
        .tag("jobId", "Job ID", ReduceTask.this.getJobID().toString())
        .tag("taskId", "Task ID", getTaskID().toString())
        .tag("sessionId", "Session ID", conf.getSessionId());
  }
}
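The fix works because the registry's builder methods each return the registry itself, so setContext() can lead the tag(...) chain. A self-contained toy registry illustrating that fluent pattern (MiniRegistry and its methods are hypothetical stand-ins, not Hadoop's MetricsRegistry API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy fluent registry mimicking the chained setContext()/tag() calls above.
public class MiniRegistry {
    // What the real registry effectively falls back to when setContext is
    // never called, per the problem described in this thread.
    private String context = "default";
    private final Map<String, String> tags = new LinkedHashMap<>();

    MiniRegistry setContext(String ctx) {
        this.context = ctx;
        return this; // returning this enables the chain
    }

    MiniRegistry tag(String name, String description, String value) {
        tags.put(name, value);
        return this;
    }

    String context() { return context; }

    public static void main(String[] args) {
        MiniRegistry r = new MiniRegistry();
        r.setContext("mapred")                 // the fix point
         .tag("user", "User name", "grace")
         .tag("jobId", "Job ID", "job_0001");
        System.out.println(r.context());
    }
}
```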
Thank you && Best Regards,
Grace (Huang Jie)
-----Original Message-----
From: Ariel Rabkin [mailto:asrabkin@gmail.com]
Sent: Tuesday, July 17, 2012 10:33 AM
To: chukwa-dev@incubator.apache.org
Subject: Re: ShuffleInput is not under the "mapred" context since Hadoop 1.0 metrics2
Aha. Do you have a proposed fix for the problem?
--Ari
On Mon, Jul 16, 2012 at 10:16 PM, Huang, Jie <ji...@intel.com> wrote:
> Hi all,
>
> Trying to upgrade Hadoop to 1.0.x, we found that the shuffleInput metric (for each reduce task) is not under the “mapred” context, according to the following implementation in ReduceTask.java. Currently, the ShuffleInput metric is under “default” context, since it is not set explicitly.
>
> Consequently, we cannot have “Hadoop_mapred_shuffleInput” directory under repos/*/ folder.
--
Ari Rabkin asrabkin@gmail.com
Princeton Computer Science Department