Posted to common-issues@hadoop.apache.org by "minchengbo (Jira)" <ji...@apache.org> on 2020/10/28 05:54:00 UTC
[jira] [Updated] (HADOOP-17333) MetricsRecordFiltered error
[ https://issues.apache.org/jira/browse/HADOOP-17333?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
minchengbo updated HADOOP-17333:
--------------------------------
Description:
A sink exception is thrown when datanode.sink.ganglia.metric.filter.exclude=metricssystem is set in hadoop-metrics2.properties:
java.lang.ClassCastException: org.apache.hadoop.metrics2.impl.MetricsRecordFiltered$1 cannot be cast to java.util.Collection
at org.apache.hadoop.metrics2.sink.ganglia.GangliaSink30.putMetrics(GangliaSink30.java:165)
at org.apache.hadoop.metrics2.impl.MetricsSinkAdapter.consume(MetricsSinkAdapter.java:184)
at org.apache.hadoop.metrics2.impl.MetricsSinkAdapter.consume(MetricsSinkAdapter.java:43)
at org.apache.hadoop.metrics2.impl.SinkQueue.consumeAll(SinkQueue.java:87)
at org.apache.hadoop.metrics2.impl.MetricsSinkAdapter.publishMetricsFromQueue(MetricsSinkAdapter.java:135)
at org.apache.hadoop.metrics2.impl.MetricsSinkAdapter$1.run(MetricsSinkAdapter.java:89)
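For reference, the triggering configuration looks roughly like the excerpt below. Only the metric.filter.exclude line is taken from this report; the sink class and filter class lines are assumed to be the usual Ganglia sink setup.

```
# hadoop-metrics2.properties (excerpt, sink/filter class lines assumed)
datanode.sink.ganglia.class=org.apache.hadoop.metrics2.sink.ganglia.GangliaSink30
datanode.sink.ganglia.metric.filter.class=org.apache.hadoop.metrics2.filter.RegexFilter
datanode.sink.ganglia.metric.filter.exclude=metricssystem
```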
//////////////////////////////////////////////////
The following test case reproduces the exception:
import java.util.LinkedList;
import java.util.List;
import org.apache.commons.configuration2.SubsetConfiguration;
import org.apache.commons.configuration2.plist.PropertyListConfiguration;
import org.apache.hadoop.metrics2.*;
import org.apache.hadoop.metrics2.filter.RegexFilter;
import org.apache.hadoop.metrics2.impl.*;
import org.apache.hadoop.metrics2.sink.ganglia.*;

public static void main(String[] args) {
  List<AbstractMetric> metrics = new LinkedList<>();
  MetricsInfo info = MsInfo.ProcessName;
  long timestamp = System.currentTimeMillis();
  List<MetricsTag> tags = new LinkedList<>();
  MetricsRecordImpl record = new MetricsRecordImpl(info, timestamp, tags, metrics);
  MetricsFilter filter = new RegexFilter();
  MetricsRecordFiltered filteredRecord = new MetricsRecordFiltered(record, filter);

  SubsetConfiguration conf = new SubsetConfiguration(new PropertyListConfiguration(), "test");
  conf.addProperty(AbstractGangliaSink.SUPPORT_SPARSE_METRICS_PROPERTY, true);

  GangliaSink30 gangliaSink = new GangliaSink30();
  gangliaSink.init(conf);
  gangliaSink.putMetrics(filteredRecord);  // throws ClassCastException here
}
///////////////////////////////////////////////////////////////
The root cause:
MetricsRecordFiltered.metrics() returns a plain Iterable, not a Collection (MetricsRecordFiltered.java):
@Override public Iterable<AbstractMetric> metrics() {
  return new Iterable<AbstractMetric>() {
    final Iterator<AbstractMetric> it = delegate.metrics().iterator();
    @Override public Iterator<AbstractMetric> iterator() {
      return new AbstractIterator<AbstractMetric>() {
        @Override public AbstractMetric computeNext() {
          while (it.hasNext()) {
            AbstractMetric next = it.next();
            if (filter.accepts(next.name())) {
              return next;
            }
          }
          return (AbstractMetric) endOfData();
        }
      };
    }
  };
}
but GangliaSink30.java casts that Iterable to a Collection at line 164, which fails at runtime:

Collection<AbstractMetric> metrics = (Collection<AbstractMetric>) record.metrics();
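One possible fix on the sink side, sketched here without Hadoop dependencies, is to materialize the Iterable into a Collection instead of performing the unchecked cast. The class and method names below (IterableToCollection, toCollection) are illustrative, not from the Hadoop codebase, and this is not necessarily the fix the project adopted.

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

public class IterableToCollection {

    /**
     * Convert an Iterable to a Collection safely: reuse the instance when it
     * already is a Collection, otherwise copy the elements into a list.
     */
    @SuppressWarnings("unchecked")
    static <T> Collection<T> toCollection(Iterable<T> iterable) {
        if (iterable instanceof Collection) {
            return (Collection<T>) iterable;  // fast path, no copy needed
        }
        List<T> copy = new ArrayList<>();
        for (T item : iterable) {
            copy.add(item);
        }
        return copy;
    }

    public static void main(String[] args) {
        // An Iterable that is NOT a Collection, analogous to the anonymous
        // Iterable returned by MetricsRecordFiltered.metrics()
        Iterable<String> filtered = () -> List.of("a", "b", "c").iterator();

        Collection<String> metrics = toCollection(filtered);
        System.out.println(metrics.size()); // prints 3
    }
}
```

With this helper, the cast in GangliaSink30.putMetrics could be replaced by a toCollection(record.metrics()) call, which works for both MetricsRecordImpl (whose metrics() already returns a Collection) and MetricsRecordFiltered.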
> MetricsRecordFiltered error
> ---------------------------
>
> Key: HADOOP-17333
> URL: https://issues.apache.org/jira/browse/HADOOP-17333
> Project: Hadoop Common
> Issue Type: Bug
> Components: common
> Affects Versions: 3.2.1
> Environment:
> Reporter: minchengbo
> Priority: Minor
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org