Posted to common-dev@hadoop.apache.org by "Julien Vaudour (JIRA)" <ji...@apache.org> on 2017/03/23 12:56:41 UTC
[jira] [Created] (HADOOP-14219) RumenToSLS: parsing problem with crashed attempts
Julien Vaudour created HADOOP-14219:
---------------------------------------
Summary: RumenToSLS: parsing problem with crashed attempts
Key: HADOOP-14219
URL: https://issues.apache.org/jira/browse/HADOOP-14219
Project: Hadoop Common
Issue Type: Bug
Components: tools
Affects Versions: 2.6.0
Reporter: Julien Vaudour
Priority: Minor
In the case of crashed task attempts, Rumen logs may contain task attempts with a null hostName and a finishTime of -1.
For example:
{code}
{
  "resourceUsageMetrics": {
    "heapUsage": 0,
    "physicalMemoryUsage": 0,
    "virtualMemoryUsage": 0,
    "cumulativeCpuUsage": 0
  },
  "vmemKbytes": [],
  "physMemKbytes": [],
  "cpuUsages": [],
  "clockSplits": [],
  "location": null,
  "sortFinished": -1,
  "shuffleFinished": -1,
  "spilledRecords": -1,
  "reduceOutputRecords": -1,
  "reduceShuffleBytes": -1,
  "fileBytesRead": -1,
  "hdfsBytesWritten": -1,
  "hdfsBytesRead": -1,
  "hostName": null,
  "finishTime": -1,
  "startTime": 1489619193378,
  "result": null,
  "attemptID": "attempt_1488896259152_410442_r_000015_1",
  "fileBytesWritten": -1,
  "mapInputRecords": -1,
  "mapInputBytes": -1,
  "mapOutputBytes": -1,
  "mapOutputRecords": -1,
  "combineInputRecords": -1,
  "reduceInputGroups": -1,
  "reduceInputRecords": -1
}
{code}
The Jackson parser will automatically deserialize -1 as a java.lang.Integer. However, RumenToSLSConverter assumes that Jackson has deserialized all timestamps as instances of java.lang.Long, resulting in a ClassCastException.
RumenToSLSConverter also assumes that hostName is not null, so we can also get a NullPointerException.
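A minimal sketch of the failure mode, using plain JDK collections to mimic what Jackson produces (class and variable names here are hypothetical, not from RumenToSLSConverter itself): Jackson boxes a JSON number into the narrowest Java type that fits, so -1 becomes an Integer while a real epoch-millis timestamp becomes a Long, and a blind cast to Long then fails:

```java
import java.util.HashMap;
import java.util.Map;

public class TimestampCastDemo {
    public static void main(String[] args) {
        // Mimic Jackson's output for the JSON above: -1 fits in an int,
        // so it is boxed as Integer; 1489619193378 overflows int and
        // is boxed as Long.
        Map<String, Object> attempt = new HashMap<>();
        attempt.put("finishTime", -1);            // Integer
        attempt.put("startTime", 1489619193378L); // Long

        Object finish = attempt.get("finishTime");
        System.out.println(finish instanceof Integer); // prints "true"

        // The unchecked cast the converter effectively performs:
        try {
            long t = (Long) finish; // throws ClassCastException
            System.out.println(t);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException as expected");
        }

        // A defensive alternative: go through Number, which covers
        // both Integer and Long values.
        long safe = ((Number) finish).longValue();
        System.out.println(safe); // prints "-1"
    }
}
```

Casting through Number instead of Long (and null-checking hostName before use) would make the converter tolerant of these crashed-attempt records.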
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)