Posted to mapreduce-user@hadoop.apache.org by Amit Mittal <am...@gmail.com> on 2014/12/15 17:30:11 UTC

Exception in DFSOutputStream.checkClosed with 39 mapper tasks

Hi All,

Here is an issue (an exception in
DFSOutputStream.checkClosed(DFSOutputStream.java:1317)) we are seeing when
running a MapReduce program with 39 input files. Can you please take a
quick look and advise? The cluster has 22 data nodes plus dedicated name
nodes and a job tracker, running Hadoop 2.2 with the new MR API.

I have a map-only program that reads all the files from a folder in
HDFS, processes them, and writes its output to HDFS using MultipleOutputs.
So far I had executed the program with 1-2 input files and it worked.
However, when I tried it with 39 files, the same program started giving me
the following exceptions after processing around 10-12 files.
Could the problem be caused by the call to out.close() in the cleanup()
method? That should not impact other mappers. I have seen the following
JIRA, but it seems to involve only a single mapper, which is not my case.

https://issues.apache.org/jira/browse/HDFS-5335
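For reference, here is a minimal sketch of the kind of map-only job I am
describing (class and output names here are illustrative, not my actual
program): MultipleOutputs is created once in setup() and closed once in
cleanup(), so each task attempt owns its own streams.

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

public class MultiOutputMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    private MultipleOutputs<NullWritable, Text> out;

    @Override
    protected void setup(Context context) {
        // One MultipleOutputs instance per task attempt.
        out = new MultipleOutputs<NullWritable, Text>(context);
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Route each processed record to a named output path
        // ("processed/part" is a placeholder base path).
        out.write(NullWritable.get(), value, "processed/part");
    }

    @Override
    protected void cleanup(Context context)
            throws IOException, InterruptedException {
        // Closes only this task's own output streams; my understanding is
        // that it should not touch streams of other mapper tasks.
        out.close();
    }
}
```

My assumption is that close() here only flushes and closes the streams
opened by this task attempt, which is why the checkClosed exception across
tasks surprises me.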

Can you please help?

Thanks
Amit