Posted to common-user@hadoop.apache.org by aurelien violette <au...@webgroup-limited.com> on 2016/04/12 19:10:14 UTC

Re: Issue managing SequenceFiles - Corrupted files?

Hi all,

I've been struggling with this for a while. I'm pretty sure there is
something I'm missing to make this work correctly.

My flow is the following:
1 - I use an MR job to dump an Elasticsearch index to HDFS as a
SequenceFile. The SequenceFile is <Text, MapWritable> (a sketch of the job
setup is below, after the stack trace).
2 - I use another job to process the data later. Pretty much any job on any
dump throws this exception:

Error: java.io.EOFException
        at java.io.DataInputStream.readFully(DataInputStream.java:197)
        at org.apache.hadoop.io.Text.readWithKnownLength(Text.java:319)
        at org.apache.hadoop.io.Text.readFields(Text.java:291)
        at org.apache.hadoop.io.ArrayWritable.readFields(ArrayWritable.java:96)
        at org.elasticsearch.hadoop.mr.WritableArrayWritable.readFields(WritableArrayWritable.java:52)
        at org.apache.hadoop.io.MapWritable.readFields(MapWritable.java:188)
        at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
        at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
        at org.apache.hadoop.io.SequenceFile$Reader.deserializeValue(SequenceFile.java:2247)
        at org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2220)
        at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:78)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
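
For reference, the dump job from step 1 is set up roughly like the sketch
below. It assumes es-hadoop's EsInputFormat hands back <Text, MapWritable>
pairs and keeps the stock identity mapper; the node address, index name,
output path and class name are placeholders, not my real values.

// Minimal sketch of the dump job (step 1). Assumption: EsInputFormat yields
// <Text, MapWritable> records, which the identity mapper copies straight to
// the SequenceFile output.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.MapWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;
import org.elasticsearch.hadoop.mr.EsInputFormat;

public class DumpIndex {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set("es.nodes", "es-host:9200");         // placeholder
    conf.set("es.resource", "my-index/my-type");  // placeholder

    Job job = Job.getInstance(conf, "es-dump");
    job.setJarByClass(DumpIndex.class);
    job.setInputFormatClass(EsInputFormat.class);
    job.setNumReduceTasks(0);                     // map-only: identity mapper copies records out
    job.setOutputFormatClass(SequenceFileOutputFormat.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(MapWritable.class);
    SequenceFileOutputFormat.setOutputPath(job, new Path("/dumps/my-index")); // placeholder

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}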

I can't believe that my disks are corrupted, so my guess is that I have an
issue either writing the files or reading them.
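
One check I can think of is to read a part file of the dump back directly
with SequenceFile.Reader, outside of MapReduce, to see whether the
EOFException comes from the file itself or from the second job's setup. A
minimal sketch (the part file path and the CheckDump class name are
placeholders):

// Standalone sanity check: iterate over every record of one part file.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.util.ReflectionUtils;

public class CheckDump {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Path part = new Path(args.length > 0 ? args[0] : "/dumps/my-index/part-m-00000");

    long count = 0;
    try (SequenceFile.Reader reader =
             new SequenceFile.Reader(conf, SequenceFile.Reader.file(part))) {
      // Instantiate key/value from the classes recorded in the file header,
      // so a mismatch between declared and actual classes shows up immediately.
      Writable key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
      Writable value = (Writable) ReflectionUtils.newInstance(reader.getValueClass(), conf);
      while (reader.next(key, value)) {   // an EOFException here points at the dump itself
        count++;
      }
    }
    System.out.println("Read " + count + " records without error");
  }
}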

Any other ideas on how to investigate this? I'm using Hadoop 2.7.2.

Thank you

-- 
BR,
Aurelien Violette