Posted to user@hive.apache.org by Techy Teck <co...@gmail.com> on 2012/08/06 23:53:57 UTC
Caused by: java.io.EOFException
I am running a simple query on our Hive table and I am getting an exception:
select count(*) from table1 where dt='20120731';
java.io.IOException: IO error in map input file hdfs://ares-nn/apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:220)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:197)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:403)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:337)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:242)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
    at org.apache.hadoop.mapred.Child.main(Child.java:236)
*Caused by: java.io.EOFException*
    at java.io.DataInputStream.readFully(DataInputStream.java:180)
    at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
    at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
    at org.apache.hadoop.io.SequenceFile$Reader.readBuffer(SequenceFile.java:1646)
    at org.apache.hadoop.io.SequenceFile$Reader.seekToCurrentValue(SequenceFile.java:1712)
    at org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:1787)
    at org.apache.hadoop.mapred.SequenceFileRecordReader.getCurrentValue(SequenceFileRecordReader.java:103)
    at org.apache.hadoop.mapred.SequenceFileRecordReader.next(SequenceFileRecordReader.java:78)
    at org.apache.hadoop.hive.ql.io.HiveRecordReader.next(HiveRecordReader.java:67)
    at org.apache.hadoop.hive.ql.io.HiveRecordReader.next(HiveRecordReader.java:33)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:215)
    ... 9 more
Can anyone tell me what *Caused by: java.io.EOFException* means here? When I run the same query for a different date (dt), it works fine.
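For reference, the innermost frame of the "Caused by" trace is java.io.DataInputStream.readFully, which throws EOFException when the stream ends before the requested number of bytes has been read — in other words, the file is shorter than its own record headers claim, which usually points to truncation or corruption. A minimal sketch of that failure mode, using a hypothetical read_fully helper in Python that mirrors the Java semantics (not Hadoop code):

```python
import io

def read_fully(stream, n):
    """Mirror java.io.DataInputStream.readFully: return exactly n bytes,
    or raise EOFError if the stream ends first."""
    buf = b""
    while len(buf) < n:
        chunk = stream.read(n - len(buf))
        if not chunk:  # stream exhausted before the record was complete
            raise EOFError("got %d of %d expected bytes" % (len(buf), n))
        buf += chunk
    return buf

# A record whose header promised 16 bytes of value data, but the file
# was truncated and only 10 bytes survive -> EOF mid-record.
try:
    read_fully(io.BytesIO(b"0123456789"), 16)
except EOFError as e:
    print("EOF:", e)  # → EOF: got 10 of 16 expected bytes
```

This would explain why only one partition fails: the reader walks that one file, hits the truncated record mid-stream, and readFully cannot complete.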
Re: Caused by: java.io.EOFException
Posted by Techy Teck <co...@gmail.com>.
Yes, I created that file manually. But the other files are fine; only that particular file has the problem.
Is there any way I can fix that file?
Re: Caused by: java.io.EOFException
Posted by shashwat shriparv <dw...@gmail.com>.
There is some extra information in that file which the reader does not understand. Did you build that file manually?
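For what it's worth, the stack trace goes through SequenceFile$Reader, so the table data is being read as Hadoop SequenceFiles, and a file assembled by hand can be missing the format's own header. One quick sanity check is to look at the first four bytes of the file: a SequenceFile starts with the 3-byte magic 'SEQ' plus a version byte. A small sketch (looks_like_sequencefile is a hypothetical helper for illustration, not a Hadoop API):

```python
def looks_like_sequencefile(first_bytes):
    """A Hadoop SequenceFile begins with the 3-byte magic 'SEQ'
    followed by a one-byte format version; a hand-built file
    typically fails this check."""
    return len(first_bytes) >= 4 and first_bytes[:3] == b"SEQ"

# A real SequenceFile header starts like b'SEQ\x06...'; arbitrary data does not.
print(looks_like_sequencefile(b"SEQ\x06org.apache.hadoop.io..."))  # → True
print(looks_like_sequencefile(b"some,csv,data\n"))                 # → False
```

In practice you would run this over the first few bytes fetched from HDFS (e.g. via `hadoop fs -cat ... | head -c 4`) to see whether the suspect file even has a valid header.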
--
∞
Shashwat Shriparv
Re: Caused by: java.io.EOFException
Posted by Techy Teck <co...@gmail.com>.
Yup, that makes sense. But when I open that file using:

hadoop fs -text
/apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2

I can see the file contents. So what's wrong with that file? And is there any way I can fix the error in that file using some script?
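There is no generic in-place fix, but since `hadoop fs -text` can still dump the records before the corruption point, one common salvage approach is to copy good records into a new file and stop at the first truncated one. A minimal sketch of that idea in Python over a toy length-prefixed record format — for a real SequenceFile you would do the analogous loop with Hadoop's SequenceFile.Reader and SequenceFile.Writer, stopping at the EOFException:

```python
import io
import struct

def salvage_records(stream):
    """Collect length-prefixed records until the first truncated one;
    everything before the corruption point is recoverable."""
    good = []
    while True:
        header = stream.read(4)
        if len(header) < 4:
            break  # clean end of file (or a truncated length header)
        (length,) = struct.unpack(">I", header)
        payload = stream.read(length)
        if len(payload) < length:
            break  # truncated record: stop here, keep what we have
        good.append(payload)
    return good

# Two complete records followed by a third whose header promises 10
# bytes but whose payload was cut short.
data = (struct.pack(">I", 3) + b"foo"
        + struct.pack(">I", 3) + b"bar"
        + struct.pack(">I", 10) + b"trunc")
print(salvage_records(io.BytesIO(data)))  # → [b'foo', b'bar']
```

The trade-off is that any records at or after the corruption point are lost; after rewriting the salvaged records, you would replace the bad part file in the partition directory with the new one.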
Re: Caused by: java.io.EOFException
Posted by Bejoy KS <be...@yahoo.com>.
It could be that the file corresponding to the partition dt='20120731' got corrupted.
The file pointed to in the error logs should be the culprit:
hdfs://ares-nn/apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2
Regards
Bejoy KS
Sent from handheld, please excuse typos.