Posted to common-user@hadoop.apache.org by waqas latif <wa...@gmail.com> on 2012/05/25 11:42:55 UTC

EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Hi Experts,

I am fairly new to Hadoop MapReduce, and I was trying to run the matrix
multiplication example presented by Mr. Norstadt at the following link:
http://www.norstad.org/matrix-multiply/index.html. I can run it
successfully with Hadoop 0.20.2, but when I run it with Hadoop 1.0.3 I
get the following error. Is this a problem with my Hadoop configuration,
or a compatibility problem in the code, which the author wrote against
Hadoop 0.20? Please also advise how I can fix the error in either case.
Here is the error I am getting:

Exception in thread "main" java.io.EOFException
        at java.io.DataInputStream.readFully(DataInputStream.java:180)
        at java.io.DataInputStream.readFully(DataInputStream.java:152)
        at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1486)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1475)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1470)
        at TestMatrixMultiply.fillMatrix(TestMatrixMultiply.java:60)
        at TestMatrixMultiply.readMatrix(TestMatrixMultiply.java:87)
        at TestMatrixMultiply.checkAnswer(TestMatrixMultiply.java:112)
        at TestMatrixMultiply.runOneTest(TestMatrixMultiply.java:150)
        at TestMatrixMultiply.testRandom(TestMatrixMultiply.java:278)
        at TestMatrixMultiply.main(TestMatrixMultiply.java:308)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

Thanks in advance

Regards,
waqas

Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by Harsh J <ha...@cloudera.com>.
Waqas,

Can you ensure this file isn't empty (0 in size)?
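For instance, a minimal local-filesystem sketch of that check (on HDFS the equivalent would be `hadoop fs -ls <path>` or `FileSystem.getFileStatus(path).getLen()`; the class name here is illustrative, not from the thread):

```java
import java.io.File;
import java.io.IOException;

public class EmptyCheck {
    public static void main(String[] args) throws IOException {
        // Simulate the suspect file: createTempFile yields a zero-byte
        // file, just like the _SUCCESS marker a job writes into its
        // output directory.
        File f = File.createTempFile("matrix", ".seq");
        // A SequenceFile starts with a "SEQ" magic header; a zero-byte
        // file has none, so SequenceFile.Reader.init() hits end of
        // stream immediately and throws EOFException.
        if (f.length() == 0) {
            System.out.println("empty file - skip it");
        } else {
            System.out.println("file has data");
        }
        f.delete();
    }
}
```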




-- 
Harsh J

Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by waqas latif <wa...@gmail.com>.
Thanks Harsh. I got it running.


Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by Harsh J <ha...@cloudera.com>.
When your code does a listStatus, you can pass a PathFilter object
along that can do this filtering for you. See
http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/fs/FileSystem.html#listStatus(org.apache.hadoop.fs.Path,%20org.apache.hadoop.fs.PathFilter)
for the API javadocs on that.
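A minimal sketch of that approach. The filtering predicate itself is plain Java (the class and method names are mine, not from the thread); the commented lines show how it would plug into the Hadoop 1.x FileSystem.listStatus(Path, PathFilter) overload linked above:

```java
import java.util.ArrayList;
import java.util.List;

public class ListingFilter {
    // Hadoop's own "hidden output" convention: names starting with
    // '_' (e.g. _SUCCESS, _logs) or '.' are not data files.
    public static boolean isDataFile(String name) {
        return !name.startsWith("_") && !name.startsWith(".");
    }

    // With the Hadoop API this becomes a PathFilter passed to listStatus,
    // so SequenceFile.Reader is never opened on the zero-byte _SUCCESS file:
    //
    //   FileStatus[] parts = fs.listStatus(outputDir, new PathFilter() {
    //       public boolean accept(Path p) {
    //           return isDataFile(p.getName());
    //       }
    //   });

    public static void main(String[] args) {
        // A typical job output directory listing.
        String[] listing = {"_SUCCESS", "_logs", "part-00000", "part-00001"};
        List<String> data = new ArrayList<String>();
        for (String name : listing) {
            if (isDataFile(name)) {
                data.add(name);
            }
        }
        System.out.println(data); // only the part-* files remain
    }
}
```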




-- 
Harsh J

Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by waqas latif <wa...@gmail.com>.
I found the problem, but I am unable to solve it. I need to apply a filter
for the _SUCCESS file when using the FileSystem.listStatus method. Can
someone please guide me on how to filter out _SUCCESS files? Thanks


Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by waqas latif <wa...@gmail.com>.
So my question is: do Hadoop 0.20 and 1.0.3 differ in their support for
writing or reading SequenceFiles? The same code works fine with Hadoop
0.20, but the problem occurs when I run it under Hadoop 1.0.3.
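An aside on the likely difference (an editorial note, not from the thread): the SequenceFile format itself did not change, but newer MapReduce releases write a zero-byte _SUCCESS marker into each job output directory, and code that opens every listed file as a SequenceFile then trips over it. Besides filtering the listing, the marker can be disabled in the job configuration; the property name below is the one used in Hadoop 1.x, so verify it against your release:

```xml
<!-- In mapred-site.xml or the job configuration: suppress the
     zero-byte _SUCCESS marker file in job output directories. -->
<property>
  <name>mapreduce.fileoutputcommitter.marksuccessfuljobs</name>
  <value>false</value>
</property>
```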


Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by waqas latif <wa...@gmail.com>.
But the thing is, it works with Hadoop 0.20, even with 100x100 (and even
bigger) matrices. When it comes to Hadoop 1.0.3, there is a problem even
with a 3x3 matrix.


Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by Prashant Kommireddi <pr...@gmail.com>.
I have seen this issue with large file writes using the SequenceFile
writer. I have not found the same issue when testing with fairly small
files (< 1 GB).


Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by Kasi Subrahmanyam <ka...@gmail.com>.
Hi,
If you are using a custom Writable object to pass data from the mapper to
the reducer, make sure that readFields() and write() handle the same
fields in the same order. It is possible that you wrote data to a file
using a custom Writable but later modified that Writable (for example, by
adding a new attribute) which the old data doesn't have.

It is just a possibility, but please check it once.
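A minimal sketch of that invariant (the class name MatrixEntry is illustrative, not from Norstad's code; in real Hadoop code the class would implement org.apache.hadoop.io.Writable, but the JDK's own DataOutput/DataInput are enough to show the round trip):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

public class MatrixEntry {
    int row;
    int col;
    double value;

    // write() and readFields() must serialize exactly the same fields
    // in exactly the same order. A mismatch (e.g. a field added to
    // write() but not to readFields(), or old files read with a newer
    // class) leaves the reader mid-record and surfaces later as an
    // EOFException.
    public void write(DataOutput out) throws IOException {
        out.writeInt(row);
        out.writeInt(col);
        out.writeDouble(value);
    }

    public void readFields(DataInput in) throws IOException {
        row = in.readInt();
        col = in.readInt();
        value = in.readDouble();
    }

    public static void main(String[] args) throws IOException {
        MatrixEntry a = new MatrixEntry();
        a.row = 2; a.col = 3; a.value = 1.5;

        // Round-trip through a byte buffer, as Hadoop does between
        // mapper and reducer.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        a.write(new DataOutputStream(buf));
        MatrixEntry b = new MatrixEntry();
        b.readFields(new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray())));

        System.out.println(b.row + "," + b.col + "," + b.value);
    }
}
```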


Re: EOFException at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)......

Posted by Marcos Ortiz <ml...@uci.cu>.
Regards, waqas. I think you would have to ask the MapR experts about
that. The same code that you wrote for 0.20.2 should work in 1.0.3 too.

Can you post the complete log for this?
Best wishes

-- 
Marcos Luis Ortíz Valmaseda
  Data Engineer & Sr. System Administrator at UCI
  http://marcosluis2186.posterous.com
  http://www.linkedin.com/in/marcosluis2186
  Twitter: @marcosluis2186

