Posted to user@hbase.apache.org by Shashi Vishwakarma <sh...@gmail.com> on 2015/06/03 08:14:50 UTC

Hbase Bulk load - Map Reduce job failing

Hi

I have a MapReduce job for HBase bulk load. The job converts data into
HFiles and loads them into HBase, but after a certain map % the job fails.
Below is the exception that I am getting.

Error: java.io.FileNotFoundException:
/var/mapr/local/tm4/mapred/nodeManager/spill/job_1433110149357_0005/attempt_1433110149357_0005_m_000000_0/spill83.out.index
    at org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:198)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:800)
    at org.apache.hadoop.io.SecureIOUtils.openFSDataInputStream(SecureIOUtils.java:156)
    at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:74)
    at org.apache.hadoop.mapred.MapRFsOutputBuffer.mergeParts(MapRFsOutputBuffer.java:1382)
    at org.apache.hadoop.mapred.MapRFsOutputBuffer.flush(MapRFsOutputBuffer.java:1627)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:709)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:779)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:345)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1566)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)

The error above reports a FileNotFoundException, but I was able to locate
that particular spill file on disk.

The only thing I have noticed is that the job works fine for a small set of
data, but as the data grows the job starts failing.
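For context, the driver is a fairly standard HBase bulk-load setup. Below is a minimal sketch of that pattern (the class, mapper, table, and path names are placeholders for illustration, not my actual code); it uses the HFileOutputFormat2.configureIncrementalLoad API from hbase-server, and the mapper is assumed to emit (ImmutableBytesWritable, Put) pairs:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class BulkLoadDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "hbase-bulk-load");
        job.setJarByClass(BulkLoadDriver.class);
        // Hypothetical mapper that parses input records into Puts.
        job.setMapperClass(MyHFileMapper.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));   // raw input
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HFile output dir

        TableName tableName = TableName.valueOf("my_table");    // placeholder
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(tableName);
             RegionLocator locator = conn.getRegionLocator(tableName)) {
            // Wires up the reducer, total-order partitioner, and map output
            // value class so the HFiles line up with the table's regions.
            HFileOutputFormat2.configureIncrementalLoad(job, table, locator);
        }
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

After the job finishes, the generated HFiles are moved into the table with the completebulkload tool (LoadIncrementalHFiles). The failure above happens during the map-side spill merge, i.e. before any of the HBase-specific output steps run.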

Let me know if anyone has faced this issue.

Thanks

Shashi

Re: Hbase Bulk load - Map Reduce job failing

Posted by Shashi Vishwakarma <sh...@gmail.com>.
This was a bug in MapR; I got a reply on the MapR forum. If anyone faces a
similar issue, refer to the link below.

http://answers.mapr.com/questions/163440/hbase-bulk-load-map-reduce-job-failing-on-mapr.html


Re: Hbase Bulk load - Map Reduce job failing

Posted by Shashi Vishwakarma <sh...@gmail.com>.
Hi

Yes, I am using MapR FS. I have posted this problem on their forum but
haven't received any reply yet. Is there any other MapR mailing list apart
from the forum?

Here is the link to my post.

http://answers.mapr.com/questions/163440/hbase-bulk-load-map-reduce-job-failing-on-mapr.html

Thanks.


Re: Hbase Bulk load - Map Reduce job failing

Posted by Ted Yu <yu...@gmail.com>.
Looks like you're using MapR FS.

Have you considered posting this question on their mailing list?

Cheers
