Posted to user@hbase.apache.org by Dhaval Makawana <dh...@gmail.com> on 2011/09/04 07:41:35 UTC
Re: Hbase bulk insert.
Hi Sriram,
You can use the bulk load utility to insert massive amounts of data into HBase:
http://archive.cloudera.com/cdh/3/hbase/bulk-loads.html
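For reference, the bulk-load flow from that page typically runs in two steps: a MapReduce job writes HFiles directly, then those files are handed to the live table. A sketch of the commands (the table name `mytable`, column mapping, and paths here are illustrative, not from the thread):

```shell
# Step 1: parse TSV input and write HFiles instead of doing live puts.
hadoop jar /usr/lib/hbase/hbase.jar importtsv \
  -Dimporttsv.columns=HBASE_ROW_KEY,f1:c1 \
  -Dimporttsv.bulk.output=/tmp/hfiles \
  mytable /user/me/input.tsv

# Step 2: move the generated HFiles into the running table's regions.
hadoop jar /usr/lib/hbase/hbase.jar completebulkload /tmp/hfiles mytable
```

Because step 1 never talks to the region servers, it avoids the write-path load (and client thread usage) of doing millions of individual puts.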
There is an alternative I use in an application where I had to insert a large
amount of data from a MapReduce job: the async HBase client
(https://github.com/stumbleupon/asynchbase) instead of HTable (which I suppose
TableMapReduceUtil uses internally). I found the async client to be quite a
bit faster.
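A minimal sketch of what using asynchbase looks like, assuming a reachable ZooKeeper quorum and an existing table (the host, table, family, and qualifier names below are illustrative):

```java
import org.hbase.async.HBaseClient;
import org.hbase.async.PutRequest;

public class AsyncInsertSketch {
  public static void main(String[] args) throws Exception {
    // One client per application; it multiplexes all RPCs internally.
    final HBaseClient client = new HBaseClient("zk-host:2181");
    byte[] table = "mytable".getBytes();
    byte[] family = "f1".getBytes();
    byte[] qualifier = "q".getBytes();
    for (int i = 0; i < 1000; i++) {
      byte[] key = ("row-" + i).getBytes();
      byte[] value = ("val-" + i).getBytes();
      // put() returns immediately with a Deferred; the client batches
      // edits and sends them asynchronously in the background.
      client.put(new PutRequest(table, key, family, qualifier, value));
    }
    // Flush any buffered edits and release resources before exiting.
    client.shutdown().join();
  }
}
```

The speedup over HTable comes from not blocking the caller on each RPC: puts are buffered and pipelined, so one client thread can keep many region servers busy.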
Regards,
Dhaval
On Wed, Aug 31, 2011 at 9:44 PM, Stack <st...@duboce.net> wrote:
> On Wed, Aug 31, 2011 at 1:06 AM, sriram <rs...@gmail.com> wrote:
> > Error: unable to create new native thread
> > Only 8,200 values were inserted; the remaining lakhs of rows were not
> > inserted and the job failed. Any ideas or solutions?
> >
>
> You are getting an OutOfMemoryError? Is it coming from MapReduce or is
> it from HBase? Can you give your processes more memory? What are
> you trying to insert? Is it massive?
>
> St.Ack
>
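Worth noting about the error quoted above: "unable to create new native thread" is an OutOfMemoryError thrown when the JVM cannot get another OS thread (per-process thread limit, or native/stack memory exhausted), not necessarily when the heap is full. Besides raising the OS limit (`ulimit -u`) or shrinking thread stacks (`-Xss`), the usual fix is to bound concurrency instead of spawning one thread per insert. A generic, self-contained sketch (not tied to any HBase API; the pool size and task count are arbitrary):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class BoundedInserts {
  // Runs n dummy "insert" tasks on a small fixed pool and returns
  // how many completed. Only 8 native threads are ever created,
  // no matter how large n gets.
  static int runInserts(int n) throws InterruptedException {
    ExecutorService pool = Executors.newFixedThreadPool(8);
    AtomicInteger done = new AtomicInteger();
    for (int i = 0; i < n; i++) {
      pool.submit(done::incrementAndGet);  // stand-in for one HBase put
    }
    pool.shutdown();
    pool.awaitTermination(1, TimeUnit.MINUTES);
    return done.get();
  }

  public static void main(String[] args) throws Exception {
    System.out.println(runInserts(100_000));  // prints 100000
  }
}
```

With a thread-per-insert design, the same 100,000 tasks would try to create 100,000 native threads and die with exactly the error reported in this thread.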