Posted to user@hbase.apache.org by Prakhar Srivastava <pr...@gmail.com> on 2014/09/09 14:55:11 UTC

Updating an HBase KeyValue using bulk upload

Hi,

I have a MapReduce job which creates a StoreFile that I can load into
HBase using LoadIncrementalHFiles. I am also using the timestamp component
of the KeyValue in my mapper to maintain versions in a custom manner. But
when I try to overwrite the same version using bulk import, it does not
work: when I perform a Get, it still returns the old version.
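
For reference, here is a minimal sketch of the kind of mapper I mean, with a
timestamp I control on each KeyValue (the class name, column family and input
layout below are illustrative placeholders, not my actual job):

import java.io.IOException;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Emits one KeyValue per input line, using a custom timestamp as the version.
public class TimestampedKVMapper
    extends Mapper<LongWritable, Text, ImmutableBytesWritable, KeyValue> {

  private static final byte[] CF = Bytes.toBytes("cf");
  private static final byte[] QUAL = Bytes.toBytes("c1");

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // Assumed input layout: rowkey,version,value
    String[] parts = value.toString().split(",");
    byte[] row = Bytes.toBytes(parts[0]);
    long customTs = Long.parseLong(parts[1]);  // the custom "version"
    KeyValue kv = new KeyValue(row, CF, QUAL, customTs, Bytes.toBytes(parts[2]));
    context.write(new ImmutableBytesWritable(row), kv);
  }
}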

Also, if I update a KeyValue by writing with the same timestamp in the hbase
shell, I can see that the value gets updated.

e.g.  put 't1', 'r1', 'c1', 'value', ts1

Can someone help me understand why the updates are not reflected when using
bulk import?
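
For completeness, the driver side is wired up roughly like the sketch below
(assuming HFileOutputFormat with a KeyValue-emitting mapper like the one
above; the table name 't1' and the paths are placeholders, not my actual
setup):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class BulkLoadDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = Job.getInstance(conf, "bulk load with custom timestamps");
    job.setJarByClass(BulkLoadDriver.class);
    job.setMapperClass(TimestampedKVMapper.class);  // the mapper sketch above
    job.setMapOutputKeyClass(ImmutableBytesWritable.class);
    job.setMapOutputValueClass(KeyValue.class);

    FileInputFormat.addInputPath(job, new Path(args[0]));
    Path hfileDir = new Path(args[1]);
    FileOutputFormat.setOutputPath(job, hfileDir);

    HTable table = new HTable(conf, "t1");
    // Sets the reducer, partitioner and output format so the generated
    // StoreFiles line up with the table's regions.
    HFileOutputFormat.configureIncrementalLoad(job, table);

    if (job.waitForCompletion(true)) {
      // Move the generated HFiles into the regions of 't1'.
      new LoadIncrementalHFiles(conf).doBulkLoad(hfileDir, table);
    }
    table.close();
  }
}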

Re: Updating an HBase KeyValue using bulk upload

Posted by Ted Yu <yu...@gmail.com>.
Do you use HFileOutputFormat in your MapReduce job?

Take a look at problem #1 of HBASE-11772.
BTW, HBASE-11772 was integrated into 0.98 yesterday.

Cheers

On Tue, Sep 9, 2014 at 5:55 AM, Prakhar Srivastava <pr...@gmail.com>
wrote:

> Hi,
>
> I have a MapReduce job which creates a StoreFile that I can load into
> HBase using LoadIncrementalHFiles. I am also using the timestamp component
> of the KeyValue in my mapper to maintain versions in a custom manner. But
> when I try to overwrite the same version using bulk import, it does not
> work: when I perform a Get, it still returns the old version.
>
> Also, if I update a KeyValue by writing with the same timestamp in the
> hbase shell, I can see that the value gets updated.
>
> e.g.  put 't1', 'r1', 'c1', 'value', ts1
>
> Can someone help me understand why the updates are not reflected when
> using bulk import?
>