Posted to user@hbase.apache.org by byte array <by...@gmail.com> on 2013/07/16 17:36:33 UTC
Writing doubles into HBase table from Java client
Hello!
A rather rudimentary question. I have noticed that the HBase shell
sometimes shows double values in a readable format and sometimes as an
unreadable array of 8 escaped bytes.
This happens when I write them into the table from a Java client, e.g.:
org.apache.hadoop.hbase.util.Bytes.toBytes(1.23);
In that case I have problems reading/mapping the value in Pig scripts and
in CDH Beeswax/Hue.
I made a temporary workaround by writing the doubles using Pig's converter
class, e.g.:
org.apache.pig.backend.hadoop.hbase.HBaseBinaryConverter.toBytes(1.23);
By contrast, when I aggregate and store doubles from a Pig script into
some other table, they are readable in the HBase shell (as if they were
strings) and also readable by Pig and other programs.
What is the proper way to write doubles into an HBase table from a
Java client?
Thanks.
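[Editor's note: for reference, the readable/unreadable difference comes down to two encodings. `Bytes.toBytes(double)` stores the raw IEEE-754 bits of the double, big-endian (an assumption based on the HBase `Bytes` source; verify against your version), while a string-based converter stores the decimal text, which is what the shell displays cleanly. The plain-Java sketch below reproduces both encodings with `ByteBuffer`, without any HBase dependency:]

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class DoubleEncodings {

    // Mimics what org.apache.hadoop.hbase.util.Bytes.toBytes(double)
    // produces: the 8 raw IEEE-754 bits, big-endian (ByteBuffer's
    // default byte order).
    static byte[] binaryEncode(double d) {
        return ByteBuffer.allocate(8).putDouble(d).array();
    }

    // String encoding: the human-readable form that shows up cleanly
    // in the HBase shell.
    static byte[] stringEncode(double d) {
        return Double.toString(d).getBytes(StandardCharsets.UTF_8);
    }

    static double binaryDecode(byte[] b) {
        return ByteBuffer.wrap(b).getDouble();
    }

    public static void main(String[] args) {
        byte[] bin = binaryEncode(1.23);
        byte[] str = stringEncode(1.23);
        // Always 8 bytes, mostly non-printable -- the shell escapes them.
        System.out.println(Arrays.toString(bin));
        // "1.23" -- 4 printable bytes.
        System.out.println(new String(str, StandardCharsets.UTF_8));
        System.out.println(binaryDecode(bin) == 1.23); // round trip works
    }
}
```

Either encoding is fine on its own; the trouble in the question above is that the Java writer and the Pig/Hue readers were not agreeing on the same one.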
Re: Writing doubles into HBase table from Java client
Posted by Jimmy Xiang <jx...@cloudera.com>.
HBase itself sees only bytes. It is up to the application to do the
encoding and decoding properly. As for the Pig script, it would help to
check how Pig stores doubles.
Thanks,
Jimmy
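[Editor's note: Jimmy's point can be illustrated in plain Java, with no HBase needed. A cell value is just an opaque byte array, so a reader that assumes the wrong encoding gets nothing useful back. A minimal sketch:]

```java
import java.nio.BufferUnderflowException;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class EncodingContract {
    public static void main(String[] args) {
        // Writer stored the text "1.23"; reader wrongly assumes the
        // 8-byte binary double encoding. "1.23" is only 4 bytes, so
        // the binary decode cannot even complete.
        byte[] asString = "1.23".getBytes(StandardCharsets.UTF_8);
        try {
            ByteBuffer.wrap(asString).getDouble();
        } catch (BufferUnderflowException e) {
            System.out.println("binary decode of string bytes failed: too few bytes");
        }

        // The round trip works only when both sides use the same scheme.
        byte[] asBinary = ByteBuffer.allocate(8).putDouble(1.23).array();
        double back = ByteBuffer.wrap(asBinary).getDouble();
        System.out.println(back == 1.23); // true
    }
}
```

In other words, pick one encoding per column (binary or string) and make every writer and reader, including Pig's converter, use that same one.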
Re: Writing doubles into HBase table from Java client
Posted by Ted Yu <yu...@gmail.com>.
I think the following JIRA is related:
HBASE-8201 Implement serialization strategies
Cheers