Posted to user@phoenix.apache.org by Moustafa Aboul Atta <m....@gmail.com> on 2014/11/24 14:47:39 UTC

Reading Numerical Values Written Through Hive

Hello, I have Hive running on top of HBase through
org.apache.hadoop.hive.hbase.HBaseStorageHandler.

String columns in Hive are mapped to string columns in HBase, which are in
turn mapped to VARCHAR columns in Phoenix. Numeric columns (INT, BIGINT)
are mapped to binary columns in HBase, which are mapped to INTEGER and
BIGINT in Phoenix.

When I query an existing HBase table through Phoenix whose data was
inserted via Hive, strings are read correctly, but numeric values are not.
I'm not sure what the problem is. It isn't overflow, because the types are
configured to match. I suspect an endianness issue but can't find any
concrete lead; values in HBase are stored big-endian.

Any insights would be highly appreciated. Thanks.
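[Editor's note: the garbled values come from a serialization mismatch rather than endianness, as the replies below explain. A minimal sketch of the mismatch, with hypothetical class and method names (not actual Phoenix or HBase API code): HBase's Bytes.toBytes(int) writes plain big-endian two's complement, while Phoenix's signed INTEGER encoding additionally inverts the sign bit so raw byte-wise sorting matches numeric order.]

```java
import java.nio.ByteBuffer;

// Sketch only: names are illustrative, not Phoenix's actual classes.
public class EncodingDemo {

    // What Hive/HBase write: plain big-endian two's complement,
    // equivalent to HBase's Bytes.toBytes(int).
    public static byte[] plainBigEndian(int v) {
        return ByteBuffer.allocate(4).putInt(v).array(); // big-endian by default
    }

    // Phoenix INTEGER encoding: big-endian with the top (sign) bit inverted.
    public static byte[] phoenixSigned(int v) {
        byte[] b = plainBigEndian(v);
        b[0] ^= 0x80;
        return b;
    }

    // Decoding a cell as a Phoenix INTEGER: invert the sign bit back.
    public static int decodePhoenixSigned(byte[] b) {
        byte[] c = b.clone();
        c[0] ^= 0x80;
        return ByteBuffer.wrap(c).getInt();
    }

    public static void main(String[] args) {
        byte[] hiveCell = plainBigEndian(42);              // {0x00, 0x00, 0x00, 0x2A}
        // Reading a Hive-written cell as a Phoenix INTEGER garbles the value:
        System.out.println(decodePhoenixSigned(hiveCell)); // prints -2147483606
    }
}
```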

-- 
Best Regards,
Moustafa Aboul Atta

Re: Reading Numerical Values Written Through Hive

Posted by Moustafa Aboul Atta <m....@gmail.com>.
UNSIGNED_INT and UNSIGNED_LONG do indeed work. Thanks.

On Mon, Nov 24, 2014 at 4:51 PM, Abe Weinograd <ab...@flonet.com> wrote:

> Phoenix serializes INTEGER and BIGINT differently than Bytes.toBytes().
>
> I believe this will only work if you use UNSIGNED_INT and UNSIGNED_LONG,
> but that would require you to have no negative numbers.
>
> Abe
>


-- 
Best Regards,
Moustafa Aboul Atta

Re: Reading Numerical Values Written Through Hive

Posted by Abe Weinograd <ab...@flonet.com>.
Phoenix serializes INTEGER and BIGINT differently than Bytes.toBytes().

I believe this will only work if you use UNSIGNED_INT and UNSIGNED_LONG,
but that would require you to have no negative numbers.

Abe
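
[Editor's note: to illustrate why the unsigned types line up, a minimal sketch with hypothetical names (not actual Phoenix API code): Phoenix's UNSIGNED_INT uses the same plain big-endian encoding as Bytes.toBytes(int), so non-negative Hive-written values round-trip, while a set sign bit has no valid UNSIGNED_INT interpretation.]

```java
import java.nio.ByteBuffer;

// Sketch only: names are illustrative, not Phoenix's actual classes.
public class UnsignedDemo {

    // Bytes.toBytes(int)-style encoding, as written through the Hive
    // storage handler: plain big-endian two's complement.
    public static byte[] toBytes(int v) {
        return ByteBuffer.allocate(4).putInt(v).array();
    }

    // Phoenix UNSIGNED_INT decoding: plain big-endian, only valid for v >= 0.
    public static int decodeUnsignedInt(byte[] b) {
        int v = ByteBuffer.wrap(b).getInt();
        if (v < 0) {
            throw new IllegalArgumentException("sign bit set: not a valid UNSIGNED_INT");
        }
        return v;
    }

    public static void main(String[] args) {
        System.out.println(decodeUnsignedInt(toBytes(42))); // prints 42
        // decodeUnsignedInt(toBytes(-5)) would throw: negative values need
        // the signed INTEGER type, whose byte layout Hive does not produce.
    }
}
```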
