Posted to user@phoenix.apache.org by local host <un...@gmail.com> on 2014/03/22 00:49:19 UTC

Analyse the Phoenix-inserted data using Pig

Hey All,

*How can I analyze HBase data, which was inserted via the Phoenix JDBC
driver, using Pig?*
I wish to do batch processing on the HBase data using Pig and correct the
counters we maintain.

In short, I want to know what extra work Phoenix does in an HBase table at
insertion time that requires extra steps when I analyze the table from
other MapReduce tools such as Hive, Pig, Drill, etc.
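
For concreteness, this is roughly what I am trying today with Pig's stock
HBaseStorage (the table and column names below are placeholders, and I am
assuming Phoenix's default column family "0"):

    -- Load the raw bytes of a Phoenix-managed table with plain HBaseStorage.
    -- Values come back as Phoenix-serialized bytearrays, not typed fields.
    raw = LOAD 'hbase://MY_TABLE'
          USING org.apache.pig.backend.hadoop.hbase.HBaseStorage(
              '0:COUNTER', '-loadKey true')
          AS (pk:bytearray, counter:bytearray);
    DUMP raw;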


--UniLocal

Re: Analyse the Phoenix-inserted data using Pig

Posted by Localhost shell <un...@gmail.com>.
Hey Ravi and James,

Do you have any updates on PhoenixHbaseLoader?
I am eager to test it out.

Thanks


On Mon, Mar 24, 2014 at 9:21 AM, local host
<un...@gmail.com> wrote:

> Thanks Ravi for the update.
>
> It would be great if you could share some more info on the expected date
> for PhoenixHbaseLoader.
>
> I am eager to use Phoenix in my current project, but I want to know about
> the extra work done by Phoenix while inserting records into an HBase
> table, so that I can freely use other batch analysis tools such as Pig,
> Impala, and Hive.
> *In short, I want to know whether Phoenix and the other tools are
> interoperable.*
>
>
>
>
> On Fri, Mar 21, 2014 at 7:37 PM, Ravi Kiran <ma...@gmail.com> wrote:
>
>> Hi
>>
>>    We are currently working on a PhoenixHbaseLoader to load data
>> from HBase using Pig.
>>
>> Regards
>> Ravi
>>
>>
>> On Sat, Mar 22, 2014 at 5:19 AM, local host <
>> universal.localhost@gmail.com> wrote:
>>
>>> Hey All,
>>>
>>> *How can I analyze HBase data, which was inserted via the Phoenix JDBC
>>> driver, using Pig?*
>>> I wish to do batch processing on the HBase data using Pig and correct
>>> the counters we maintain.
>>>
>>> In short, I want to know what extra work Phoenix does in an HBase table
>>> at insertion time that requires extra steps when I analyze the table
>>> from other MapReduce tools such as Hive, Pig, Drill, etc.
>>>
>>>
>>> --UniLocal
>>>
>>
>>
>

Re: Analyse the Phoenix-inserted data using Pig

Posted by local host <un...@gmail.com>.
Thanks Ravi for the update.

It would be great if you could share some more info on the expected date
for PhoenixHbaseLoader.

I am eager to use Phoenix in my current project, but I want to know about
the extra work done by Phoenix while inserting records into an HBase
table, so that I can freely use other batch analysis tools such as Pig,
Impala, and Hive.
*In short, I want to know whether Phoenix and the other tools are
interoperable.*
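
From what I have pieced together so far (my own reading, so please correct
me if I am wrong): Phoenix serializes each value in its own binary format
(e.g., an INTEGER is stored big-endian with the sign bit flipped so rows
sort correctly), concatenates composite primary keys into the row key, and
adds an empty KeyValue (qualifier "_0") to each row so rows whose non-PK
columns are all null stay visible. A sketch of the consequence for a plain
Pig read, again with placeholder names:

    -- COUNTER is a Phoenix INTEGER: 4 bytes with the sign bit flipped.
    raw = LOAD 'hbase://MY_TABLE'
          USING org.apache.pig.backend.hadoop.hbase.HBaseStorage(
              '0:COUNTER', '-loadKey true')
          AS (pk:bytearray, counter:bytearray);
    -- A naive cast of the raw cell bytes likely yields the wrong value;
    -- the bytes must be decoded the same way Phoenix encoded them.
    bad = FOREACH raw GENERATE pk, (int) counter;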




On Fri, Mar 21, 2014 at 7:37 PM, Ravi Kiran <ma...@gmail.com> wrote:

> Hi
>
>    We are currently working on a PhoenixHbaseLoader to load data
> from HBase using Pig.
>
> Regards
> Ravi
>
>
> On Sat, Mar 22, 2014 at 5:19 AM, local host <universal.localhost@gmail.com
> > wrote:
>
>> Hey All,
>>
>> *How can I analyze HBase data, which was inserted via the Phoenix JDBC
>> driver, using Pig?*
>> I wish to do batch processing on the HBase data using Pig and correct
>> the counters we maintain.
>>
>> In short, I want to know what extra work Phoenix does in an HBase table
>> at insertion time that requires extra steps when I analyze the table
>> from other MapReduce tools such as Hive, Pig, Drill, etc.
>>
>>
>> --UniLocal
>>
>
>

Re: Analyse the Phoenix-inserted data using Pig

Posted by Ravi Kiran <ma...@gmail.com>.
Hi

   We are currently working on a PhoenixHbaseLoader to load data
from HBase using Pig.
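
To give a feel for the direction (the class and package names here are my
working assumptions while the loader is still in development, so the final
interface may differ), usage from Pig should look roughly like:

    -- Hypothetical sketch: load a Phoenix table through the planned loader,
    -- passing the ZooKeeper quorum of the HBase cluster.
    rows = LOAD 'hbase://table/MY_TABLE'
           USING org.apache.phoenix.pig.PhoenixHbaseLoader('zkhost:2181');
    -- The idea is that columns arrive already decoded into Pig types,
    -- instead of raw Phoenix-serialized bytes.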

Regards
Ravi


On Sat, Mar 22, 2014 at 5:19 AM, local host
<un...@gmail.com> wrote:

> Hey All,
>
> *How can I analyze HBase data, which was inserted via the Phoenix JDBC
> driver, using Pig?*
> I wish to do batch processing on the HBase data using Pig and correct
> the counters we maintain.
>
> In short, I want to know what extra work Phoenix does in an HBase table
> at insertion time that requires extra steps when I analyze the table
> from other MapReduce tools such as Hive, Pig, Drill, etc.
>
>
> --UniLocal
>