Posted to user@pig.apache.org by Alberto Cordioli <co...@gmail.com> on 2012/09/12 11:35:34 UTC

IllegalArgumentException: Not a host:port pair - Pig 0.10.0 with HBase

Hi all,

I'm currently working with Pig 0.10.0. I'd like to load some data from
an HBase table, but I've run into some problems. When I try to load
the data, it seems to work:

grunt> raw = LOAD 'hbase://table_test' USING
org.apache.pig.backend.hadoop.hbase.HBaseStorage('d:data1', '-loadKey
true -limit 5') as (hash:bytearray, data1:chararray);
2012-09-12 11:27:48,213 [main] INFO
org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
family:descriptor filters with values d:data1
2012-09-12 11:27:48,216 [main] INFO
org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
family:descriptor filters with values d:data1
2012-09-12 11:27:48,264 [main] INFO
org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
family:descriptor filters with values d:data1
2012-09-12 11:27:48,267 [main] INFO
org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
family:descriptor filters with values d:data1
2012-09-12 11:27:48,267 [main] INFO
org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
family:descriptor filters with values d:data1

but when I dump the data I get this exception:

Caused by: java.lang.IllegalArgumentException: Not a host:port pair:

My configuration is:
Apache Hadoop 1.0.3
HBase 0.94.1
Pig 0.10.0

In the pig script file I set the two environment variables like this:
HADOOP_CONF_DIR=/usr/local/hadoop/conf
HBASE_CONF_DIR=/usr/local/hbase/conf
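
In case it matters, I'm assuming they have to be exported so that
bin/pig actually picks them up; roughly like this (myscript.pig is just
a placeholder name for my script):

export HADOOP_CONF_DIR=/usr/local/hadoop/conf
export HBASE_CONF_DIR=/usr/local/hbase/conf
# launch Pig in MapReduce mode with the configuration above
pig myscript.pig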


Could you help me? I don't know how to solve this issue.



Thanks,
Alberto




-- 
Alberto Cordioli

Re: IllegalArgumentException: Not a host:port pair - Pig 0.10.0 with HBase

Posted by Alberto Cordioli <co...@gmail.com>.
OK, I solved the problem by copying the required jars into the $HADOOP_HOME/lib folder.
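
For the record, this is roughly what I copied; I'm guessing at the
exact jar names (they are the HBase, ZooKeeper and protobuf jars
Cheolsoo mentioned), so adjust them to your release:

# put the HBase client jars on Hadoop's classpath
cp /usr/local/hbase/hbase-0.94.1.jar $HADOOP_HOME/lib/
cp /usr/local/hbase/lib/zookeeper-*.jar $HADOOP_HOME/lib/
cp /usr/local/hbase/lib/protobuf-java-*.jar $HADOOP_HOME/lib/
# the MapReduce daemons typically need a restart to pick up new jars in lib/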


Thanks,
Alberto



-- 
Alberto Cordioli

Re: IllegalArgumentException: Not a host:port pair - Pig 0.10.0 with HBase

Posted by Alberto Cordioli <co...@gmail.com>.
Cheolsoo, doing so I get a ClassNotFoundException:
ERROR 2998: Unhandled internal error.
org/apache/hadoop/hbase/filter/WritableByteArrayComparable

That's very strange, since I'm sure that class is on the CLASSPATH.
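For what it's worth, I assume one can at least confirm the class is
inside the HBase jar with something like:

# check that the missing filter class is really in the HBase jar
jar tf /usr/local/hbase/hbase-0.94.1.jar | grep WritableByteArrayComparable
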
Are you using Hadoop 1.0.0 or 1.0.1?


Alberto




-- 
Alberto Cordioli

Re: IllegalArgumentException: Not a host:port pair - Pig 0.10.0 with HBase

Posted by Cheolsoo Park <ch...@cloudera.com>.
Hi Alberto,

Here is my local setup that works. Please adjust it to your environment.

1) I started hbase-0.94 in standalone mode.
2) I downloaded hadoop-1.0.0.
3) I built "pig-withouthadoop.jar" from the source and ran the following
commands:

export HADOOP_HOME=/home/cheolsoo/workspace/hadoop-1.0.1
export HBASE_HOME=/home/cheolsoo/workspace/hbase-0.94.1
export ZOOKEEPER_HOME=/home/cheolsoo/workspace/hbase-0.94.1/lib
export PIG_CLASSPATH=/home/cheolsoo/workspace/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar
./bin/pig -x local

Please note that by setting HBASE_HOME, Pig picks up the hbase.jar in that
directory. I also set ZOOKEEPER_HOME to hbase-0.94.1/lib because that's
where zookeeper.jar lives. Lastly, I added protobuf-java-2.4.0a.jar
to PIG_CLASSPATH because I found that HBaseStorage fails with a
ClassNotFoundException without it. Basically, everything HBaseStorage
needs must be on the classpath at runtime. If these jars are already on
the classpath by default (for example, because you installed HBase from
an RPM package), you won't have to set them explicitly.
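
If it's easier in your environment, the same jars can also go straight
onto PIG_CLASSPATH instead of relying on HBASE_HOME and ZOOKEEPER_HOME;
roughly like this (the jar names are from my tree, so adjust them to
your release):

export HBASE_HOME=/home/cheolsoo/workspace/hbase-0.94.1
# hbase, zookeeper and protobuf jars on the client classpath in one go
export PIG_CLASSPATH=$HBASE_HOME/hbase-0.94.1.jar:$HBASE_HOME/lib/zookeeper-3.4.3.jar:$HBASE_HOME/lib/protobuf-java-2.4.0a.jar
./bin/pig -x local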

I am able to successfully load columns from an HBase table.

Thanks,
Cheolsoo


Re: IllegalArgumentException: Not a host:port pair - Pig 0.10.0 with HBase

Posted by Alberto Cordioli <co...@gmail.com>.
Thanks Cheolsoo.
I've already seen that link, but it's not entirely clear to me how Pig
makes use of the HBase jars.
Even if I use Pig in MapReduce mode (fully or pseudo-distributed), it
still needs HBase on the client, right?

In my particular case I use Pig on a fully distributed cluster (but I
also tested it in a pseudo-distributed environment), running Pig
interactively (using the pig script in the bin folder). Should I launch
Pig with the java command instead?
Could you please explain how to set up Pig correctly for my case
(HBase 0.94 and Hadoop 1.0.3)?

Thank you very much,
Alberto




-- 
Alberto Cordioli

Re: IllegalArgumentException: Not a host:port pair - Pig 0.10.0 with HBase

Posted by Cheolsoo Park <ch...@cloudera.com>.
Hi Alberto,

Are you running in local mode using "pig.jar", or in mapreduce mode using
"pig-withouthadoop.jar"?

The error that you're seeing can happen when there is a version mismatch
between the HBase client and server. The pig.jar by default bundles
hbase-0.90, so if you use pig.jar against hbase-0.94, it won't work.
There is a JIRA about running pig.jar against hbase-0.94:
https://issues.apache.org/jira/browse/PIG-2891
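
A quick way to see which situation you're in is to check what HBase
classes your Pig jar bundles and what version the cluster is actually
running; I'd expect something like this to show it:

# does the Pig jar bundle its own copy of the HBase classes?
jar tf pig-0.10.0.jar | grep 'org/apache/hadoop/hbase/' | head
# which HBase version is installed on the client/cluster?
hbase version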

Thanks,
Cheolsoo


On Wed, Sep 12, 2012 at 2:35 AM, Alberto Cordioli <
cordioli.alberto@gmail.com> wrote:

> Hi all,
>
> I'm currently working with Pig 0.10.0. I'd like to load some data from
> an HBase table, but I encountered some problems. When I try to load
> the data it seems to work:
>
> grunt> raw = LOAD 'hbase://table_test' USING
> org.apache.pig.backend.hadoop.hbase.HBaseStorage('d:data1', '-loadKey
> true -limit 5') as (hash:bytearray, data1:chararray);
> 2012-09-12 11:27:48,213 [main] INFO
> org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
> family:descriptor filters with values d:data1
> 2012-09-12 11:27:48,216 [main] INFO
> org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
> family:descriptor filters with values d:data1
> 2012-09-12 11:27:48,264 [main] INFO
> org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
> family:descriptor filters with values d:data1
> 2012-09-12 11:27:48,267 [main] INFO
> org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
> family:descriptor filters with values d:data1
> 2012-09-12 11:27:48,267 [main] INFO
> org.apache.pig.backend.hadoop.hbase.HBaseStorage - Adding
> family:descriptor filters with values d:data1
>
> but when I dump the data I get this exception:
>
> Caused by: java.lang.IllegalArgumentException: Not a host:port pair:
>
> My configuration is:
> Apache Hadoop 1.0.3
> HBase 0.94.1
> Pig 0.10.0
>
> In the pig script file I set the two env variable in this way:
> HADOOP_CONF_DIR=/usr/local/hadoop/conf
> HBASE_CONF_DIR=/usr/local/hbase/conf
>
>
> Could you help me? I don't know how can I solve this issue.
>
>
>
> Thanks,
> Alberto
>
>
>
>
> --
> Alberto Cordioli
>