Posted to user@pig.apache.org by Aryeh Berkowitz <ar...@iswcorp.com> on 2009/12/23 17:40:57 UTC

Pig Setup

I followed the instructions on the Pig Setup page, but I can't seem to attach to my HDFS cluster. Is there a configuration file or an environment variable that I'm missing?

Re: Pig Setup

Posted by Jeff Zhang <zj...@gmail.com>.
You need to put the hadoop conf folder on your CLASSPATH.
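
With the bin/pig script, that can look like the following sketch; both paths here are assumptions, so substitute your own hadoop conf directory and pig jar:

```shell
# Sketch: put the pig jar and the hadoop conf directory on PIG_CLASSPATH.
# Both paths are assumptions; adjust to your installation.
HADOOP_CONF_DIR=/etc/hadoop/conf
PIG_JAR=/usr/local/apps/pig-0.5.0/pig-0.5.0-core.jar
export PIG_CLASSPATH="$PIG_JAR:$HADOOP_CONF_DIR"
echo "$PIG_CLASSPATH"
```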


Jeff Zhang

On Wed, Dec 23, 2009 at 8:40 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:

> I followed the instructions on the Pig Setup page, but I can't seem to
> attach to my HDFS cluster. Is there a configuration file or an environment
> variable that I'm missing?
>

Re: Pig Setup

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
You will want to run ant to compile the HBase support patch (or just
wait until 0.6, which will have it).

Not sure what the ant problem is; it might be that the "release" doesn't
include some of the build files. If you are comfortable with this sort
of thing, try checking out the source from svn:
http://svn.apache.org/repos/asf/hadoop/pig/tags/release-0.5.0/

Unless this really is an optional ant component that's missing --
though I don't recall installing anything other than vanilla ant and
ivy to get this stuff to compile.
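
A sketch of that checkout-and-build path, assuming svn, ant, and ivy are installed; the DO_BUILD guard is only there so the sketch does not kick off a network checkout unless you ask for it:

```shell
# Sketch: build pig.jar from the 0.5.0 release tag (URL from the thread).
# Set DO_BUILD=1 to actually run the checkout and build.
SRC=http://svn.apache.org/repos/asf/hadoop/pig/tags/release-0.5.0/
if [ "${DO_BUILD:-0}" = "1" ]; then
  svn co "$SRC" pig-0.5.0-src
  (cd pig-0.5.0-src && ant)   # drops pig.jar into the source directory
else
  echo "dry run; set DO_BUILD=1 to check out $SRC and build"
fi
```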

-D

On Wed, Dec 23, 2009 at 9:52 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
> ls / worked. Thanks! I don't have a pig.jar, I only have pig-0.5.0-core.jar but it seemed to have worked.
>
> I didn't run ant. Do I need to? Ant fails when I run it.
>
> BUILD FAILED
> /usr/local/apps/pig-0.5.0/build.xml:237: Problem: failed to create task or type jjtree
> Cause: the class org.apache.tools.ant.taskdefs.optional.javacc.JJTree was not found.
>        This looks like one of Ant's optional components.
> Action: Check that the appropriate optional JAR exists in
>        -/usr/share/ant/lib
>        -/root/.ant/lib
>        -a directory added on the command line with the -lib argument
>
>
>
> -----Original Message-----
> From: Dmitriy Ryaboy [mailto:dvryaboy@gmail.com]
> Sent: Wednesday, December 23, 2009 12:41 PM
> To: pig-user@hadoop.apache.org
> Subject: Re: Pig Setup
>
> Hm, interesting. So it looks like you are now able to connect to HDFS
> fine, but "ls" on an empty string dies. Try "ls /" (ls on an empty
> string works for me, but I'm on trunk, which is halfway between 0.6
> and 0.7).
>
> If you continue to have trouble:
>
> By pig.jar I mean the pig.jar that gets dropped into your pig
> directory (such as /usr/local/apps/pig-0.5.0) when you run ant.
>
> Do you know what's up with the "attempt to override final parameter"
> messages? This is a Hadoop thing, not a Pig thing. Are you able to use
> Hadoop? Seems like something may be misconfigured.
>
> Can you send (or paste to pastebin) the logfile in
> /usr/local/apps/pig-0.5.0/pig_1261587982689.log ?
>
> BTW, undocumented (so far) tip: you can use pig.logfile in
> pig.properties to specify an alternative directory for your logfiles,
> if you don't like polluting the working directory with them.
>
> As far as supporting HBase, you will want to apply the patch from
> https://issues.apache.org/jira/browse/PIG-970 and rebuild.
>
> -D
>
> On Wed, Dec 23, 2009 at 9:12 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
>> Dmitriy,
>> Thanks for your reply. I added the PIG_CLASSPATH and got somewhere, but I'm still getting errors, which I put here: http://pastebin.com/d213a21b9.
>>
>> Also, when you say pig.jar, I assume that's pig-0.5.0-core.jar in the latest version?
>>
>> My ultimate goal is to be able to load an HBase table. Any help with that would be appreciated.
>>
>> Aryeh
>>
>> -----Original Message-----
>> From: Dmitriy Ryaboy [mailto:dvryaboy@gmail.com]
>> Sent: Wednesday, December 23, 2009 11:51 AM
>> To: pig-user@hadoop.apache.org
>> Subject: Re: Pig Setup
>>
>> Hi Aryeh,
>> The most common cause of this is not having the hadoop conf directory
>> in your classpath.
>>
>> If you are using the bin/pig script (as opposed to using Pig through
>> Java) you can put both the conf directory and pig.jar in
>> PIG_CLASSPATH, for example I have:
>>
>> export PIG_CLASSPATH=/home/dvryaboy/src/pig/pig.jar:/etc/hadoop-0.20/conf.pseudo/
>>
>> If this doesn't help, please send the exact error you are getting,
>> your pig version, and the relevant environment information (all
>> classpaths, etc).
>>
>> Cheers
>> -D
>>
>> On Wed, Dec 23, 2009 at 8:40 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
>>> I followed the instructions on the Pig Setup page, but I can't seem to attach to my HDFS cluster. Is there a configuration file or an environment variable that I'm missing?
>>>
>>
>

RE: Pig Setup

Posted by Aryeh Berkowitz <ar...@iswcorp.com>.
ls / worked. Thanks! I don't have a pig.jar, I only have pig-0.5.0-core.jar but it seemed to have worked. 

I didn't run ant. Do I need to? Ant fails when I run it. 

BUILD FAILED
/usr/local/apps/pig-0.5.0/build.xml:237: Problem: failed to create task or type jjtree
Cause: the class org.apache.tools.ant.taskdefs.optional.javacc.JJTree was not found.
        This looks like one of Ant's optional components.
Action: Check that the appropriate optional JAR exists in
        -/usr/share/ant/lib
        -/root/.ant/lib
        -a directory added on the command line with the -lib argument
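
That jjtree failure generally means ant's optional javacc support can't find a javacc jar. A sketch of the usual fixes; the package names vary by distro and are assumptions here:

```shell
# The jjtree ant task needs a javacc jar on ant's library path. Either install
# the distro package that provides it (package names are assumptions):
#   Debian/Ubuntu: apt-get install ant-optional
#   Red Hat:       yum install ant-javacc
# or drop a javacc jar into a directory ant searches for optional components:
mkdir -p "$HOME/.ant/lib"
# cp /path/to/javacc.jar "$HOME/.ant/lib/"   # hypothetical jar location
```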

 

-----Original Message-----
From: Dmitriy Ryaboy [mailto:dvryaboy@gmail.com] 
Sent: Wednesday, December 23, 2009 12:41 PM
To: pig-user@hadoop.apache.org
Subject: Re: Pig Setup

Hm, interesting. So it looks like you are now able to connect to HDFS
fine, but "ls" on an empty string dies. Try "ls /" (ls on an empty
string works for me, but I'm on trunk, which is halfway between 0.6
and 0.7).

If you continue to have trouble:

By pig.jar I mean the pig.jar that gets dropped into your pig
directory (such as /usr/local/apps/pig-0.5.0) when you run ant.

Do you know what's up with the "attempt to override final parameter"
messages? This is a Hadoop thing, not a Pig thing. Are you able to use
Hadoop? Seems like something may be misconfigured.

Can you send (or paste to pastebin) the logfile in
/usr/local/apps/pig-0.5.0/pig_1261587982689.log ?

BTW, undocumented (so far) tip: you can use pig.logfile in
pig.properties to specify an alternative directory for your logfiles,
if you don't like polluting the working directory with them.

As far as supporting HBase, you will want to apply the patch from
https://issues.apache.org/jira/browse/PIG-970 and rebuild.

-D

On Wed, Dec 23, 2009 at 9:12 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
> Dmitriy,
> Thanks for your reply. I added the PIG_CLASSPATH and got somewhere, but I'm still getting errors, which I put here: http://pastebin.com/d213a21b9.
>
> Also, when you say pig.jar, I assume that's pig-0.5.0-core.jar in the latest version?
>
> My ultimate goal is to be able to load an HBase table. Any help with that would be appreciated.
>
> Aryeh
>
> -----Original Message-----
> From: Dmitriy Ryaboy [mailto:dvryaboy@gmail.com]
> Sent: Wednesday, December 23, 2009 11:51 AM
> To: pig-user@hadoop.apache.org
> Subject: Re: Pig Setup
>
> Hi Aryeh,
> The most common cause of this is not having the hadoop conf directory
> in your classpath.
>
> If you are using the bin/pig script (as opposed to using Pig through
> Java) you can put both the conf directory and pig.jar in
> PIG_CLASSPATH, for example I have:
>
> export PIG_CLASSPATH=/home/dvryaboy/src/pig/pig.jar:/etc/hadoop-0.20/conf.pseudo/
>
> If this doesn't help, please send the exact error you are getting,
> your pig version, and the relevant environment information (all
> classpaths, etc).
>
> Cheers
> -D
>
> On Wed, Dec 23, 2009 at 8:40 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
>> I followed the instructions on the Pig Setup page, but I can't seem to attach to my HDFS cluster. Is there a configuration file or an environment variable that I'm missing?
>>
>

Re: Pig Setup

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Hm, interesting. So it looks like you are now able to connect to HDFS
fine, but "ls" on an empty string dies. Try "ls /" (ls on an empty
string works for me, but I'm on trunk, which is halfway between 0.6
and 0.7).

If you continue to have trouble:

By pig.jar I mean the pig.jar that gets dropped into your pig
directory (such as /usr/local/apps/pig-0.5.0) when you run ant.

Do you know what's up with the "attempt to override final parameter"
messages? This is a Hadoop thing, not a Pig thing. Are you able to use
Hadoop? Seems like something may be misconfigured.

Can you send (or paste to pastebin) the logfile in
/usr/local/apps/pig-0.5.0/pig_1261587982689.log ?

BTW, undocumented (so far) tip: you can use pig.logfile in
pig.properties to specify an alternative directory for your logfiles,
if you don't like polluting the working directory with them.
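
For example, a pig.properties entry along these lines; the directory is an assumption:

```
# pig.properties: send Pig logfiles somewhere other than the working directory
pig.logfile=/var/log/pig/
```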

As far as supporting HBase, you will want to apply the patch from
https://issues.apache.org/jira/browse/PIG-970 and rebuild.
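
A sketch of that apply-and-rebuild step; the source location and patch filename below are hypothetical, so download the actual attachment from the JIRA issue and point the variables at your checkout:

```shell
# Sketch: apply the PIG-970 patch to a Pig source tree and rebuild.
# PIG_SRC and PATCH_FILE are assumptions; adjust to your checkout and
# to the patch downloaded from the JIRA issue.
PIG_SRC=${PIG_SRC:-$HOME/src/pig-0.5.0}
PATCH_FILE=${PATCH_FILE:-$PWD/PIG-970.patch}
if [ -d "$PIG_SRC" ] && [ -f "$PATCH_FILE" ]; then
  (cd "$PIG_SRC" && patch -p0 < "$PATCH_FILE" && ant)
else
  echo "need a Pig source tree at $PIG_SRC and a patch at $PATCH_FILE"
fi
```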

-D

On Wed, Dec 23, 2009 at 9:12 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
> Dmitriy,
> Thanks for your reply. I added the PIG_CLASSPATH and got somewhere, but I'm still getting errors, which I put here: http://pastebin.com/d213a21b9.
>
> Also, when you say pig.jar, I assume that's pig-0.5.0-core.jar in the latest version?
>
> My ultimate goal is to be able to load an HBase table. Any help with that would be appreciated.
>
> Aryeh
>
> -----Original Message-----
> From: Dmitriy Ryaboy [mailto:dvryaboy@gmail.com]
> Sent: Wednesday, December 23, 2009 11:51 AM
> To: pig-user@hadoop.apache.org
> Subject: Re: Pig Setup
>
> Hi Aryeh,
> The most common cause of this is not having the hadoop conf directory
> in your classpath.
>
> If you are using the bin/pig script (as opposed to using Pig through
> Java) you can put both the conf directory and pig.jar in
> PIG_CLASSPATH, for example I have:
>
> export PIG_CLASSPATH=/home/dvryaboy/src/pig/pig.jar:/etc/hadoop-0.20/conf.pseudo/
>
> If this doesn't help, please send the exact error you are getting,
> your pig version, and the relevant environment information (all
> classpaths, etc).
>
> Cheers
> -D
>
> On Wed, Dec 23, 2009 at 8:40 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
>> I followed the instructions on the Pig Setup page, but I can't seem to attach to my HDFS cluster. Is there a configuration file or an environment variable that I'm missing?
>>
>

RE: Pig Setup

Posted by Aryeh Berkowitz <ar...@iswcorp.com>.
Dmitriy,
Thanks for your reply. I added the PIG_CLASSPATH and got somewhere, but I'm still getting errors, which I put here: http://pastebin.com/d213a21b9.

Also, when you say pig.jar, I assume that's pig-0.5.0-core.jar in the latest version?

My ultimate goal is to be able to load an HBase table. Any help with that would be appreciated.

Aryeh

-----Original Message-----
From: Dmitriy Ryaboy [mailto:dvryaboy@gmail.com] 
Sent: Wednesday, December 23, 2009 11:51 AM
To: pig-user@hadoop.apache.org
Subject: Re: Pig Setup

Hi Aryeh,
The most common cause of this is not having the hadoop conf directory
in your classpath.

If you are using the bin/pig script (as opposed to using Pig through
Java) you can put both the conf directory and pig.jar in
PIG_CLASSPATH, for example I have:

export PIG_CLASSPATH=/home/dvryaboy/src/pig/pig.jar:/etc/hadoop-0.20/conf.pseudo/

If this doesn't help, please send the exact error you are getting,
your pig version, and the relevant environment information (all
classpaths, etc).

Cheers
-D

On Wed, Dec 23, 2009 at 8:40 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
> I followed the instructions on the Pig Setup page, but I can't seem to attach to my HDFS cluster. Is there a configuration file or an environment variable that I'm missing?
>

Re: Pig Setup

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Hi Aryeh,
The most common cause of this is not having the hadoop conf directory
in your classpath.

If you are using the bin/pig script (as opposed to using Pig through
Java) you can put both the conf directory and pig.jar in
PIG_CLASSPATH, for example I have:

export PIG_CLASSPATH=/home/dvryaboy/src/pig/pig.jar:/etc/hadoop-0.20/conf.pseudo/

If this doesn't help, please send the exact error you are getting,
your pig version, and the relevant environment information (all
classpaths, etc).

Cheers
-D

On Wed, Dec 23, 2009 at 8:40 AM, Aryeh Berkowitz <ar...@iswcorp.com> wrote:
> I followed the instructions on the Pig Setup page, but I can't seem to attach to my HDFS cluster. Is there a configuration file or an environment variable that I'm missing?
>