Posted to user@pig.apache.org by Chris Allen <ch...@affinesystems.com> on 2011/08/16 00:37:54 UTC

Cannot connect to pseudo-distributed or full (EC2) hadoop cluster from pig on startup, fails on EOFException

Despite any amount of finagling with the classpath, I can't get Pig
to connect to my local pseudo-distributed Hadoop instance NOR my
cluster on EC2. The EC2 cluster is CDH 0.20.2; the local
pseudo-distributed instance is 0.20.203. Pig is 0.9.1.

Both give the "cannot connect to localhost:9000" exception.

Does anyone have any idea? I've scoured Google and gone through
everything I could find in terms of solutions, most of which involved
using the withouthadoop.jar and setting the classpath; I've already
tried a billion permutations of that.

If anyone has any advice or can point me to documentation, I would be
immensely grateful, thank you.

--- Chris

Examples of various launch commands (dry-run output), all of which
fail while trying to connect to localhost:9000 (the namenode):

# Built locally with ant
java -cp pig.jar:/home/callen/hadoop/conf org.apache.pig.Main

# Bundled
java -cp /home/callen/pig/pig-0.9.0-core.jar:/home/callen/hadoop/conf/
org.apache.pig.Main

# Don't ask
/usr/lib/jvm/java-6-sun/bin/java -Xmx1000m
-Dpig.log.dir=/home/callen/pig/bin/../logs -Dpig.log.file=pig.log
-Dpig.home.dir=/home/callen/pig/bin/..
-Dpig.root.logger=INFO,console,DRFA -classpath
/home/callen/pig/bin/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/callen/pig/bin/../build/classes:/home/callen/pig/bin/../build/test/classes:/home/callen/pig/bin/../pig-0.9.0-core.jar:/home/callen/pig/bin/../lib/automaton.jar:/home/callen/pig/build/ivy/lib/Pig/antlr-runtime-3.2.jar:/home/callen/hadoop/conf:/home/callen/pig/pig-withouthadoop.jar:/home/callen/hadoop/hadoop-tools-0.20.203.0.jar:/home/callen/hadoop/hadoop-test-0.20.203.0.jar:/home/callen/hadoop/hadoop-examples-0.20.203.0.jar:/home/callen/hadoop/hadoop-core-0.20.203.0.jar:/home/callen/hadoop/hadoop-ant-0.20.203.0.jar:/home/callen/hadoop/lib/xmlenc-0.52.jar:/home/callen/hadoop/lib/slf4j-log4j12-1.4.3.jar:/home/callen/hadoop/lib/slf4j-api-1.4.3.jar:/home/callen/hadoop/lib/servlet-api-2.5-20081211.jar:/home/callen/hadoop/lib/oro-2.0.8.jar:/home/callen/hadoop/lib/mockito-all-1.8.5.jar:/home/callen/hadoop/lib/log4j-1.2.15.jar:/home/callen/hadoop/lib/kfs-0.2.2.jar:/home/callen/hadoop/lib/junit-4.5.jar:/home/callen/hadoop/lib/jsch-0.1.42.jar:/home/callen/hadoop/lib/jetty-util-6.1.26.jar:/home/callen/hadoop/lib/jetty-6.1.26.jar:/home/callen/hadoop/lib/jets3t-0.6.1.jar:/home/callen/hadoop/lib/jasper-runtime-5.5.12.jar:/home/callen/hadoop/lib/jasper-compiler-5.5.12.jar:/home/callen/hadoop/lib/jackson-mapper-asl-1.0.1.jar:/home/callen/hadoop/lib/jackson-core-asl-1.0.1.jar:/home/callen/hadoop/lib/hsqldb-1.8.0.10.jar:/home/callen/hadoop/lib/core-3.1.1.jar:/home/callen/hadoop/lib/commons-net-1.4.1.jar:/home/callen/hadoop/lib/commons-math-2.1.jar:/home/callen/hadoop/lib/commons-logging-api-1.0.4.jar:/home/callen/hadoop/lib/commons-logging-1.1.1.jar:/home/callen/hadoop/lib/commons-lang-2.4.jar:/home/callen/hadoop/lib/commons-httpclient-3.0.1.jar:/home/callen/hadoop/lib/commons-el-1.0.jar:/home/callen/hadoop/lib/commons-digester-1.8.jar:/home/callen/hadoop/lib/commons-daemon-1.0.1.jar:/home/callen/hadoop/lib/commons-configuration-1.6.jar:/home/callen/hadoop/lib/commons-collections-3.2.1.jar:/home/callen/hadoop/lib/commons-codec-1.4.jar:/home/callen/hadoop/lib/commons-cli-1.2.jar:/home/callen/hadoop/lib/commons-beanutils-core-1.8.0.jar:/home/callen/hadoop/lib/commons-beanutils-1.7.0.jar:/home/callen/hadoop/lib/aspectjtools-1.6.5.jar:/home/callen/hadoop/lib/aspectjrt-1.6.5.jar:
org.apache.pig.Main
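The long classpath above can be assembled mechanically instead of by hand; a
sketch, with HADOOP_HOME and PIG_HOME as hypothetical stand-ins for the
paths used in the commands above:

```shell
#!/bin/sh
# Sketch: build a Pig classpath from the cluster's own Hadoop install,
# leading with pig-withouthadoop.jar and the Hadoop conf directory.
# HADOOP_HOME/PIG_HOME defaults are hypothetical stand-ins.
HADOOP_HOME=${HADOOP_HOME:-/home/callen/hadoop}
PIG_HOME=${PIG_HOME:-/home/callen/pig}

CP="$PIG_HOME/pig-withouthadoop.jar:$HADOOP_HOME/conf"
for jar in "$HADOOP_HOME"/hadoop-core-*.jar "$HADOOP_HOME"/lib/*.jar; do
  CP="$CP:$jar"
done
echo "$CP"
# java -Xmx1000m -cp "$CP" org.apache.pig.Main    # the actual launch
```

This keeps the cluster's own hadoop-core jar on the classpath rather than
whatever Hadoop version Pig was bundled against.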

Re: Cannot connect to pseudo-distributed or full (EC2) hadoop cluster from pig on startup, fails on EOFException

Posted by Chris Allen <ch...@affinesystems.com>.
I talked to a Yahoo engineer about it; he found it just as baffling.
By bisecting various classpaths I produced a differential of what
makes it work.

0.9.1 snapshot didn't work, 0.9.1-core didn't work, 0.8.1 didn't work, etc.

pig-withouthadoop.jar @ 0.9.1 worked.

I will try a build with modified ivy dependencies to test your idea.
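Dmitriy's ivy suggestion (quoted below) amounts to repinning the Hadoop
version before rebuilding. A minimal sketch; the property key
"hadoop-core.version" and the file location are assumptions from Pig 0.9's
build layout, so verify them against your checkout:

```shell
#!/bin/sh
# pin_hadoop_version: rewrite the Hadoop version pin in an ivy
# properties file. The key name is an assumption; check
# ivy/libraries.properties in your Pig checkout for the actual key.
pin_hadoop_version() {
  props=$1
  version=$2
  sed -i.bak "s/^hadoop-core\.version=.*/hadoop-core.version=$version/" "$props"
}
# Usage against a real checkout (paths and ant target hypothetical):
#   pin_hadoop_version /home/callen/pig/ivy/libraries.properties 0.20.203.0
#   cd /home/callen/pig && ant clean jar-withouthadoop
```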

On Wed, Aug 17, 2011 at 11:45 AM, Dmitriy Ryaboy <dv...@gmail.com> wrote:
> Chris, I didn't get a chance to pull down 203 and see what's going on
> yesterday.
> Something to consider is to try building pig with 203 to begin with (change
> the dependencies in ivy), and seeing if that works.  It's a bit
> mind-boggling that it doesn't work for you, as this is essentially the
> combination that Pig's biggest user (Yahoo) is running.

Re: Cannot connect to pseudo-distributed or full (EC2) hadoop cluster from pig on startup, fails on EOFException

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Chris, I didn't get a chance to pull down 203 and see what's going on
yesterday.
Something to consider: build Pig against 203 to begin with (change the
dependencies in ivy) and see if that works. It's a bit mind-boggling
that it doesn't work for you, as this is essentially the combination
that Pig's biggest user (Yahoo) runs.

On Wed, Aug 17, 2011 at 11:32 AM, Chris Allen <chris.allen@affinesystems.com
> wrote:

> I actually got it working by using Apache 20.2, I couldn't get it
> working with 203 *at all*.

Re: Cannot connect to pseudo-distributed or full (EC2) hadoop cluster from pig on startup, fails on EOFException

Posted by Chris Allen <ch...@affinesystems.com>.
I actually got it working by using Apache 20.2; I couldn't get it
working with 203 *at all*.

On Wed, Aug 17, 2011 at 10:09 AM, Thejas Nair <th...@hortonworks.com> wrote:
> Do you get the same exception (localhost:9000) when you try to connect to
> hadoop on EC2 as well ?
> Are you able to run any MR jobs (eg the wordcount example) on your cluster ?
> It is always better to use withouthadoop.jar and add the hadoop.jar version
> used in the cluster in the classpath.
>
> -Thejas

Re: Cannot connect to pseudo-distributed or full (EC2) hadoop cluster from pig on startup, fails on EOFException

Posted by Thejas Nair <th...@hortonworks.com>.
Do you get the same exception (localhost:9000) when you try to connect
to hadoop on EC2 as well?
Are you able to run any MR jobs (e.g. the wordcount example) on your
cluster?
It is always better to use withouthadoop.jar and add the hadoop jar of
the version used in the cluster to the classpath.

-Thejas
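
The version-matching point above can be checked mechanically before
digging into classpath permutations. A small sketch; the
hadoop-core-<version>.jar naming convention is taken from the paths in
this thread:

```shell
#!/bin/sh
# jar_hadoop_version: extract the Hadoop version from a hadoop-core jar
# filename (hadoop-core-<version>.jar, as seen on the classpaths above).
jar_hadoop_version() {
  basename "$1" | sed -n 's/^hadoop-core-\(.*\)\.jar$/\1/p'
}
# Example (hypothetical paths): the jar on Pig's classpath is 0.20.203.0
# while the EC2 cluster runs CDH 0.20.2 -- an IPC version mismatch like
# that is a classic cause of an EOFException during the RPC handshake.
v=$(jar_hadoop_version /home/callen/hadoop/hadoop-core-0.20.203.0.jar)
echo "classpath hadoop version: $v"
```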

