Posted to user@pig.apache.org by Jeff Dalton <je...@gmail.com> on 2010/01/10 00:22:29 UTC

PiggyBank and Pig 0.6 Problem

A cluster I'm using was recently upgraded to Pig 0.6.  Since then,
I've been having problems with scripts that use PiggyBank functions.
All of the script's map tasks fail with:
WARN org.apache.hadoop.mapred.Child: Error running child
java.lang.ClassCastException: org.apache.pig.ExecType cannot be cast
to org.apache.pig.impl.PigContext
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper.readFields(SliceWrapper.java:168)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:333)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
	at org.apache.hadoop.mapred.Child.main(Child.java:159)
INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task

I compiled the PiggyBank jar using the latest code from SVN (as of Jan
9) and Pig 0.6.  Below is a simple example program that triggers the
error; it just reads a text file of words and lowercases them.

REGISTER ./piggybank.jar;
DEFINE ToLower org.apache.pig.piggybank.evaluation.string.LOWER();
words = LOAD './data/headwords_sample' USING PigStorage() AS (word:chararray);
lowerCaseWords = FOREACH words GENERATE ToLower(word) AS word;
STORE lowerCaseWords INTO './tmp/cooc3' USING PigStorage();
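One way to take the cluster out of the picture would be to run the same
script in local mode; a sketch follows, where lowercase.pig is a
placeholder name for the script above:

```shell
pig -version                  # confirm which Pig build the client actually picks up
pig -x local lowercase.pig    # run the script in local mode, with no cluster involved
```

If the script succeeds locally but fails on the cluster, that points at a
mismatch between the client-side jars and the cluster's Pig rather than at
the script itself.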

The Hadoop error isn't very informative about what is going on.  Am I
using a compatible version of PiggyBank?  What should I be doing
differently?

Thanks,

- Jeff

Re: PiggyBank and Pig 0.6 Problem

Posted by Jeff Dalton <je...@gmail.com>.
The problem was my fault: I had pulled the wrong PiggyBank version out
of the repository. I got it working with the correct (0.6 branch)
version this morning.

Thanks for taking a look for me.
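
In case it helps anyone else, the fix amounts to roughly the following
(the SVN URL reflects Pig's pre-TLP location under hadoop, and the ant
targets are from memory of the PiggyBank build docs; double-check both):

```shell
# Check out the 0.6 branch (not trunk) and build PiggyBank against it
svn checkout http://svn.apache.org/repos/asf/hadoop/pig/branches/branch-0.6 pig-0.6
cd pig-0.6
ant jar                       # builds pig.jar, which PiggyBank compiles against
cd contrib/piggybank/java
ant                           # builds piggybank.jar here; deploy this jar
```

The key point is that piggybank.jar must come from the same branch as the
Pig running on the cluster.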

On Sun, Jan 10, 2010 at 6:06 PM, Dmitriy Ryaboy <dv...@gmail.com> wrote:

> Jeff, I am unable to reproduce this error using pig compiled from the
> current top of the 0.6 branch and the script you provided. Are you
> sure 0.6 is what you are actually using? It hasn't been released yet.
> Do you know what svn revision the jar was compiled from?
>
> -D

Re: PiggyBank and Pig 0.6 Problem

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Jeff, I am unable to reproduce this error using pig compiled from the
current top of the 0.6 branch and the script you provided. Are you
sure 0.6 is what you are actually using? It hasn't been released yet.
Do you know what svn revision the jar was compiled from?

-D

On Sat, Jan 9, 2010 at 5:07 PM, Dmitriy Ryaboy <dv...@gmail.com> wrote:
> Jeff,
> I'll check it out this weekend.
>
> -D

Re: PiggyBank and Pig 0.6 Problem

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Jeff,
I'll check it out this weekend.

-D

On Sat, Jan 9, 2010 at 3:47 PM, Jeff Dalton <je...@gmail.com> wrote:
> I downloaded the version of PiggyBank from the 0.6 branch, compiled,
> and deployed it.  However, I still get the same error message:
>
> java.lang.ClassCastException: org.apache.pig.ExecType cannot be cast
> to org.apache.pig.impl.PigContext
>        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper.readFields(SliceWrapper.java:168)
>        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:333)
>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>        at org.apache.hadoop.mapred.Child.main(Child.java:159)
>
> I'll try again later, but if anyone has any insights, I would
> appreciate the help.
>
> Thanks,
>
> - Jeff

Re: PiggyBank and Pig 0.6 Problem

Posted by Jeff Dalton <je...@gmail.com>.
I downloaded the version of PiggyBank from the 0.6 branch, compiled,
and deployed it.  However, I still get the same error message:

java.lang.ClassCastException: org.apache.pig.ExecType cannot be cast
to org.apache.pig.impl.PigContext
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper.readFields(SliceWrapper.java:168)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:333)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
	at org.apache.hadoop.mapred.Child.main(Child.java:159)

I'll try again later, but if anyone has any insights, I would
appreciate the help.

Thanks,

- Jeff

On Sat, Jan 9, 2010 at 6:33 PM, Jeff Dalton <je...@gmail.com> wrote:
> Ahh, the PiggyBank version was the latest from Trunk.  I probably need
> to go track down the version from the 0.6 branch.

Re: PiggyBank and Pig 0.6 Problem

Posted by Jeff Dalton <je...@gmail.com>.
Ahh, the PiggyBank version was the latest from trunk.  I probably need
to go track down the version from the 0.6 branch.

On Sat, Jan 9, 2010 at 6:26 PM, Dmitriy Ryaboy <dv...@gmail.com> wrote:
> When you say that the code is from SVN, do you mean trunk, or the 0.6 branch?

Re: PiggyBank and Pig 0.6 Problem

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
When you say that the code is from SVN, do you mean trunk, or the 0.6 branch?

