Posted to user@pig.apache.org by Uttam Kumar <uk...@gmail.com> on 2013/12/02 20:26:07 UTC
Error running PIG 12 with Hadoop 0.23.1.
Hi All,
I am trying to run PIG 12 with Hadoop 0.23.1 and am getting the following error
message. Can someone please help and suggest what I am missing? I can run PIG
in local mode without any issue, which executes with the inbuilt Hadoop. I have
recompiled PIG against hadoop 23, but so far no luck in resolving this issue.
grunt> A = load 'NYSE_dividends' as (exch, symb, dt, div);
grunt> dump A ;
...
...
2013-11-26 13:20:35,975 [JobControl] ERROR org.apache.pig.backend.hadoop23.PigJobControl - Error while trying to run jobs.
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
        at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:130)
        at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
        at java.lang.Thread.run(Thread.java:619)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
        ... 3 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.fs.FileSystem.getDefaultBlockSize(Lorg/apache/hadoop/fs/Path;)J
        at org.apache.pig.backend.hadoop.executionengine.shims.HadoopShims.getDefaultBlockSize(HadoopShims.java:108)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:277)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:451)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:468)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:360)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1221)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1218)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1218)
        at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:336)
        ... 8 more
2013-11-26 13:20:35,976 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_1385489868206_0001
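[Editor's note] The root cause is the NoSuchMethodError above: Pig's Hadoop-23 shim calls FileSystem.getDefaultBlockSize with the JVM descriptor (Lorg/apache/hadoop/fs/Path;)J, i.e. a method taking a Path and returning long, and the hadoop-common on the 0.23.1 classpath apparently does not yet provide that overload. As an illustrative sketch (not part of Pig or Hadoop), such descriptors can be decoded mechanically:

```python
# Illustrative helper: decode a JVM method descriptor like the one in
# the NoSuchMethodError above into a readable return type + parameters.
def decode_descriptor(desc):
    """'(Lorg/apache/hadoop/fs/Path;)J' -> ('long', ['org.apache.hadoop.fs.Path'])"""
    prim = {'B': 'byte', 'C': 'char', 'D': 'double', 'F': 'float',
            'I': 'int', 'J': 'long', 'S': 'short', 'Z': 'boolean', 'V': 'void'}

    def read_type(s, i):
        dims = 0
        while s[i] == '[':          # leading '[' marks an array dimension
            dims += 1
            i += 1
        if s[i] == 'L':             # object type: L<internal/name>;
            end = s.index(';', i)
            name = s[i + 1:end].replace('/', '.')
            i = end + 1
        else:                       # single-letter primitive type
            name = prim[s[i]]
            i += 1
        return name + '[]' * dims, i

    assert desc[0] == '('
    params, i = [], 1
    while desc[i] != ')':           # parameter types between '(' and ')'
        t, i = read_type(desc, i)
        params.append(t)
    ret, _ = read_type(desc, i + 1)  # return type follows ')'
    return ret, params

ret, params = decode_descriptor('(Lorg/apache/hadoop/fs/Path;)J')
print(ret, params)  # long ['org.apache.hadoop.fs.Path']
```

So the missing method reads as `long FileSystem.getDefaultBlockSize(Path)`, which fits the resolution later in this thread: the same script works once a newer 0.23.x release is on the classpath.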
--
Regards,
Uttam Kumar
Re: Error running PIG 12 with Hadoop 0.23.1.
Posted by Uttam Kumar <uk...@gmail.com>.
Hi Rohini,
Thanks for the suggestion. I tried PIG 12 with Hadoop 0.23.9 and it works.
Still, does PIG 12 work with Hadoop 0.23.1? Can I replace the lib and
jar files for the mapred and yarn folders from 23.9 into 23.1 to resolve the
issue?
I noticed the yarn and mapred jar files are in separate folders under version
23.9, but in version 23.1 they are under a combined mapred folder.
Regards,
-Uttam
On Fri, Dec 6, 2013 at 3:25 PM, Rohini Palaniswamy
<ro...@gmail.com> wrote:
> Can you try with Hadoop 0.23.8 or 0.23.9?
>
> -Rohini
Re: Error running PIG 12 with Hadoop 0.23.1.
Posted by Rohini Palaniswamy <ro...@gmail.com>.
Can you try with Hadoop 0.23.8 or 0.23.9?
-Rohini