Posted to user@pig.apache.org by praveenesh kumar <pr...@gmail.com> on 2011/07/04 07:38:05 UTC
How to make Pig work with hadoop-0.20-append
Hello people,
I am new to Pig. Currently I am using Hadoop and HBase together, and since
hadoop-0.20-append is the branch that supports HBase in production, I am
using the hadoop-0.20-append jar files.
Now I would like to use Pig against this 0.20-append version.
I am trying Pig 0.8, but it does not seem to work: whenever I run Pig in
map-reduce mode, it fails with ERROR 2999.
Here is the output of my log file.
hadoop@ub13:/usr/local/pig/bin$ pig
2011-07-01 17:41:52,150 [main] INFO org.apache.pig.Main - Logging error
messages to: /usr/local/pig/bin/pig_1309522312144.log
2011-07-01 17:41:52,454 [main] INFO
org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting
to hadoop file system at: hdfs://ub13:54310
2011-07-01 17:41:52,654 [main] ERROR org.apache.pig.Main - ERROR 2999:
Unexpected internal error. Failed to create DataStorage
LOG MESSAGE -----
Error before Pig is launched---------------------------
ERROR 2999: Unexpected internal error. Failed to create DataStorage
java.lang.RuntimeException: Failed to create DataStorage
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
    at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
    at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
    at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
    at org.apache.pig.PigServer.<init>(PigServer.java:226)
    at org.apache.pig.PigServer.<init>(PigServer.java:215)
    at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
    at org.apache.pig.Main.run(Main.java:452)
    at org.apache.pig.Main.main(Main.java:107)
Caused by: org.apache.hadoop.ipc.RPC$VersionMismatch: Protocol org.apache.hadoop.hdfs.protocol.ClientProtocol version mismatch. (client = 41, server = 43)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
    ... 9 more
================================================================================
I guess the problem is a version mismatch between the hadoop-append core jar
files that my Hadoop/HBase cluster is currently using and the hadoop-core
jar files that Pig is using. Has anyone faced a similar issue?
The documentation website lists hadoop-0.20.2 as the requirement for Pig 0.8,
but I want to use Pig together with my existing Hadoop and HBase.
Any suggestions on how to resolve this issue?
Re: How to make Pig work with hadoop-0.20-append
Posted by Ashutosh Chauhan <ha...@apache.org>.
This looks like something that needs to be fixed in bin/pig. Jameson, feel
free to raise a JIRA for it.
Ashutosh
On Tue, Jul 5, 2011 at 19:12, Jameson Li <ho...@gmail.com> wrote:
Re: How to make Pig work with hadoop-0.20-append
Posted by Jameson Li <ho...@gmail.com>.
http://thedatachef.blogspot.com/2011/01/apache-pig-08-with-cloudera-cdh3.html
with one small difference: change
"for jar in $HADOOP_HOME/hadoop-core-*.jar $HADOOP_HOME/lib/* ; do"
to
"for jar in $HADOOP_HOME/hadoop*core*.jar $HADOOP_HOME/lib/* ; do"
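The effect of the widened glob can be checked in isolation. The sketch below builds a throwaway directory holding an append-style core jar (the jar name comes from this thread; the directory is just a temp dir, not a real HADOOP_HOME) and shows that only the second pattern matches it:

```shell
# Fake HADOOP_HOME containing an append-branch core jar.
HADOOP_HOME=$(mktemp -d)
touch "$HADOOP_HOME/hadoop-0.20-append-for-hbase-core.jar"

# Pattern from the stock bin/pig: requires a literal "hadoop-core-" prefix,
# so it misses hadoop-0.20-append-for-hbase-core.jar entirely.
old_matches=""
for jar in "$HADOOP_HOME"/hadoop-core-*.jar; do
  if [ -e "$jar" ]; then old_matches="$old_matches $jar"; fi
done

# Widened pattern from the fix: "hadoop", then anything, then "core",
# which also catches the append-branch jar name.
new_matches=""
for jar in "$HADOOP_HOME"/hadoop*core*.jar; do
  if [ -e "$jar" ]; then new_matches="$new_matches $jar"; fi
done

echo "old glob:$old_matches"
echo "new glob:$new_matches"
```

The `[ -e "$jar" ]` guard matters because an unmatched glob is left unexpanded by the shell, so the loop body would otherwise run once with the literal pattern string.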
2011/7/4 Dmitriy Ryaboy <dv...@gmail.com>
Re: How to make Pig work with hadoop-0.20-append
Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Use the jar built with "ant jar-withouthadoop" and make sure your
HADOOP_HOME is set so that the production hadoop jars get picked up.
Check the classpath Pig will actually use by running "pig -secretDebugCmd".
D
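Putting Dmitriy's steps together, a rough sequence looks like the following; the /usr/local paths are illustrative assumptions, not verified install locations:

```shell
cd /usr/local/pig                # the Pig 0.8 source tree (assumed path)
ant jar-withouthadoop            # build a Pig jar with no bundled hadoop-core

export HADOOP_HOME=/usr/local/hadoop   # the 0.20-append install (assumed path)

# Print the classpath Pig will use, to confirm the append jars are picked up:
bin/pig -secretDebugCmd
```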
On Sun, Jul 3, 2011 at 11:08 PM, praveenesh kumar <pr...@gmail.com> wrote:
Re: How to make Pig work with hadoop-0.20-append
Posted by praveenesh kumar <pr...@gmail.com>.
I tried doing that, but I get the same error.
Here is the listing of the folder after replacing the jar files:
hadoop@ub13:/usr/local/pig/build/ivy/lib/Pig$ ls
ant-1.6.5.jar
commons-cli-1.2.jar
commons-codec-1.3.jar
commons-el-1.0.jar
commons-httpclient-3.0.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
commons-net-1.4.1.jar
core-3.1.1.jar
ftplet-api-1.0.0.jar
ftpserver-core-1.0.0.jar
ftpserver-deprecated-1.0.0-M2.jar
guava-r06.jar
hadoop-0.20-append-for-hbase-core.jar
hadoop-0.20-append-for-hbase-test.jar
hbase-0.90.0.jar
hbase-0.90.0-tests.jar
hsqldb-1.8.0.10.jar
jackson-core-asl-1.0.1.jar
jackson-mapper-asl-1.0.1.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
javacc-4.2.jar
javacc.jar
jets3t-0.7.1.jar
jetty-6.1.14.jar
jetty-util-6.1.14.jar
jline-0.9.94.jar
joda-time-1.6.jar
jsch-0.1.38.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
junit-4.5.jar
jython-2.5.0.jar
kfs-0.3.jar
log4j-1.2.14.jar
mina-core-2.0.0-M5.jar
oro-2.0.8.jar
servlet-api-2.5-6.1.14.jar
slf4j-api-1.5.2.jar
slf4j-log4j12-1.4.3.jar
xmlenc-0.52.jar
zookeeper-3.3.3.jar
This is after replacing the hadoop-0.20-append-for-hbase jar files.
Still it's not working :-(
Thanks,
Praveenesh
On Mon, Jul 4, 2011 at 11:27 AM, Daniel Dai <da...@hortonworks.com> wrote:
Re: How to make Pig work with hadoop-0.20-append
Posted by Daniel Dai <da...@hortonworks.com>.
One way to make it work is to replace
build/ivy/lib/Pig/hadoop-core-0.20.2.jar with
build/hadoop-core-0.20.3-SNAPSHOT.jar from hadoop append.
Daniel
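As a sandboxed illustration of the swap Daniel describes (all paths here are fabricated temp directories that merely mirror the layout mentioned above, not real checkouts):

```shell
# Scratch layout imitating a Pig source tree and a hadoop-append build.
sandbox=$(mktemp -d)
mkdir -p "$sandbox/pig/build/ivy/lib/Pig" "$sandbox/append/build"
touch "$sandbox/pig/build/ivy/lib/Pig/hadoop-core-0.20.2.jar"
touch "$sandbox/append/build/hadoop-core-0.20.3-SNAPSHOT.jar"

# The swap itself: remove the stock jar ivy pulled in, copy in the append build.
rm "$sandbox/pig/build/ivy/lib/Pig/hadoop-core-0.20.2.jar"
cp "$sandbox/append/build/hadoop-core-0.20.3-SNAPSHOT.jar" \
   "$sandbox/pig/build/ivy/lib/Pig/"

ls "$sandbox/pig/build/ivy/lib/Pig"
```

In a real tree, rebuilding Pig with ant after the swap is presumably needed so that the bundled pig.jar picks up the replacement rather than the jar compiled against 0.20.2.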