Posted to user@pig.apache.org by 李运田 <cu...@163.com> on 2015/02/27 03:58:01 UTC

use hcatalog in eclipse/pig

I want to use HCatalog in Eclipse to work with tables in Hive,
but I can't store into a Hive table:
pigServer.registerQuery("tmp = load 'pig' using org.apache.hcatalog.pig.HCatLoader();");
pigServer.registerQuery("tmp = foreach tmp generate id;");
pigServer.registerQuery("store tmp into 'hive' using org.apache.hcatalog.pig.HCatStorer();");
I can store into a file:
pigServer.registerQuery("a = LOAD '/user/hadoop/pig.txt' ;");
pigServer.store("a", "/user/hadoop/pig1.txt");
pigServer.registerQuery("store a into '/user/hadoop/pig2.txt';");
Perhaps the HCatalog jars are wrong?
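
[Editor's note: for reference, a minimal, self-contained sketch of the kind of PigServer setup implied above. The class name, jar paths, and metastore URI are assumptions for illustration, not details taken from this thread.]

import java.util.Properties;
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class PigHCatExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // HCatLoader/HCatStorer need to reach the Hive metastore (assumed URI).
        props.setProperty("hive.metastore.uris", "thrift://metastore-host:9083");

        PigServer pigServer = new PigServer(ExecType.MAPREDUCE, props);

        // Register the HCatalog/Hive jars so they are shipped to the backend
        // MapReduce tasks (example paths; adjust to the local installation).
        pigServer.registerJar("/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar");
        pigServer.registerJar("/usr/lib/hive/lib/hive-metastore.jar");
        pigServer.registerJar("/usr/lib/hive/lib/hive-exec.jar");

        pigServer.registerQuery("tmp = load 'pig' using org.apache.hcatalog.pig.HCatLoader();");
        pigServer.registerQuery("tmp = foreach tmp generate id;");
        pigServer.registerQuery("store tmp into 'hive' using org.apache.hcatalog.pig.HCatStorer();");
    }
}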

Re:Re:Re: use hcatalog in eclipse/pig

Posted by 李运田 <cu...@163.com>.
2015-03-03 19:54:43,072 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for application appattempt_1421806758143_0252_000001
2015-03-03 19:54:43,623 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2015-03-03 19:54:43,650 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2015-03-03 19:54:43,778 WARN [main] org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-03-03 19:54:43,798 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
2015-03-03 19:54:43,798 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: YARN_AM_RM_TOKEN, Service: , Ident: (org.apache.hadoop.yarn.security.AMRMTokenIdentifier@307c9a57)
2015-03-03 19:54:43,833 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: The specific max attempts: 2 for application: 252. Attempt num: 1 is last retry: false
2015-03-03 19:54:43,841 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred newApiCommitter.
2015-03-03 19:54:43,985 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2015-03-03 19:54:43,999 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2015-03-03 19:54:44,535 WARN [main] org.apache.hadoop.hdfs.BlockReaderLocal: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2015-03-03 19:54:44,719 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in config null
2015-03-03 19:54:44,844 INFO [main] org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.io.IOException: Deserialization error: org.apache.hcatalog.data.schema.HCatSchema
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.io.IOException: Deserialization error: org.apache.hcatalog.data.schema.HCatSchema
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:473)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:374)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.run(MRAppMaster.java:1456)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1453)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1386)
Caused by: java.io.IOException: Deserialization error: org.apache.hcatalog.data.schema.HCatSchema
	at org.apache.pig.impl.util.ObjectSerializer.deserialize(ObjectSerializer.java:59)
	at org.apache.pig.impl.util.UDFContext.deserialize(UDFContext.java:192)
	at org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil.setupUDFContext(MapRedUtil.java:173)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:229)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getOutputCommitter(PigOutputFormat.java:275)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:471)
	... 8 more
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.data.schema.HCatSchema
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.Class.forName0(Native Method)
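
[Editor's note: the final ClassNotFoundException is the actual root cause: org.apache.hcatalog.data.schema.HCatSchema is visible to the Eclipse client but not on the classpath of the MapReduce ApplicationMaster, so the backend cannot deserialize the HCatStorer schema. One way to ship the jars with the submitted job is Pig's pig.additional.jars property; the sketch below continues the setup shown after the first message, and the paths are only examples that must match the local Hive/HCatalog installation.]

Properties props = new Properties();
props.setProperty("hive.metastore.uris", "thrift://metastore-host:9083");
// Colon-separated list of extra jars Pig ships with every job it submits.
props.setProperty("pig.additional.jars",
        "/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar:"
      + "/usr/lib/hive/lib/hive-metastore.jar:"
      + "/usr/lib/hive/lib/hive-exec.jar:"
      + "/usr/lib/hive/lib/libfb303.jar");
PigServer pigServer = new PigServer(ExecType.MAPREDUCE, props);

[Calling pigServer.registerJar(...) for each jar, as in the earlier sketch, should have the same effect of making the classes available to the backend tasks.]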







On 2015-02-28 09:37:29, "李运田" <cu...@163.com> wrote:

pigServer.registerQuery("tmp = load 'pig' using org.apache.hcatalog.pig.HCatLoader();");
pigServer.registerQuery("tmp = foreach tmp generate id;");
When I execute this, I don't get any error,
but when I execute
pigServer.registerQuery("store tmp into 'hive' using org.apache.hcatalog.pig.HCatStorer();");
I get an error like this:
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias tmp
 at org.apache.pig.PigServer.openIterator(PigServer.java:880)
 at org.gradle.PigHiveHCat.run(PigHiveHCat.java:68)
 at org.gradle.PigHiveHCat.main(PigHiveHCat.java:28)
Caused by: java.io.IOException: Job terminated with anomalous status FAILED
 at org.apache.pig.PigServer.openIterator(PigServer.java:872)
 ... 2 more
...........................................................................................................
pigServer.registerQuery("a = LOAD '/user/hadoop/pig.txt' ;");
pigServer.store("a", "/user/hadoop/pig1.txt");
pigServer.registerQuery("store a into '/user/hadoop/pig2.txt';");
This is OK, so I think something about HCatalog is wrong.


At 2015-02-28 01:21:12, "Alan Gates" <al...@gmail.com> wrote:
What error message are you getting?

Alan.


李运田
February 26, 2015 at 18:58
I want to use HCatalog in Eclipse to work with tables in Hive,
but I can't store into a Hive table:
pigServer.registerQuery("tmp = load 'pig' using org.apache.hcatalog.pig.HCatLoader();");
pigServer.registerQuery("tmp = foreach tmp generate id;");
pigServer.registerQuery("store tmp into 'hive' using org.apache.hcatalog.pig.HCatStorer();");
I can store into a file:
pigServer.registerQuery("a = LOAD '/user/hadoop/pig.txt' ;");
pigServer.store("a", "/user/hadoop/pig1.txt");
pigServer.registerQuery("store a into '/user/hadoop/pig2.txt';");
Perhaps the HCatalog jars are wrong?



Re:Re: use hcatalog in eclipse/pig

Posted by 李运田 <cu...@163.com>.
pigServer.registerQuery("tmp = load 'pig' using org.apache.hcatalog.pig.HCatLoader();");
pigServer.registerQuery("tmp = foreach tmp generate id;");
When I execute this, I don't get any error,
but when I execute
pigServer.registerQuery("store tmp into 'hive' using org.apache.hcatalog.pig.HCatStorer();");
I get an error like this:
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias tmp
 at org.apache.pig.PigServer.openIterator(PigServer.java:880)
 at org.gradle.PigHiveHCat.run(PigHiveHCat.java:68)
 at org.gradle.PigHiveHCat.main(PigHiveHCat.java:28)
Caused by: java.io.IOException: Job terminated with anomalous status FAILED
 at org.apache.pig.PigServer.openIterator(PigServer.java:872)
 ... 2 more
...........................................................................................................
pigServer.registerQuery("a = LOAD '/user/hadoop/pig.txt' ;");
pigServer.store("a", "/user/hadoop/pig1.txt");
pigServer.registerQuery("store a into '/user/hadoop/pig2.txt';");
This is OK, so I think something about HCatalog is wrong.
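
[Editor's note: because the plain HDFS store succeeds while the HCatStorer store fails, the ERROR 1066 message is only the front-end wrapper; the real failure happens in the backend MapReduce job. Below is a small sketch of surfacing the backend status and exception from the ExecJob that PigServer.store returns; it assumes the standard org.apache.pig ExecJob API and is illustrative only.]

import java.io.IOException;
import org.apache.pig.PigServer;
import org.apache.pig.backend.executionengine.ExecJob;

public class StoreDiagnostics {
    // Runs the store and rethrows the backend exception instead of the
    // generic "Unable to open iterator for alias" wrapper.
    static void storeToHCat(PigServer pigServer) throws IOException {
        ExecJob job = pigServer.store("tmp", "hive", "org.apache.hcatalog.pig.HCatStorer()");
        if (job.getStatus() == ExecJob.JOB_STATUS.FAILED) {
            throw new IOException("HCatStorer job failed", job.getException());
        }
    }
}

[The full backend stack trace also appears in the MapReduce job logs on the cluster, as in the ApplicationMaster log pasted earlier in this thread.]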


At 2015-02-28 01:21:12, "Alan Gates" <al...@gmail.com> wrote:
What error message are you getting?

Alan.


李运田
February 26, 2015 at 18:58
I want to use HCatalog in Eclipse to work with tables in Hive,
but I can't store into a Hive table:
pigServer.registerQuery("tmp = load 'pig' using org.apache.hcatalog.pig.HCatLoader();");
pigServer.registerQuery("tmp = foreach tmp generate id;");
pigServer.registerQuery("store tmp into 'hive' using org.apache.hcatalog.pig.HCatStorer();");
I can store into a file:
pigServer.registerQuery("a = LOAD '/user/hadoop/pig.txt' ;");
pigServer.store("a", "/user/hadoop/pig1.txt");
pigServer.registerQuery("store a into '/user/hadoop/pig2.txt';");
Perhaps the HCatalog jars are wrong?

Re: use hcatalog in eclipse/pig

Posted by Alan Gates <al...@gmail.com>.
What error message are you getting?

Alan.

> 李运田 <ma...@163.com>
> February 26, 2015 at 18:58
> I want to use HCatalog in Eclipse to work with tables in Hive,
> but I can't store into a Hive table:
> pigServer.registerQuery("tmp = load 'pig' using org.apache.hcatalog.pig.HCatLoader();");
> pigServer.registerQuery("tmp = foreach tmp generate id;");
> pigServer.registerQuery("store tmp into 'hive' using org.apache.hcatalog.pig.HCatStorer();");
> I can store into a file:
> pigServer.registerQuery("a = LOAD '/user/hadoop/pig.txt' ;");
> pigServer.store("a", "/user/hadoop/pig1.txt");
> pigServer.registerQuery("store a into '/user/hadoop/pig2.txt';");
> Perhaps the HCatalog jars are wrong?