Posted to user@avro.apache.org by felix gao <gr...@gmail.com> on 2011/01/24 22:11:36 UTC
java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Hi Guys,
I am testing out Avro in our cluster and am getting java.lang.NoSuchMethodError:
org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
when running a simple Pig script.
After taking a look at AVRO-493, I uploaded the new Jackson jar to replace
CDH2's Jackson 1.0.1 jars.
The Pig script looks like this:
REGISTER /home/pig/jars/avro-1.4.1.jar
REGISTER /home/pig/jars/json_simple-1.1.jar
REGISTER /home/pig/jars/piggybank.jar
REGISTER /usr/lib/hadoop/lib/jackson-core-asl-1.5.5.jar
REGISTER /usr/lib/hadoop/lib/jackson-mapper-asl-1.5.5.jar
log_load = LOAD '/user/felix/avro_input/*.avro' USING org.apache.pig.piggybank.storage.avro.AvroStorage();
dump log_load;
The Jackson jars were copied to the master and each slave node.
The full stacktrace:
ERROR 2998: Unhandled internal error. org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
at org.apache.avro.Schema.<clinit>(Schema.java:82)
at org.apache.pig.piggybank.storage.avro.ASCommons.<clinit>(ASCommons.java:44)
at org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:177)
at org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:133)
at org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:108)
at org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:233)
at org.apache.pig.impl.logicalLayer.LOLoad.determineSchema(LOLoad.java:169)
at org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:150)
at org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:843)
at org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1164)
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1114)
at org.apache.pig.PigServer.registerQuery(PigServer.java:425)
at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:737)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:162)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:138)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
at org.apache.pig.Main.main(Main.java:357)
I am wondering whether I need to restart the TaskTracker and JobTracker for the
Jackson jars to be picked up?
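As a side note, one quick way to check which copy of a class actually wins on the classpath is to ask the JVM where the class was loaded from. This is a generic sketch, not from the thread; in a real Pig session you would pass org.codehaus.jackson.JsonFactory.class instead of the demo class used here (Jackson is not on this snippet's classpath, so the demo inspects itself).

```java
import java.security.CodeSource;

public class WhereLoaded {
    // Returns the jar or directory a class was loaded from, or a marker for
    // bootstrap classes (which have no code source).
    public static String locationOf(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap classloader" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // In a real debugging session, substitute the class you suspect is
        // being resolved from the wrong jar, e.g. JsonFactory.class.
        System.out.println(locationOf(WhereLoaded.class));
    }
}
```

Printing this location from inside a job is often faster than auditing the full `-classpath` listing by hand.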
Re: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Posted by Scott Carey <sc...@richrelevance.com>.
That looks like a bug in org.apache.pig.piggybank.storage.avro.AvroStorage():
it is not using Hadoop's wildcard/glob matching. You'll want to file a bug against that (it is part of piggybank, not Avro).
If the schema is being read dynamically from the files, it needs to use Hadoop's globbing to resolve one of the matching files and inspect its schema, rather than taking the passed-in string literally.
At this point, you'll need help from those who contributed that code to piggybank. The related Pig JIRA is:
https://issues.apache.org/jira/browse/PIG-1748
But that may not be the best place for this usability question.
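The globbing step described above can be sketched with plain java.nio on a local filesystem; this is only an analogy (Hadoop's equivalent is FileSystem.globStatus(new Path(pattern))), and the class and file names here are illustrative, not piggybank code.

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class GlobDemo {
    // Resolve a glob pattern to concrete files first, instead of handing the
    // literal "*.avro" string to open() -- the bug described in the thread.
    static List<Path> glob(Path dir, String pattern) throws IOException {
        List<Path> matches = new ArrayList<>();
        try (DirectoryStream<Path> ds = Files.newDirectoryStream(dir, pattern)) {
            for (Path p : ds) matches.add(p);
        }
        return matches;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("avro-demo");
        Files.createFile(dir.resolve("a.avro"));
        Files.createFile(dir.resolve("b.avro"));
        Files.createFile(dir.resolve("notes.txt"));
        // The schema reader would then open matches.get(0) rather than the
        // raw pattern string.
        List<Path> avro = glob(dir, "*.avro");
        System.out.println("matched " + avro.size() + " files"); // matched 2 files
    }
}
```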
On 1/24/11 6:14 PM, "felix gao" <gr...@gmail.com> wrote:
Thanks for the info. I have not compiled a new version of Pig, and it works when I load a single Avro file, but it fails when I use wildcard filename matching.
log_load = LOAD '/user/felix/avro/access_log.test.avro' USING org.apache.pig.piggybank.storage.avro.AvroStorage(); <--- works fine
but
log_load = LOAD '/user/felix/avro/*.avro' USING org.apache.pig.piggybank.storage.avro.AvroStorage();
ERROR 1018: Problem determining schema during load
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1000: Error during parsing. Problem determining schema during load
at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1342)
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1286)
at org.apache.pig.PigServer.registerQuery(PigServer.java:460)
at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:738)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:163)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:139)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:89)
at org.apache.pig.Main.main(Main.java:414)
Caused by: org.apache.pig.impl.logicalLayer.parser.ParseException: Problem determining schema during load
at org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:752)
at org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1336)
... 8 more
Caused by: org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1018: Problem determining schema during load
at org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:156)
at org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:750)
... 10 more
Caused by: java.io.FileNotFoundException: File does not exist: /user/felix/avro/*.avro
at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.openInfo(DFSClient.java:1586)
at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.<init>(DFSClient.java:1577)
at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:428)
at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:185)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:431)
at org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:181)
at org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:133)
at org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:108)
at org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:233)
at org.apache.pig.impl.logicalLayer.LOLoad.determineSchema(LOLoad.java:169)
at org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:150)
... 11 more
How do I load multiple Avro files with the load function?
Felix
On Mon, Jan 24, 2011 at 4:25 PM, Scott Carey <sc...@richrelevance.com> wrote:
A jar that comes before the Jackson jar on the classpath contains an older version of Jackson inside it.
Pig's jar typically bundles all of its dependencies (there is a separate 'pig-withouthadoop.jar' that does not).
So my guess is that one of these (look at a listing of each jar's contents) has Jackson in it:
/usr/lib/pig/bin/../pig-0.7.0+16-core.jar
/usr/lib/pig/bin/../pig-0.7.0+16.jar
/usr/lib/pig/bin/../build/pig-*-core.jar:
/usr/lib/pig/bin/../build/ivy/lib/Pig/*.jar
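Checking "a listing of jar contents" for a bundled Jackson can be done programmatically with java.util.zip; this is a runnable sketch where a tiny throwaway zip stands in for the suspect Pig jar (the JarScan name and the fake jar are illustrative, not part of the thread).

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class JarScan {
    // A jar bundles its own Jackson if it contains the JsonFactory class entry.
    static boolean bundlesJackson(Path jar) throws IOException {
        try (ZipFile zf = new ZipFile(jar.toFile())) {
            return zf.getEntry("org/codehaus/jackson/JsonFactory.class") != null;
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a fake jar containing the telltale entry so the snippet is
        // self-contained; in practice, point this at the pig-*.jar paths above.
        Path jar = Files.createTempFile("fake-pig", ".jar");
        try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(jar))) {
            out.putNextEntry(new ZipEntry("org/codehaus/jackson/JsonFactory.class"));
            out.closeEntry();
        }
        System.out.println(jar + " bundles Jackson: " + bundlesJackson(jar));
    }
}
```

Running the same check over each jar on the `-classpath` line below would pinpoint which one ships the old Jackson 1.0.x classes.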
On 1/24/11 4:06 PM, "felix gao" <gr...@gmail.com> wrote:
Here is the actual process running the Pig script; I hope this helps.
root 20820 19838 66 18:57 pts/0 00:00:00 /usr/java/default/bin/java -Xmx1000m -Djava.library.path=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64 -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl -Dpig.log.dir=/usr/lib/pig/bin/../logs -Dpig.log.file=pig.log -Dpig.home.dir=/usr/lib/pig/bin/.. -Dpig.root.logger=INFO,console,DRFA -classpath /usr/lib/pig/bin/../conf:/usr/java/default/lib/tools.jar:/usr/lib/pig/bin/../pig-0.7.0+16-core.jar:/usr/lib/pig/bin/../pig-0.7.0+16.jar:/usr/lib/pig/bin/../build/pig-*-core.jar:/usr/lib/pig/bin/../lib/automaton.jar:/usr/lib/pig/bin/../lib/hbase-0.20.0-test.jar:/usr/lib/pig/bin/../lib/hbase-0.20.0.jar:/usr/lib/pig/bin/../lib/zookeeper-hbase-1329.jar:/usr/lib/pig/bin/../build/ivy/lib/Pig/*.jar:/usr/lib/hadoop-0.20/hadoop-core-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo-0.4.6.jar:/usr/lib/hadoop-0.20/lib/hadoop-thriftfs-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.LICENSE.txt:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jdiff:/usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/lib/jetty-6.1.14.jar:/usr/lib/hadoop-0.20/lib/jetty-util-6.1.14.jar:
/usr/lib/hadoop-0.20/lib/jsp-2.1:/usr/lib/hadoop-0.20/lib/junit-4.5.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.LICENSE.txt:/usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/lib/mysql-connector-java-5.0.8-bin.jar:/usr/lib/hadoop-0.20/lib/native:/usr/lib/hadoop-0.20/lib/native_libs.tar:/usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo.0.4.4.jar:/usr/lib/hadoop-0.20/conf::/home/felix/hadoop-lzo.jar:/home/felix/elephant-bird.jar:/home/felix/elephant-bird/lib/* org.apache.pig.Main avro.pig
pig -secretDebugCmd
dry run:
/usr/java/default/bin/java -Xmx1000m -Djava.library.path=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64 -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl -Dpig.log.dir=/usr/lib/pig/bin/../logs -Dpig.log.file=pig.log -Dpig.home.dir=/usr/lib/pig/bin/.. -Dpig.root.logger=INFO,console,DRFA -classpath /usr/lib/pig/bin/../conf:/usr/java/default/lib/tools.jar:/usr/lib/pig/bin/../pig-0.7.0+16-core.jar:/usr/lib/pig/bin/../pig-0.7.0+16.jar:/usr/lib/pig/bin/../build/pig-*-core.jar:/usr/lib/pig/bin/../lib/automaton.jar:/usr/lib/pig/bin/../lib/hbase-0.20.0-test.jar:/usr/lib/pig/bin/../lib/hbase-0.20.0.jar:/usr/lib/pig/bin/../lib/zookeeper-hbase-1329.jar:/usr/lib/pig/bin/../build/ivy/lib/Pig/*.jar:/usr/lib/hadoop-0.20/hadoop-core-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo-0.4.6.jar:/usr/lib/hadoop-0.20/lib/hadoop-thriftfs-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.LICENSE.txt:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jdiff:/usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/lib/jetty-6.1.14.jar:/usr/lib/hadoop-0.20/lib/jetty-util-6.1.14.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1:/usr/lib
/hadoop-0.20/lib/junit-4.5.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.LICENSE.txt:/usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/lib/mysql-connector-java-5.0.8-bin.jar:/usr/lib/hadoop-0.20/lib/native:/usr/lib/hadoop-0.20/lib/native_libs.tar:/usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo.0.4.4.jar:/usr/lib/hadoop-0.20/conf::/home/felix/hadoop-lzo.jar:/home/felix/elephant-bird.jar:/home/felix/elephant-bird/lib/* org.apache.pig.Main
It seems Jackson 1.5.5 is on the classpath of Pig as well as of the TaskTracker and the actual job.
Felix
On Mon, Jan 24, 2011 at 3:44 PM, Tatu Saloranta <ts...@gmail.com> wrote:
On Mon, Jan 24, 2011 at 3:13 PM, Scott Carey <sc...@richrelevance.com> wrote:
> That is confusing. Can you capture the classpath of an actual task process,
> not just the TT? They shouldn't differ much, but it is worth checking.
> Jackson 1.3 (or was it 1.2?) and above have all been backwards compatible
> with each other I believe. And the error you are getting is definitely
> caused by accessing the enable() methods that were added after 1.0.1.
> I can change the Avro dependency on Jackson to 1.5.5, 1.7.1, or 1.3, and
> unit tests pass. If I change it to 1.2, 1.1, or 1.0.1 they break.
Just in case anyone is interested: this is due to a change in 1.3.0, which changed
the return type of the configuration methods from 'void' to ObjectMapper, to allow
fluent-style chaining of configuration. This is a source-compatible but,
unfortunately, binary-incompatible change. On the plus side, it is the only known
such problem, which makes it easier to recognize.
-+ Tatu +-
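The source-compatible-but-binary-incompatible hazard Tatu describes can be illustrated with two stand-in classes (OldFactory/NewFactory are hypothetical simplifications of Jackson's pre-1.3 and post-1.3 JsonFactory): the JVM resolves a call by name and full descriptor, including the return type, so code compiled against one signature fails with NoSuchMethodError against the other even though the same source compiles against both.

```java
import java.lang.reflect.Method;

public class DescriptorDemo {
    // Pre-1.3 style: configuration method returns void.
    static class OldFactory { public void enable(String feature) {} }
    // Post-1.3 style: same source call site, but returns the factory itself
    // to allow fluent chaining -- a different JVM method descriptor.
    static class NewFactory { public NewFactory enable(String feature) { return this; } }

    public static void main(String[] args) throws Exception {
        Method oldM = OldFactory.class.getMethod("enable", String.class);
        Method newM = NewFactory.class.getMethod("enable", String.class);
        // invokevirtual resolves by name AND descriptor (including return
        // type), so at link time these are two unrelated methods:
        System.out.println("old return: " + oldM.getReturnType()); // void
        System.out.println("new return: " + newM.getReturnType());
    }
}
```

This is why a class compiled against Jackson 1.3+ throws java.lang.NoSuchMethodError with the full descriptor (as in the subject line) when an older Jackson jar wins on the classpath.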
org.apache.pig.Main avro.pig
pig -secretDebugCmd
dry run:
/usr/java/default/bin/java -Xmx1000m
-Djava.library.path=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64
-Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl
-Dpig.log.dir=/usr/lib/pig/bin/../logs -Dpig.log.file=pig.log
-Dpig.home.dir=/usr/lib/pig/bin/.. -Dpig.root.logger=INFO,console,DRFA
-classpath
/usr/lib/pig/bin/../conf:/usr/java/default/lib/tools.jar:/usr/lib/pig/bin/../pig-0.7.0+16-core.jar:/usr/lib/pig/bin/../pig-0.7.0+16.jar:/usr/lib/pig/bin/../build/pig-*-core.jar:/usr/lib/pig/bin/../lib/automaton.jar:/usr/lib/pig/bin/../lib/hbase-0.20.0-test.jar:/usr/lib/pig/bin/../lib/hbase-0.20.0.jar:/usr/lib/pig/bin/../lib/zookeeper-hbase-1329.jar:/usr/lib/pig/bin/../build/ivy/lib/Pig/*.jar:/usr/lib/hadoop-0.20/hadoop-core-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo-0.4.6.jar:/usr/lib/hadoop-0.20/lib/hadoop-thriftfs-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.LICENSE.txt:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar
:/usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jdiff:/usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/lib/jetty-6.1.14.jar:/usr/lib/hadoop-0.20/lib/jetty-util-6.1.14.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1:/usr/lib/hadoop-0.20/lib/junit-4.5.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.LICENSE.txt:/usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/lib/mysql-connector-java-5.0.8-bin.jar:/usr/lib/hadoop-0.20/lib/native:/usr/lib/hadoop-0.20/lib/native_libs.tar:/usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo.0.4.4.jar:/usr/lib/hadoop-0.20/conf::/home/felix/hadoop-lzo.jar:/home/felix/elephant-bird.jar:/home/felix/elephant-bird/lib/*
org.apache.pig.Main
It seems Jackson 1.5.5 is on the classpath of Pig as well as the
TaskTracker and the actual job.
Felix
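Since both classpath strings list the 1.5.5 jars, one way to confirm which copy of `JsonFactory` actually wins at runtime is to ask the JVM where it loaded the class from. The `WhichJar` helper below is a hypothetical diagnostic sketch, not part of Pig or Avro:

```java
import java.security.CodeSource;

// Hypothetical diagnostic: print where the JVM actually loaded a class from.
public class WhichJar {

    // Returns the code-source URL of a class, or "bootstrap" for core JDK
    // classes, whose protection domain has no code source.
    static String locationOf(Class<?> cls) {
        CodeSource cs = cls.getProtectionDomain().getCodeSource();
        return cs == null ? "bootstrap" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // In the failing job one would check the class from the stack trace:
        //   System.out.println(locationOf(Class.forName("org.codehaus.jackson.JsonFactory")));
        // which prints the path of whichever jackson-core jar came first on
        // the effective classpath. Demonstrated here on this class itself:
        System.out.println(locationOf(WhichJar.class));
    }
}
```

If that prints a path ending in jackson-core-asl-1.0.1.jar, a stale jar is still shadowing the 1.5.5 one somewhere.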
On Mon, Jan 24, 2011 at 3:44 PM, Tatu Saloranta <ts...@gmail.com> wrote:
> On Mon, Jan 24, 2011 at 3:13 PM, Scott Carey <sc...@richrelevance.com>
> wrote:
> > That is confusing. Can you capture the classpath of an actual task
> process,
> > not just the TT? They shouldn't differ much, but it is worth checking.
> > Jackson 1.3 (or was it 1.2?) and above have all been backwards compatible
> > with each other I believe. And the error you are getting is definitely
> > caused by accessing the enable() methods that were added after 1.0.1.
> > I can change the Avro dependency on Jackson to 1.5.5, 1.7.1, or 1.3, and
> > unit tests pass. If I change it to 1.2, 1.1, or 1.0.1 they break.
>
> Just in case anyone is interested, this is due to a change in 1.3.0,
> which changed the return type of the configuration method from 'void' to
> ObjectMapper, to allow fluent-style chaining of configuration. This is a
> source-compatible, but unfortunately binary-incompatible, change. On the
> plus side, it is the only known such problem, which makes it easier to
> recognize.
>
> -+ Tatu +-
>
Re: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Posted by Tatu Saloranta <ts...@gmail.com>.
On Mon, Jan 24, 2011 at 3:13 PM, Scott Carey <sc...@richrelevance.com> wrote:
> That is confusing. Can you capture the classpath of an actual task process,
> not just the TT? They shouldn't differ much, but it is worth checking.
> Jackson 1.3 (or was it 1.2?) and above have all been backwards compatible
> with each other I believe. And the error you are getting is definitely
> caused by accessing the enable() methods that were added after 1.0.1.
> I can change the Avro dependency on Jackson to 1.5.5, 1.7.1, or 1.3, and
> unit tests pass. If I change it to 1.2, 1.1, or 1.0.1 they break.
Just in case anyone is interested, this is due to a change in 1.3.0,
which changed the return type of the configuration method from 'void' to
ObjectMapper, to allow fluent-style chaining of configuration. This is a
source-compatible, but unfortunately binary-incompatible, change. On the
plus side, it is the only known such problem, which makes it easier to
recognize.
-+ Tatu +-
Re: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Posted by Scott Carey <sc...@richrelevance.com>.
That is confusing. Can you capture the classpath of an actual task process, not just the TT? They shouldn't differ much, but it is worth checking.
Jackson 1.3 (or was it 1.2?) and above have all been backwards compatible with each other I believe. And the error you are getting is definitely caused by accessing the enable() methods that were added after 1.0.1.
I can change the Avro dependency on Jackson to 1.5.5, 1.7.1, or 1.3, and unit tests pass. If I change it to 1.2, 1.1, or 1.0.1 they break.
-Scott
On 1/24/11 2:40 PM, "felix gao" <gr...@gmail.com> wrote:
Sorry for the spam, but I forgot to include the TaskTracker's process information.
root 20368 1 0 17:31 pts/0 00:00:00 su mapred -s /usr/java/default/bin/java -- -Dproc_tasktracker -Xmx1000m -Dhadoop.log.dir=/usr/lib/hadoop-0.20/logs -Dhadoop.log.file=hadoop-hadoop-tasktracker-ip-10-212-86-214.log -Dhadoop.home.dir=/usr/lib/hadoop-0.20 -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,DRFA -Djava.library.path=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64:/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64 -Dhadoop.policy.file=hadoop-policy.xml -classpath /etc/hadoop-0.20/conf:/usr/java/default/lib/tools.jar:/usr/lib/hadoop-0.20:/usr/lib/hadoop-0.20/hadoop-core-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo-0.4.6.jar:/usr/lib/hadoop-0.20/lib/hadoop-thriftfs-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/lib/jetty-6.1.14.jar:/usr/lib/hadoop-0.20/lib/jetty-util-6.1.14.jar:/usr/lib/hadoop-0.20/lib/junit-4.5.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/lib/mysql-connector-java-5.0.8-bin.jar:/usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/had
oop-0.20/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo.0.4.4.jar org.apache.hadoop.mapred.TaskTracker
As you can see, the correct version of the Jackson library is in the classpath.
On Mon, Jan 24, 2011 at 2:09 PM, felix gao <gr...@gmail.com> wrote:
Chase,
I tried to run it on my local box with a standalone version of Hadoop installed and I still got the same error.
/usr/local/pig-0.7.0/bin/pig avro.pig
11/01/24 14:01:29 INFO pig.Main: Logging error messages to: /Users/felix/Documents/pig/pig_1295906489023.log
2011-01-24 14:01:29,328 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
2011-01-24 14:01:30,277 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
2011-01-24 14:01:30,537 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
Any ideas why?
Thanks,
Felix
On Mon, Jan 24, 2011 at 1:18 PM, Chase Bradford <ch...@gmail.com> wrote:
Yes, you will need to restart them. The child tasks inherit the TT's
classpath, which will list only the 1.0.1 jars until you restart the
daemon.
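Since the child tasks simply inherit the daemon's classpath, a quick way to verify what a given JVM inherited is to dump its effective classpath. This is a hedged sketch (the `ClasspathDump` class is hypothetical, not from the thread); the same two lines could be dropped into any task-side code to log what a child actually sees:

```java
import java.io.File;

// Hypothetical snippet: dump the effective classpath of the running JVM,
// one entry per line, so stray jackson jars are easy to spot in task logs.
public class ClasspathDump {
    public static void main(String[] args) {
        String cp = System.getProperty("java.class.path");
        for (String entry : cp.split(File.pathSeparator)) {
            System.out.println(entry);
        }
    }
}
```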
On Mon, Jan 24, 2011 at 1:11 PM, felix gao <gr...@gmail.com> wrote:
> Hi Guys,
>
> I am testing out Avro in our cluster and am getting java.lang.NoSuchMethodError:
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> when running a simple Pig script.
> After taking a look at AVRO-493, I uploaded the new Jackson jars to replace
> CDH2's Jackson 1.0.1 jars.
>
>
> The Pig script looks like the one below:
> REGISTER /home/pig/jars/avro-1.4.1.jar
> REGISTER /home/pig/jars/json_simple-1.1.jar
> REGISTER /home/pig/jars/piggybank.jar
> REGISTER /usr/lib/hadoop/lib/jackson-core-asl-1.5.5.jar
> REGISTER /usr/lib/hadoop/lib/jackson-mapper-asl-1.5.5.jar
>
> log_load = LOAD '/user/felix/avro_input/*.avro' USING
> org.apache.pig.piggybank.storage.avro.AvroStorage() ;
>
> dump log_load
>
>
> The Jackson jars were copied to each slave and the master.
>
> The full stacktrace:
> ERROR 2998: Unhandled internal error.
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>
> java.lang.NoSuchMethodError:
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> at org.apache.avro.Schema.<clinit>(Schema.java:82)
> at
> org.apache.pig.piggybank.storage.avro.ASCommons.<clinit>(ASCommons.java:44)
> at
> org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:177)
> at
> org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:133)
> at
> org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:108)
> at
> org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:233)
> at
> org.apache.pig.impl.logicalLayer.LOLoad.determineSchema(LOLoad.java:169)
> at
> org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:150)
> at
> org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:843)
> at
> org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
> at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1164)
> at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1114)
> at org.apache.pig.PigServer.registerQuery(PigServer.java:425)
> at
> org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:737)
> at
> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
> at
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:162)
> at
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:138)
> at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
> at org.apache.pig.Main.main(Main.java:357)
>
> I am wondering if I need to restart tasktracker and jobtracker in order for
> the jackson jars to be picked up?
>
>
>
>
>
>
>
>
--
Chase Bradford
“If in physics there's something you don't understand, you can always
hide behind the uncharted depths of nature. But if your program
doesn't work, there is no obstinate nature. If it doesn't work, you've
messed up.”
- Edsger Dijkstra
Re: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Posted by felix gao <gr...@gmail.com>.
Sorry for the spam, but I forgot to include the TaskTracker's process
information.
root 20368 1 0 17:31 pts/0 00:00:00 su mapred -s
/usr/java/default/bin/java -- -Dproc_tasktracker -Xmx1000m
-Dhadoop.log.dir=/usr/lib/hadoop-0.20/logs
-Dhadoop.log.file=hadoop-hadoop-tasktracker-ip-10-212-86-214.log
-Dhadoop.home.dir=/usr/lib/hadoop-0.20 -Dhadoop.id.str=hadoop
-Dhadoop.root.logger=INFO,DRFA
-Djava.library.path=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64:/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64
-Dhadoop.policy.file=hadoop-policy.xml -classpath
/etc/hadoop-0.20/conf:/usr/java/default/lib/tools.jar:/usr/lib/hadoop-0.20:/usr/lib/hadoop-0.20/hadoop-core-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo-0.4.6.jar:/usr/lib/hadoop-0.20/lib/hadoop-thriftfs-0.20.2+737.jar:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar:
/usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/lib/jetty-6.1.14.jar:/usr/lib/hadoop-0.20/lib/jetty-util-6.1.14.jar:/usr/lib/hadoop-0.20/lib/junit-4.5.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/lib/mysql-connector-java-5.0.8-bin.jar:/usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo.0.4.4.jar
org.apache.hadoop.mapred.TaskTracker
As you can see, the correct version of the Jackson library is in the classpath.
On Mon, Jan 24, 2011 at 2:09 PM, felix gao <gr...@gmail.com> wrote:
> Chase,
>
> I tried to run it on my local box with a standalone version of Hadoop
> installed and I still got the same error.
> /usr/local/pig-0.7.0/bin/pig avro.pig
> 11/01/24 14:01:29 INFO pig.Main: Logging error messages to:
> /Users/felix/Documents/pig/pig_1295906489023.log
> 2011-01-24 14:01:29,328 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting
> to hadoop file system at: file:///
> 2011-01-24 14:01:30,277 [main] INFO
> org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with
> processName=JobTracker, sessionId=
> 2011-01-24 14:01:30,537 [main] ERROR org.apache.pig.tools.grunt.Grunt -
> ERROR 2998: Unhandled internal error.
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>
> Any ideas why?
>
> Thanks,
>
> Felix
>
>
>
>
> On Mon, Jan 24, 2011 at 1:18 PM, Chase Bradford <ch...@gmail.com> wrote:
>
>> Yes, you will need to restart them. The child tasks inherit the TT's
>> classpath, which will list only the 1.0.1 jars until you restart the
>> daemon.
>>
>> On Mon, Jan 24, 2011 at 1:11 PM, felix gao <gr...@gmail.com> wrote:
>> > Hi Guys,
>> >
>> > I am testing out Avro in our cluster and am getting java.lang.NoSuchMethodError:
>> > org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>> > when running a simple Pig script.
>> > After taking a look at AVRO-493, I uploaded the new Jackson jars to replace
>> > CDH2's Jackson 1.0.1 jars.
>> >
>> >
>> > The Pig script looks like the one below:
>> > REGISTER /home/pig/jars/avro-1.4.1.jar
>> > REGISTER /home/pig/jars/json_simple-1.1.jar
>> > REGISTER /home/pig/jars/piggybank.jar
>> > REGISTER /usr/lib/hadoop/lib/jackson-core-asl-1.5.5.jar
>> > REGISTER /usr/lib/hadoop/lib/jackson-mapper-asl-1.5.5.jar
>> >
>> > log_load = LOAD '/user/felix/avro_input/*.avro' USING
>> > org.apache.pig.piggybank.storage.avro.AvroStorage() ;
>> >
>> > dump log_load
>> >
>> >
>> > The Jackson jars were copied to each slave and the master.
>> >
>> > The full stacktrace:
>> > ERROR 2998: Unhandled internal error.
>> >
>> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>> >
>> > java.lang.NoSuchMethodError:
>> >
>> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>> > at org.apache.avro.Schema.<clinit>(Schema.java:82)
>> > at
>> >
>> org.apache.pig.piggybank.storage.avro.ASCommons.<clinit>(ASCommons.java:44)
>> > at
>> >
>> org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:177)
>> > at
>> >
>> org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:133)
>> > at
>> >
>> org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:108)
>> > at
>> >
>> org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:233)
>> > at
>> > org.apache.pig.impl.logicalLayer.LOLoad.determineSchema(LOLoad.java:169)
>> > at
>> > org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:150)
>> > at
>> >
>> org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:843)
>> > at
>> >
>> org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
>> > at
>> org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1164)
>> > at
>> org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1114)
>> > at org.apache.pig.PigServer.registerQuery(PigServer.java:425)
>> > at
>> > org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:737)
>> > at
>> >
>> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
>> > at
>> >
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:162)
>> > at
>> >
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:138)
>> > at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
>> > at org.apache.pig.Main.main(Main.java:357)
>> >
>> > I am wondering if I need to restart tasktracker and jobtracker in order
>> for
>> > the jackson jars to be picked up?
>> >
>> >
>> >
>> >
>> >
>> >
>> >
>> >
>>
>>
>>
>> --
>> Chase Bradford
>>
>>
>> “If in physics there's something you don't understand, you can always
>> hide behind the uncharted depths of nature. But if your program
>> doesn't work, there is no obstinate nature. If it doesn't work, you've
>> messed up.”
>>
>> - Edsger Dijkstra
>>
>
>
Re: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Posted by Chase Bradford <ch...@gmail.com>.
Does the pig client have the right classpath?
Sent from my phone
On Jan 24, 2011, at 2:09 PM, felix gao <gr...@gmail.com> wrote:
> Chase,
>
> I tried to run it on my local box with a standalone version of Hadoop installed and I still got the same error.
> /usr/local/pig-0.7.0/bin/pig avro.pig
> 11/01/24 14:01:29 INFO pig.Main: Logging error messages to: /Users/felix/Documents/pig/pig_1295906489023.log
> 2011-01-24 14:01:29,328 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
> 2011-01-24 14:01:30,277 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
> 2011-01-24 14:01:30,537 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>
> Any ideas why?
>
> Thanks,
>
> Felix
>
>
>
> On Mon, Jan 24, 2011 at 1:18 PM, Chase Bradford <ch...@gmail.com> wrote:
> Yes, you will need to restart them. The child tasks inherit the TT's
> classpath, which will list only the 1.0.1 jars until you restart the
> daemon.
>
> On Mon, Jan 24, 2011 at 1:11 PM, felix gao <gr...@gmail.com> wrote:
> > Hi Guys,
> >
> > I am testing out Avro in our cluster and am getting java.lang.NoSuchMethodError:
> > org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> > when running a simple Pig script.
> > After taking a look at AVRO-493, I uploaded the new Jackson jars to replace
> > CDH2's Jackson 1.0.1 jars.
> >
> >
> > The Pig script looks like the one below:
> > REGISTER /home/pig/jars/avro-1.4.1.jar
> > REGISTER /home/pig/jars/json_simple-1.1.jar
> > REGISTER /home/pig/jars/piggybank.jar
> > REGISTER /usr/lib/hadoop/lib/jackson-core-asl-1.5.5.jar
> > REGISTER /usr/lib/hadoop/lib/jackson-mapper-asl-1.5.5.jar
> >
> > log_load = LOAD '/user/felix/avro_input/*.avro' USING
> > org.apache.pig.piggybank.storage.avro.AvroStorage() ;
> >
> > dump log_load
> >
> >
> > The Jackson jars were copied to each slave and the master.
> >
> > The full stacktrace:
> > ERROR 2998: Unhandled internal error.
> > org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> >
> > java.lang.NoSuchMethodError:
> > org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> > at org.apache.avro.Schema.<clinit>(Schema.java:82)
> > at
> > org.apache.pig.piggybank.storage.avro.ASCommons.<clinit>(ASCommons.java:44)
> > at
> > org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:177)
> > at
> > org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:133)
> > at
> > org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:108)
> > at
> > org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:233)
> > at
> > org.apache.pig.impl.logicalLayer.LOLoad.determineSchema(LOLoad.java:169)
> > at
> > org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:150)
> > at
> > org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:843)
> > at
> > org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
> > at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1164)
> > at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1114)
> > at org.apache.pig.PigServer.registerQuery(PigServer.java:425)
> > at
> > org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:737)
> > at
> > org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
> > at
> > org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:162)
> > at
> > org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:138)
> > at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
> > at org.apache.pig.Main.main(Main.java:357)
> >
> > I am wondering if I need to restart tasktracker and jobtracker in order for
> > the jackson jars to be picked up?
> >
> >
> >
> >
> >
> >
> >
> >
>
>
>
> --
> Chase Bradford
>
>
> “If in physics there's something you don't understand, you can always
> hide behind the uncharted depths of nature. But if your program
> doesn't work, there is no obstinate nature. If it doesn't work, you've
> messed up.”
>
> - Edsger Dijkstra
>
Re: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Posted by David Rosenstrauch <da...@darose.net>.
I solved this issue by ssh'ing into each node on the cluster, renaming
each jackson.jar (e.g., to jackson.jar.disabled) and then restarting the
daemons on the node. This is a total hack, obviously - and also only
feasible if you have a small cluster.
HTH,
DR
On 01/24/2011 05:09 PM, felix gao wrote:
> Chase,
>
> I tried to run it on my local box with a standalone version of Hadoop
> installed and I still got the same error.
> /usr/local/pig-0.7.0/bin/pig avro.pig
> 11/01/24 14:01:29 INFO pig.Main: Logging error messages to:
> /Users/felix/Documents/pig/pig_1295906489023.log
> 2011-01-24 14:01:29,328 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting
> to hadoop file system at: file:///
> 2011-01-24 14:01:30,277 [main] INFO
> org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with
> processName=JobTracker, sessionId=
> 2011-01-24 14:01:30,537 [main] ERROR org.apache.pig.tools.grunt.Grunt -
> ERROR 2998: Unhandled internal error.
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>
> Any ideas why?
>
> Thanks,
>
> Felix
>
>
>
> On Mon, Jan 24, 2011 at 1:18 PM, Chase Bradford <ch...@gmail.com> wrote:
>
>> Yes, you will need to restart them. The child tasks inherit the TT's
>> classpath, which will list only the 1.0.1 jars until you restart the
>> daemon.
>>
>> On Mon, Jan 24, 2011 at 1:11 PM, felix gao <gr...@gmail.com> wrote:
>>> Hi Guys,
>>>
>>> I am testing out Avro in our cluster and am getting java.lang.NoSuchMethodError:
>>> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>>> when running a simple Pig script.
>>> After taking a look at AVRO-493, I uploaded the new Jackson jars to replace
>>> CDH2's Jackson 1.0.1 jars.
>>>
>>>
>>> The Pig script looks like the one below:
>>> REGISTER /home/pig/jars/avro-1.4.1.jar
>>> REGISTER /home/pig/jars/json_simple-1.1.jar
>>> REGISTER /home/pig/jars/piggybank.jar
>>> REGISTER /usr/lib/hadoop/lib/jackson-core-asl-1.5.5.jar
>>> REGISTER /usr/lib/hadoop/lib/jackson-mapper-asl-1.5.5.jar
>>>
>>> log_load = LOAD '/user/felix/avro_input/*.avro' USING
>>> org.apache.pig.piggybank.storage.avro.AvroStorage() ;
>>>
>>> dump log_load
>>>
>>>
>>> The Jackson jars were copied to each slave and the master.
>>>
>>> The full stacktrace:
>>> ERROR 2998: Unhandled internal error.
>>>
>> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>>>
>>> java.lang.NoSuchMethodError:
>>>
>> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>>> at org.apache.avro.Schema.<clinit>(Schema.java:82)
>>> at
>>>
>> org.apache.pig.piggybank.storage.avro.ASCommons.<clinit>(ASCommons.java:44)
>>> at
>>>
>> org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:177)
>>> at
>>>
>> org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:133)
>>> at
>>>
>> org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:108)
>>> at
>>>
>> org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:233)
>>> at
>>> org.apache.pig.impl.logicalLayer.LOLoad.determineSchema(LOLoad.java:169)
>>> at
>>> org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:150)
>>> at
>>>
>> org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:843)
>>> at
>>>
>> org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
>>> at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1164)
>>> at
>> org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1114)
>>> at org.apache.pig.PigServer.registerQuery(PigServer.java:425)
>>> at
>>> org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:737)
>>> at
>>>
>> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
>>> at
>>>
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:162)
>>> at
>>>
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:138)
>>> at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
>>> at org.apache.pig.Main.main(Main.java:357)
>>>
>>> I am wondering if I need to restart tasktracker and jobtracker in order
>> for
>>> the jackson jars to be picked up?
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>> --
>> Chase Bradford
>>
>>
>> “If in physics there's something you don't understand, you can always
>> hide behind the uncharted depths of nature. But if your program
>> doesn't work, there is no obstinate nature. If it doesn't work, you've
>> messed up.”
>>
>> - Edsger Dijkstra
>>
>
Re: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Posted by felix gao <gr...@gmail.com>.
Chase,
I tried to run it on my local box with a standalone version of Hadoop
installed and I still got the same error.
/usr/local/pig-0.7.0/bin/pig avro.pig
11/01/24 14:01:29 INFO pig.Main: Logging error messages to:
/Users/felix/Documents/pig/pig_1295906489023.log
2011-01-24 14:01:29,328 [main] INFO
org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting
to hadoop file system at: file:///
2011-01-24 14:01:30,277 [main] INFO
org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with
processName=JobTracker, sessionId=
2011-01-24 14:01:30,537 [main] ERROR org.apache.pig.tools.grunt.Grunt -
ERROR 2998: Unhandled internal error.
org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
Any ideas why?
Thanks,
Felix
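
A quick way to narrow down errors like this is to list every copy of Jackson the runtime could possibly load; a stale 1.0.x jar anywhere ahead of the 1.5.5 jars on the classpath produces exactly this NoSuchMethodError. A minimal sketch (the directory paths are examples only; point it at your own Pig and Hadoop lib directories):

```shell
#!/bin/sh
# List every jackson jar under the given directories so stale 1.0.x
# copies can be spotted. The default paths below are illustrative.
find_jackson_jars() {
    for dir in "$@"; do
        if [ -d "$dir" ]; then
            find "$dir" -name 'jackson-*.jar'
        fi
    done | sort
}

find_jackson_jars /usr/lib/hadoop/lib /usr/local/pig-0.7.0/lib "$HOME/pig/jars"
```

Any 1.0.x jar this turns up should be removed or replaced, not merely supplemented with a 1.5.5 copy.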
On Mon, Jan 24, 2011 at 1:18 PM, Chase Bradford <ch...@gmail.com>wrote:
> Yes, you will need to restart them. The child tasks inherit the TT's
> classpath, which will list only the 1.0.1 jars until you restart the
> daemon.
>
> On Mon, Jan 24, 2011 at 1:11 PM, felix gao <gr...@gmail.com> wrote:
> > Hi Guys,
> >
> > I am testing out Avro in our cluster and hitting java.lang.NoSuchMethodError:
> > org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> > when running a simple Pig script. After taking a look at AVRO-493 I uploaded the
> > new jackson jars to replace CDH2's jackson 1.0.1 jars.
> >
> > The Pig script looks like this:
> >
> > REGISTER /home/pig/jars/avro-1.4.1.jar
> > REGISTER /home/pig/jars/json_simple-1.1.jar
> > REGISTER /home/pig/jars/piggybank.jar
> > REGISTER /usr/lib/hadoop/lib/jackson-core-asl-1.5.5.jar
> > REGISTER /usr/lib/hadoop/lib/jackson-mapper-asl-1.5.5.jar
> >
> > log_load = LOAD '/user/felix/avro_input/*.avro' USING
> >     org.apache.pig.piggybank.storage.avro.AvroStorage() ;
> >
> > dump log_load
> >
> > The jackson jars were copied to every slave and the master.
> >
> > The full stack trace:
> >
> > ERROR 2998: Unhandled internal error.
> > org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> >
> > java.lang.NoSuchMethodError:
> > org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> >         at org.apache.avro.Schema.<clinit>(Schema.java:82)
> >         at org.apache.pig.piggybank.storage.avro.ASCommons.<clinit>(ASCommons.java:44)
> >         at org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:177)
> >         at org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:133)
> >         at org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:108)
> >         at org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:233)
> >         at org.apache.pig.impl.logicalLayer.LOLoad.determineSchema(LOLoad.java:169)
> >         at org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:150)
> >         at org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:843)
> >         at org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
> >         at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1164)
> >         at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1114)
> >         at org.apache.pig.PigServer.registerQuery(PigServer.java:425)
> >         at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:737)
> >         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
> >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:162)
> >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:138)
> >         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
> >         at org.apache.pig.Main.main(Main.java:357)
> >
> > I am wondering if I need to restart the tasktracker and jobtracker in order for
> > the jackson jars to be picked up?
> >
>
>
>
> --
> Chase Bradford
>
>
> “If in physics there's something you don't understand, you can always
> hide behind the uncharted depths of nature. But if your program
> doesn't work, there is no obstinate nature. If it doesn't work, you've
> messed up.”
>
> - Edsger Dijkstra
>
Re: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable
Posted by Chase Bradford <ch...@gmail.com>.
Yes, you will need to restart them. The child tasks inherit the TaskTracker's
classpath, which will keep listing the 1.0.1 jars until you restart the
daemon.
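
The mechanics behind this, for anyone hitting it later: the JVM resolves a class from the first classpath entry that supplies it, so jars REGISTERed by a Pig script cannot override a jackson 1.0.1 jar that already sits earlier on the TaskTracker's classpath. A small sketch of the "first entry wins" rule (the paths are illustrative, not from a real install):

```shell
#!/bin/sh
# Print the first classpath entry matching a jar-name pattern -- i.e. the
# copy the JVM would actually load classes from. Paths are examples only.
first_on_classpath() {
    pattern=$1
    classpath=$2
    echo "$classpath" | tr ':' '\n' | grep -- "$pattern" | head -n 1
}

# A classpath listing a stale 1.0.1 jar ahead of the 1.5.5 one:
tt_cp="/usr/lib/hadoop/lib/jackson-core-asl-1.0.1.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.5.jar"
first_on_classpath 'jackson-core' "$tt_cp"
# -> /usr/lib/hadoop/lib/jackson-core-asl-1.0.1.jar (the stale jar wins)
```

This is why swapping the jar files on disk is only half the fix: the already-running daemon built its classpath at startup and must be restarted to rebuild it.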
On Mon, Jan 24, 2011 at 1:11 PM, felix gao <gr...@gmail.com> wrote:
> Hi Guys,
>
> I am testing out Avro in our cluster and hitting java.lang.NoSuchMethodError:
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> when running a simple Pig script. After taking a look at AVRO-493 I uploaded the
> new jackson jars to replace CDH2's jackson 1.0.1 jars.
>
> The Pig script looks like this:
>
> REGISTER /home/pig/jars/avro-1.4.1.jar
> REGISTER /home/pig/jars/json_simple-1.1.jar
> REGISTER /home/pig/jars/piggybank.jar
> REGISTER /usr/lib/hadoop/lib/jackson-core-asl-1.5.5.jar
> REGISTER /usr/lib/hadoop/lib/jackson-mapper-asl-1.5.5.jar
>
> log_load = LOAD '/user/felix/avro_input/*.avro' USING
>     org.apache.pig.piggybank.storage.avro.AvroStorage() ;
>
> dump log_load
>
> The jackson jars were copied to every slave and the master.
>
> The full stack trace:
>
> ERROR 2998: Unhandled internal error.
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>
> java.lang.NoSuchMethodError:
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>         at org.apache.avro.Schema.<clinit>(Schema.java:82)
>         at org.apache.pig.piggybank.storage.avro.ASCommons.<clinit>(ASCommons.java:44)
>         at org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:177)
>         at org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:133)
>         at org.apache.pig.piggybank.storage.avro.AvroStorage.getAvroSchema(AvroStorage.java:108)
>         at org.apache.pig.piggybank.storage.avro.AvroStorage.getSchema(AvroStorage.java:233)
>         at org.apache.pig.impl.logicalLayer.LOLoad.determineSchema(LOLoad.java:169)
>         at org.apache.pig.impl.logicalLayer.LOLoad.getSchema(LOLoad.java:150)
>         at org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:843)
>         at org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
>         at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1164)
>         at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1114)
>         at org.apache.pig.PigServer.registerQuery(PigServer.java:425)
>         at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:737)
>         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
>         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:162)
>         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:138)
>         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
>         at org.apache.pig.Main.main(Main.java:357)
>
> I am wondering if I need to restart the tasktracker and jobtracker in order for
> the jackson jars to be picked up?
>
--
Chase Bradford
“If in physics there's something you don't understand, you can always
hide behind the uncharted depths of nature. But if your program
doesn't work, there is no obstinate nature. If it doesn't work, you've
messed up.”
- Edsger Dijkstra