Posted to user@pig.apache.org by Jonathan Coveney <jc...@gmail.com> on 2011/05/09 23:35:42 UTC

Running pig without hadoop gives antlr error?

I am not sure if this is a bug or by design, so I thought I would ping the
listserv.

I want to use the trunk version of Pig with my own Hadoop, so I built it with
ant jar-withouthadoop per Dmitriy's useful pointer, and then tried running Pig
by pointing it at pig-withouthadoop.jar and my hadoop jar. However, a basic
"load file, limit 1, dump" script failed ("unable to open iterator"). Checking
the mapper logs, I got this error:

Error: java.lang.ClassNotFoundException: org.antlr.runtime.tree.Tree
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
	at org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:425)
	at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:455)
	at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POStore.getStoreFunc(POStore.java:229)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputCommitter.getCommitters(PigOutputCommitter.java:85)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputCommitter.<init>(PigOutputCommitter.java:68)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getOutputCommitter(PigOutputFormat.java:278)
	at org.apache.hadoop.mapred.Task.initialize(Task.java:488)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:359)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
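
For reference, the failing script was along these lines (a sketch; the file
name is a placeholder, not the actual input I used):

    A = LOAD 'input.txt';
    B = LIMIT A 1;
    DUMP B;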

I tried manually adding
/build/ivy/lib/Pig/antlr-runtime-3.2.jar to the classpath, but that
didn't work.
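
Concretely, the attempt looked roughly like this (PIG_CLASSPATH is the
variable bin/pig consults for extra client-side jars; the path is from my
build tree and the script name is a placeholder):

```shell
# Hypothetical attempt: put the antlr runtime jar on the client classpath
# (this did not fix the mapper-side ClassNotFoundException for me).
export PIG_CLASSPATH=/home/jcoveney/pig/build/ivy/lib/Pig/antlr-runtime-3.2.jar
# ...then run the script as usual:
# pig -x mapreduce myscript.pig
```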

Eventually I got it working by registering the antlr jar manually:

register /home/jcoveney/pig/build/ivy/lib/Pig/antlr-runtime-3.2.jar;

But it doesn't seem like this should be necessary. Is this a bug?