Posted to user@oozie.apache.org by Rohini Palaniswamy <ro...@gmail.com> on 2013/03/01 13:18:21 UTC

Re: NoClassDefFoundError: org/apache/hadoop/hbase/client/HTableInterface

Did you check if hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/hbase-shared/hbase.jar
is a zero-byte file?
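For the copy in HDFS itself, `hadoop fs -ls hdfs://.../user/oozie/hbase-shared/hbase.jar` shows the length in the size column. For the localized copies on a node, a quick scan for zero-byte jars finds the symptom described below; this is a sketch using a temporary directory as a stand-in for the node manager's filecache path:

```python
# Sketch: scan a directory for zero-byte jars (the symptom reported below).
# cache_dir is a stand-in; on the cluster it would be something like
# /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache.
import os
import tempfile

cache_dir = tempfile.mkdtemp()
open(os.path.join(cache_dir, "hbase.jar"), "wb").close()   # simulate a zero-byte localized jar
with open(os.path.join(cache_dir, "good.jar"), "wb") as f:
    f.write(b"PK")                                         # simulate a normal jar

empty_jars = [name for name in os.listdir(cache_dir)
              if name.endswith(".jar")
              and os.path.getsize(os.path.join(cache_dir, name)) == 0]
print(empty_jars)  # ['hbase.jar']
```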

Regards,
Rohini


On Tue, Feb 26, 2013 at 9:18 AM, Jinye Luo <jl...@merkleinc.com> wrote:

> Hmm, found something fishy.
>
> By looking at a mapper's log, I found that all the jar files I am
> interested in are referenced on the classpath, like below.
>
>
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0183/container_1360641830618_0183_01_000002/
>
> Then I looked at that directory: all xml/shell files in the directory are
> normal files, but all jar files are soft links. Then I looked at the actual
> files they link to, and guess what, they are all of size 0, except
> for the application jar! That explains why I have to include the hbase jar in
> my application jar. But WHY?
>
> Here is what I have seen:
>
> ls -l container_1360641830618_0183_01_000002
>
> lrwxrwxrwx 1 yarn yarn   100 Feb 26 11:30 hbase-0.92.1-cdh4.1.1.jar ->
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/3126140985408287414/hbase-0.92.1-cdh4.1.1.jar
> lrwxrwxrwx 1 yarn yarn    89 Feb 26 11:30 zookeeper.jar ->
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5716822313188611937/zookeeper.jar
> ...
>
> ls -l
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5156709486956032280/java-launcher.jar
> -r-xr-xr-x 1 yarn yarn 0 Feb 26 11:46
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5156709486956032280/java-launcher.jar
>
> ls -l
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5716822313188611937/zookeeper.jar
> -r-xr-xr-x 1 yarn yarn 0 Feb 26 11:46
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5716822313188611937/zookeeper.jar
>
> ls -l
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/3126140985408287414/hbase-0.92.1-cdh4.1.1.jar
> -r-xr-xr-x 1 yarn yarn 0 Feb 26 11:46
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/3126140985408287414/hbase-0.92.1-cdh4.1.1.jar
>
>
> -----Original Message-----
> From: Rohini Palaniswamy [mailto:rohini.aditya@gmail.com]
> Sent: Tuesday, February 19, 2013 2:12 PM
> To: user@oozie.apache.org
> Subject: Re: NoClassDefFoundError:
> org/apache/hadoop/hbase/client/HTableInterface
>
> Jinye,
>      The most likely possibility at this point is that the HTableInterface
> class is actually not in the jar
> hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/hbase-shared/hbase.jar.
> Can you copy the jar back to local disk, do a jar -tvf, and confirm that it
> actually contains the class?
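Since a jar is just a zip archive, the same check can also be scripted without the jar tool. A small sketch (the in-memory jar and entry name here are illustrative, standing in for the real hbase.jar):

```python
# Sketch: check whether a jar (a zip archive) contains a class entry,
# equivalent to `jar -tvf hbase.jar | grep HTableInterface`.
import io
import zipfile

# Build a tiny in-memory "jar" containing the class entry we care about.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("org/apache/hadoop/hbase/client/HTableInterface.class", b"")

with zipfile.ZipFile(buf) as jar:
    found = any("HTableInterface" in name for name in jar.namelist())
print(found)  # True
```

To check the real jar, open it by path instead of building one in memory.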
>
>   Another possibility is that an unsubstituted variable in the classpath
> set by YARN kept the jars from being included. In that case, you need to
> go to the node manager that launched the AM, dig out the
> launcher_container.sh script, and check whether the export CLASSPATH
> statement in it throws an error. This is most likely not the case for you,
> as I don't see that happening with oozie.
>
> Regards,
> Rohini
>
>
> On Tue, Feb 19, 2013 at 10:36 AM, Jinye Luo <jl...@merkleinc.com> wrote:
>
> > Rohini,
> >
> > Thanks for pointing out where the Job's configuration is and I did
> > find it. Learned something today!
> >
> > However, the hbase.jar is actually in the config. I am pasting the
> > exact values here.
> >
> > mapreduce.job.cache.files:
> >
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/cloudera/oozie-oozi/0000052-130210101321239-oozie-oozi-W/java-aggregator--java/java-launcher.jar,
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/cloudera/ltd/cr-wf2/lib/cr-1.0-SNAPSHOT.jar,
> > /user/cloudera/ltd/cr-wf2/lib/cr-run-config.xml#cr-run-config.xml,
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/hbase-shared/hbase.jar,
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/hbase-shared/zookeeper.jar,
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/share/lib/oozie/json-simple-1.1.jar
> >
> > mapreduce.job.cache.files.filesizes:
> > 19018,147377,1508,3258651,1301515,16046
> > mapreduce.job.cache.files.timestamps:
> > 1361298149677,1361297463000,1360880488031,1360619188779,1360619228172,1358357320213
> > mapreduce.job.cache.files.visibilities:
> > true,true,true,true,true,true
> >
> > ===============================
> > And here is the error:
> >
> > Error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
> >         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:128)
> >         at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:621)
> >         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:396)
> >         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
> > Caused by: java.lang.reflect.InvocationTargetException
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> >         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:126)
> >         ... 7 more
> > Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/client/HTableInterface
> >         at com.merkleinc.cr.etl.aggregator.OmnitureHitReducerListener.<init>(OmnitureHitReducerListener.java:53)
> >         at com.merkleinc.cr.etl.hadoop.ReduceDispatcher.<init>(ReduceDispatcher.java:18)
> >         at com.merkleinc.cr.etl.hadoop.CRReducer.<init>(CRReducer.java:42)
> >         ... 12 more
> > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.HTableInterface
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >
> >
> > -----Original Message-----
> > From: Rohini Palaniswamy [mailto:rohini.aditya@gmail.com]
> > Sent: Monday, February 18, 2013 8:15 PM
> > To: user@oozie.apache.org
> > Subject: Re: NoClassDefFoundError:
> > org/apache/hadoop/hbase/client/HTableInterface
> >
> > Jinye,
> >      The shared jars of an oozie action are added to hadoop's
> > DistributedCache so that they are available to the map/reduce tasks on
> > their classpath. You can verify that the jars are in the distributed
> > cache of a job by looking at that job's Configuration, in the
> > mapreduce.job.cache.files property.
> > In the yarn RM page, go to the job link. On the left side you will see
> > a Job pane with the following.
> >
> >   Job
> >
> > Overview
> > Counters
> > Configuration
> > Map tasks
> > Reduce tasks
> >
> > Click on the Configuration link. You can then search for "cache" in the
> > key section.
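The property value is a comma-separated list of URIs, optionally carrying a `#linkname` fragment, so the check can also be scripted. A sketch with a shortened, hypothetical value:

```python
# Hypothetical mapreduce.job.cache.files value (shortened for illustration).
cache_files = (
    "hdfs://nn:8020/user/oozie/java-launcher.jar,"
    "/user/cloudera/cr-run-config.xml#cr-run-config.xml,"
    "hdfs://nn:8020/user/oozie/hbase-shared/hbase.jar"
)

# Strip any '#linkname' fragment, then look for the jar of interest.
paths = [entry.split("#")[0] for entry in cache_files.split(",")]
print(any(p.endswith("hbase.jar") for p in paths))  # True
```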
> >
> > Regards,
> > Rohini
> >
> >
> > On Sat, Feb 16, 2013 at 9:46 PM, Jinye Luo <jl...@merkleinc.com> wrote:
> >
> > > Rohini,
> > >
> > > Thanks for your reply.
> > >
> > > I got the java.class.path from the MR driver log. From what I see,
> > > hbase.jar is visible to the driver class, because
> > > "HBaseConfiguration.create()" called from the driver was successful.
> > > But somehow the jar is not visible to the Reducer running on a
> > > different machine.
> > >
> > > Can you give me a bit more detail on how to find the "configuration tab
> > > of the job"? I looked everywhere in the "oozie console" and "yarn
> > > ResourceManager" and was not able to find it.
> > >
> > > Also, do you know if there is any documentation on how to use shared
> > > jars from an oozie action? I got my information from a blog, which
> > > might not be complete.
> > >
> > > Thx.
> > >
> > > Jinye
> > >
> > > -----Original Message-----
> > > From: Rohini Palaniswamy [mailto:rohini.aditya@gmail.com]
> > > Sent: Saturday, February 16, 2013 10:13 PM
> > > To: user@oozie.apache.org
> > > Subject: Re: NoClassDefFoundError:
> > > org/apache/hadoop/hbase/client/HTableInterface
> > >
> > > Jinye,
> > >    This is happening in the ApplicationMaster code, so you will see
> > > the error in the AM logs.
> > >
> > >
> > > http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.0.1-alpha/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/src/main/java/org/apache/hadoop/mapreduce/v2/app/job/impl/JobImpl.java?revision=1439715&view=markup
> > >
> > >    It tries to load the reducer to see if it is a ChainReducer. It is
> > > able to find your reducer class, but loading it fails with
> > > java.lang.NoClassDefFoundError because
> > > org/apache/hadoop/hbase/client/HTableInterface is not in its classpath.
> > >
> > > One thing I can say for sure is that hbase.jar is not in the
> > > classpath. In which log do you see the java.class.path system
> > > property printed? Can you check if hbase.jar is in the
> > > mapreduce.job.cache.files setting of the failed job (search in the
> > > Configuration tab of the job)?
> > >
> > > Regards,
> > > Rohini
> > >
> > >
> > >
> > > On Mon, Feb 11, 2013 at 10:18 PM, Jinye Luo <jl...@merkleinc.com>
> wrote:
> > >
> > > > I figured someone smart out there must have done it, hence asking.
> > > >
> > > > I have an M/R job accessing HBase, launched by oozie. The
> > > > job succeeds if I put all dependencies in one big jar; however,
> > > > everyone understands it is not a good idea to deploy hadoop/hbase
> > > > files along with application jars. Hence I am trying to configure
> > > > it to use the jars in some shared fashion, but with no luck.
> > > >
> > > > First I put hbase.jar and zookeeper.jar in /user/oozie/share/lib,
> > > > but from the log, it is clear that hbase.jar is just not visible on
> > > > the classpath.
> > > >
> > > > Then I tried a second option to take advantage of libpath by
> > > > creating a directory "/user/oozie/hbase-shared" in hdfs and
> > > > putting the two jars there. And I pointed oozie.libpath to the dir
> > > > as "oozie.libpath=${nameNode}/user/oozie/hbase-shared" in
> > > > job.properties
> > > file.
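For reference, the relevant job.properties lines described here would look roughly like this (the nameNode value is the one appearing elsewhere in this thread; whether further properties are needed depends on the Oozie version, so treat this as a sketch):

```properties
nameNode=hdfs://hqitrtd91.pclc0.merkle.local:8020
oozie.libpath=${nameNode}/user/oozie/hbase-shared
```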
> > > > This works better: the call to "HBaseConfiguration.create()"
> > > > from my driver was successful, and since HBaseConfiguration is a
> > > > class in hbase.jar, I know hbase.jar is indeed visible to the driver
> > > > class. However, somewhere down the line I still received a
> > > > "NoClassDefFoundError: org/apache/hadoop/hbase/client/HTableInterface",
> > > > and I only use HTableInterface in the reducer.
> > > >
> > > > So any idea please? Also, please let me know if there is a better
> > > > way to do it.
> > > >
> > > > And here are the system properties I found in the log, which clearly
> > > > show hbase.jar is on the classpath.
> > > >
> > > >
> > > > #Tue Feb 12 00:22:20 EST 2013
> > > > java.runtime.name=Java(TM) SE Runtime Environment
> > > >
> > > > oozie.action.externalChildIDs.properties=/var/lib/hadoop-yarn/cach
> > > > e/
> > > > ya
> > > > rn/nm-local-dir/usercache/cloudera/appcache/application_1360641830
> > > > 61
> > > > 8_
> > > > 0011/container_1360641830618_0011_01_000002/externalChildIds.prope
> > > > rt
> > > > ie
> > > > s
> > > > sun.boot.library.path=/usr/java/jdk1.6.0_31/jre/lib/amd64
> > > > java.vm.version=20.6-b01
> > > > hadoop.root.logger=INFO,CLA
> > > > java.vm.vendor=Sun Microsystems Inc.
> > > > java.vendor.url=http\://java.sun.com/
> > > > path.separator=\:
> > > > java.vm.name=Java HotSpot(TM) 64-Bit Server VM
> > > > file.encoding.pkg=sun.io sun.java.launcher=SUN_STANDARD
> > > > user.country=US sun.os.patch.level=unknown
> > > > java.vm.specification.name=Java Virtual Machine Specification
> > > >
> > > > user.dir=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cl
> > > > ou
> > > > de
> > > > ra/appcache/application_1360641830618_0011/container_1360641830618
> > > > _0
> > > > 01
> > > > 1_01_000002
> > > > java.runtime.version=1.6.0_31-b04
> > > > java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment
> > > > java.endorsed.dirs=/usr/java/jdk1.6.0_31/jre/lib/endorsed
> > > > os.arch=amd64
> > > > oozie.job.id=0000015-130210101321239-oozie-oozi-W
> > > > oozie.action.id=0000015-130210101321239-oozie-oozi-W@java-aggregat
> > > > or
> > > >
> > > > java.io.tmpdir=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/userca
> > > > ch
> > > > e/
> > > > cloudera/appcache/application_1360641830618_0011/container_1360641
> > > > 83
> > > > 06
> > > > 18_0011_01_000002/tmp
> > > > line.separator=\n
> > > >
> > > > oozie.action.output.properties=/var/lib/hadoop-yarn/cache/yarn/nm-
> > > > lo
> > > > ca
> > > > l-dir/usercache/cloudera/appcache/application_1360641830618_0011/c
> > > > on ta iner_1360641830618_0011_01_000002/output.properties
> > > > java.vm.specification.vendor=Sun Microsystems Inc.
> > > > os.name=Linux
> > > > log4j.configuration=container-log4j.properties
> > > > sun.jnu.encoding=ANSI_X3.4-1968
> > > >
> > > > java.library.path=/usr/java/jdk1.6.0_31/jre/lib/amd64/server\:/usr
> > > > /j
> > > > av
> > > > a/jdk1.6.0_31/jre/lib/amd64\:/usr/java/jdk1.6.0_31/jre/../lib/amd6
> > > > 4\
> > > > :/
> > > > var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/app
> > > > ca
> > > > ch
> > > > e/application_1360641830618_0011/container_1360641830618_0011_01_0
> > > > 00
> > > > 00
> > > > 2\:/usr/lib/hadoop/lib/native\:/usr/java/packages/lib/amd64\:/usr/
> > > > li
> > > > b6
> > > > 4\:/lib64\:/lib\:/usr/lib
> > > > yarn.app.mapreduce.container.log.filesize=0
> > > >
> > > > oozie.action.conf.xml=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir
> > > > /u
> > > > se
> > > > rcache/cloudera/appcache/application_1360641830618_0011/container_
> > > > 13
> > > > 60
> > > > 641830618_0011_01_000002/action.xml
> > > > hadoop.metrics.log.level=WARN
> > > > java.specification.name=Java Platform API Specification
> > > > java.class.version=50.0
> > > > java.net.preferIPv4Stack=true
> > > > sun.management.compiler=HotSpot 64-Bit Tiered Compilers
> > > > os.version=2.6.32-220.23.1.el6.x86_64
> > > >
> > > > yarn.app.mapreduce.container.log.dir=/var/log/hadoop-yarn/containe
> > > > rs
> > > > /a
> > > > pplication_1360641830618_0011/container_1360641830618_0011_01_0000
> > > > 02
> > > > user.home=/var/lib/hadoop-yarn
> > > > user.timezone=America/New_York
> > > > java.awt.printerjob=sun.print.PSPrinterJob
> > > > file.encoding=ANSI_X3.4-1968
> > > > java.specification.version=1.6
> > > >
> > > > java.class.path=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/userc
> > > > ac
> > > > he
> > > > /cloudera/appcache/application_1360641830618_0011/container_136064
> > > > 18
> > > > 30
> > > > 618_0011_01_000002\:/etc/hadoop/conf\:/usr/lib/hadoop/hadoop-annot
> > > > at
> > > > io
> > > > ns-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop/hadoop-auth.jar\:/usr/lib/h
> > > > ad
> > > > oo
> > > > p/hadoop-common-2.0.0-cdh4.1.1-tests.jar\:/usr/lib/hadoop/hadoop-a
> > > > ut
> > > > h-
> > > >
> 2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop/hadoop-common-2.0.0-cdh4.1.1.jar\:
> > > >
> /usr/lib/hadoop/hadoop-annotations.jar\:/usr/lib/hadoop/hadoop-common.
> > > > jar\:/usr/lib/hadoop/lib/log4j-1.2.17.jar\:/usr/lib/hadoop/lib/com
> > > > mo
> > > > ns
> > > > -logging-1.1.1.jar\:/usr/lib/hadoop/lib/servlet-api-2.5.jar\:/usr/
> > > > li
> > > > b/
> > > > hadoop/lib/jline-0.9.94.jar\:/usr/lib/hadoop/lib/commons-configura
> > > > ti
> > > > on
> > > > -1.6.jar\:/usr/lib/hadoop/lib/jersey-json-1.8.jar\:/usr/lib/hadoop
> > > > /l
> > > > ib
> > > > /commons-httpclient-3.1.jar\:/usr/lib/hadoop/lib/jaxb-api-2.2.2.ja
> > > > r\
> > > > :/
> > > > usr/lib/hadoop/lib/jetty-6.1.26.cloudera.2.jar\:/usr/lib/hadoop/li
> > > > b/
> > > > zo
> > > > okeeper-3.4.3-cdh4.1.1.jar\:/usr/lib/hadoop/lib/commons-cli-1.2.ja
> > > > r\
> > > > :/
> > > > usr/lib/hadoop/lib/jersey-server-1.8.jar\:/usr/lib/hadoop/lib/hue-
> > > > pl
> > > > ug
> > > > ins-2.1.0-cdh4.1.1.jar\:/usr/lib/hadoop/lib/commons-el-1.0.jar\:/u
> > > > sr
> > > > /l
> > > > ib/hadoop/lib/jackson-core-asl-1.8.8.jar\:/usr/lib/hadoop/lib/jsp-
> > > > ap
> > > > i-
> > > > 2.1.jar\:/usr/lib/hadoop/lib/commons-net-3.1.jar\:/usr/lib/hadoop/
> > > > li
> > > > b/
> > > > jasper-compiler-5.5.23.jar\:/usr/lib/hadoop/lib/jettison-1.1.jar\:
> > > > /u
> > > > sr
> > > > /lib/hadoop/lib/jsr305-1.3.9.jar\:/usr/lib/hadoop/lib/paranamer-2.3.
> > > > ja
> > > > r\:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar\:/usr/lib/hadoop/lib/
> > > > co
> > > > mm
> > > > ons-beanutils-1.7.0.jar\:/usr/lib/hadoop/lib/stax-api-1.0.1.jar\:/
> > > > us
> > > > r/
> > > >
> lib/hadoop/lib/jets3t-0.6.1.jar\:/usr/lib/hadoop/lib/jasper-runtime-5.
> > > > 5.23.jar\:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar\:/usr/lib/hado
> > > > op
> > > > /l
> > > > ib/protobuf-java-2.4.0a.jar\:/usr/lib/hadoop/lib/jackson-xc-1.8.8.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop/lib/commons-digester-1.8.jar\:/usr/lib/hadoop/lib
> > > > /j
> > > > et
> > > > ty-util-6.1.26.cloudera.2.jar\:/usr/lib/hadoop/lib/activation-1.1.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar\:/usr/lib/hadoop/lib/
> > > > sl
> > > > f4
> > > > j-api-1.6.1.jar\:/usr/lib/hadoop/lib/avro-1.7.1.cloudera.2.jar\:/u
> > > > sr
> > > > /l
> > > > ib/hadoop/lib/commons-beanutils-core-1.8.0.jar\:/usr/lib/hadoop/li
> > > > b/
> > > > ja
> > > > ckson-mapper-asl-1.8.8.jar\:/usr/lib/hadoop/lib/jsch-0.1.42.jar\:/
> > > > us
> > > > r/
> > > >
> lib/hadoop/lib/kfs-0.3.jar\:/usr/lib/hadoop/lib/commons-math-2.1.jar\:
> > > > /usr/lib/hadoop/lib/xmlenc-0.52.jar\:/usr/lib/hadoop/lib/commons-c
> > > > ol
> > > > le
> > > > ctions-3.2.1.jar\:/usr/lib/hadoop/lib/guava-11.0.2.jar\:/usr/lib/h
> > > > ad
> > > > oo
> > > > p/lib/commons-io-2.1.jar\:/usr/lib/hadoop/lib/junit-4.8.2.jar\:/us
> > > > r/
> > > > li
> > > > b/hadoop/lib/jackson-jaxrs-1.8.8.jar\:/usr/lib/hadoop/lib/asm-3.2.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop/lib/commons-codec-1.4.jar\:/usr/lib/hadoop/lib/co
> > > > mm
> > > > on
> > > > s-lang-2.5.jar\:/usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar\:/usr/
> > > > li
> > > > b/
> > > >
> hadoop/lib/jersey-core-1.8.jar\:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar\:
> > > > /usr/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.1.1-tests.jar\:/usr/li
> > > > b/
> > > > ha
> > > > doop-hdfs/hadoop-hdfs-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-hdfs/lib
> > > > /l
> > > > og
> > > > 4j-1.2.17.jar\:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.1.jar\
> > > > :/
> > > > us
> > > > r/lib/hadoop-hdfs/lib/servlet-api-2.5.jar\:/usr/lib/hadoop-hdfs/li
> > > > b/
> > > > jl
> > > >
> ine-0.9.94.jar\:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.2.jar\:
> > > > /usr/lib/hadoop-hdfs/lib/zookeeper-3.4.3-cdh4.1.1.jar\:/usr/lib/ha
> > > > do
> > > > op
> > > > -hdfs/lib/commons-daemon-1.0.3.jar\:/usr/lib/hadoop-hdfs/lib/commo
> > > > ns
> > > > -c
> > > > li-1.2.jar\:/usr/lib/hadoop-hdfs/lib/jersey-server-1.8.jar\:/usr/l
> > > > ib
> > > > /h
> > > > adoop-hdfs/lib/commons-el-1.0.jar\:/usr/lib/hadoop-hdfs/lib/jackso
> > > > n-
> > > > co
> > > > re-asl-1.8.8.jar\:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar\:/usr/l
> > > > ib
> > > > /h
> > > > adoop-hdfs/lib/jsr305-1.3.9.jar\:/usr/lib/hadoop-hdfs/lib/jasper-r
> > > > un
> > > > ti
> > > > me-5.5.23.jar\:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.4.0a.jar\:
> > > > /u
> > > > sr
> > > > /lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.2.jar\:/usr/lib/ha
> > > > do
> > > > op
> > > > -hdfs/lib/jackson-mapper-asl-1.8.8.jar\:/usr/lib/hadoop-hdfs/lib/x
> > > > ml
> > > > en
> > > > c-0.52.jar\:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar\:/usr/lib/ha
> > > > do
> > > > op
> > > > -hdfs/lib/commons-io-2.1.jar\:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar\:
> > > > /u
> > > > sr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar\:/usr/lib/hadoop-hdfs
> > > > /l
> > > > ib
> > > > /commons-lang-2.5.jar\:/usr/lib/hadoop-hdfs/lib/jersey-core-1.8.ja
> > > > r\
> > > > :/
> > > > usr/lib/hadoop-mapreduce/hadoop-extras-2.0.0-cdh4.1.1.jar\:/usr/li
> > > > b/
> > > > ha
> > > > doop-mapreduce/hadoop-distcp-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-m
> > > > ap
> > > > re
> > > > duce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.1.1-tests.jar\:/
> > > > us
> > > > r/
> > > > lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar\:/usr/lib/hadoo
> > > > p-
> > > > ma
> > > > preduce/hadoop-streaming-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapre
> > > > du
> > > > ce
> > > > /hadoop-mapreduce-client-app.jar\:/usr/lib/hadoop-mapreduce/hadoop
> > > > -d
> > > > at
> > > > ajoin-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapreduce/hadoop-mapredu
> > > > ce
> > > > -c
> > > > lient-shuffle.jar\:/usr/lib/hadoop-mapreduce/hadoop-extras.jar\:/u
> > > > sr
> > > > /l
> > > > ib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.1
> > > > .1
> > > > .j
> > > > ar\:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar\:/usr/lib/hadoop-m
> > > > ap
> > > > re
> > > > duce/hadoop-mapreduce-client-shuffle-2.0.0-cdh4.1.1.jar\:/usr/lib/
> > > > ha
> > > > do
> > > > op-mapreduce/hadoop-rumen.jar\:/usr/lib/hadoop-mapreduce/hadoop-ma
> > > > pr
> > > > ed
> > > > uce-examples-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapreduce/hadoop-
> > > > ar
> > > > ch
> > > > ives.jar\:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.0.0-cdh4.1.1.ja
> > > > r\
> > > > :/
> > > > usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar\:/usr/li
> > > > b/
> > > > ha
> > > > doop-mapreduce/hadoop-gridmix.jar\:/usr/lib/hadoop-mapreduce/hadoo
> > > > p-
> > > > ma
> > > > preduce-client-app-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapreduce/h
> > > > ad
> > > > oo
> > > > p-mapreduce-client-common.jar\:/usr/lib/hadoop-mapreduce/hadoop-ar
> > > > ch
> > > > iv
> > > >
> es-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar\:
> > > > /usr/lib/hadoop-mapreduce/hadoop-streaming.jar\:/usr/lib/hadoop-ma
> > > > pr
> > > > ed
> > > > uce/hadoop-mapreduce-client-common-2.0.0-cdh4.1.1.jar\:/usr/lib/ha
> > > > do
> > > > op
> > > > -mapreduce/hadoop-gridmix-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapr
> > > > ed
> > > > uc
> > > > e/hadoop-mapreduce-client-core-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop
> > > > -m
> > > > ap
> > > > reduce/hadoop-mapreduce-client-hs-2.0.0-cdh4.1.1.jar\:/usr/lib/had
> > > > oo
> > > > p-
> > > > mapreduce/hadoop-mapreduce-client-hs.jar\:/usr/lib/hadoop-mapreduc
> > > > e/
> > > > ha
> > > > doop-mapreduce-client-jobclient.jar\:/usr/lib/hadoop-mapreduce/lib
> > > > /l
> > > > og
> > > > 4j-1.2.17.jar\:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar\:/
> > > > us
> > > > r/
> > > > lib/hadoop-mapreduce/lib/netty-3.2.4.Final.jar\:/usr/lib/hadoop-ma
> > > > pr
> > > > ed
> > > > uce/lib/jersey-server-1.8.jar\:/usr/lib/hadoop-mapreduce/lib/jacks
> > > > on
> > > > -c
> > > > ore-asl-1.8.8.jar\:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar\:
> > > > /u
> > > > sr/lib/hadoop-mapreduce/lib/jersey-guice-1.8.jar\:/usr/lib/hadoop-
> > > > ma
> > > > pr
> > > > educe/lib/protobuf-java-2.4.0a.jar\:/usr/lib/hadoop-mapreduce/lib/
> > > > gu
> > > > ic
> > > >
> e-servlet-3.0.jar\:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar\:
> > > > /usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar\:/usr/lib/ha
> > > > do
> > > > op
> > > > -mapreduce/lib/avro-1.7.1.cloudera.2.jar\:/usr/lib/hadoop-mapreduc
> > > > e/
> > > > li
> > > >
> b/guice-3.0.jar\:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.
> > > > 8.jar\:/usr/lib/hadoop-mapreduce/lib/commons-io-2.1.jar\:/usr/lib/
> > > > ha
> > > > do
> > > > op-mapreduce/lib/asm-3.2.jar\:/usr/lib/hadoop-mapreduce/lib/jersey
> > > > -c
> > > > or
> > > >
> e-1.8.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.0.0-cdh4.1.
> > > >
> 1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.0.0-cdh4.
> > > >
> 1.1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar\:
> > > > /usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar\:
> > > > /u
> > > > sr/lib/hadoop-yarn/hadoop-yarn-api.jar\:/usr/lib/hadoop-yarn/hadoo
> > > > p-
> > > > ya
> > > > rn-api-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server
> > > > -t
> > > > es
> > > > ts-2.0.0-cdh4.1.1-tests.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-site
> > > > -2
> > > > .0
> > > > .0-cdh4.1.1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy
> > > > -2
> > > > .0
> > > > .0-cdh4.1.1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanag
> > > > er
> > > > .j
> > > > ar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.0.0-cdh4.1.1.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar\:/usr/lib/h
> > > > ad
> > > > oo
> > > > p-yarn/hadoop-yarn-applications-distributedshell-2.0.0-cdh4.1.1.ja
> > > > r\
> > > > :/
> > > >
> usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.0.0-cdh4.1.1.
> > > > jar\:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.0.0-cdh4.1.1.jar\:/
> > > > us
> > > > r/
> > > > lib/hadoop-yarn/hadoop-yarn-site.jar\:/usr/lib/hadoop-yarn/hadoop-
> > > > ya
> > > > rn
> > > > -server-tests.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop-yarn/hadoop-yarn-common.jar\:/usr/lib/hadoop-yarn
> > > > /l
> > > > ib
> > > > /log4j-1.2.17.jar\:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar\:/u
> > > > sr
> > > > /l
> > > > ib/hadoop-yarn/lib/netty-3.2.4.Final.jar\:/usr/lib/hadoop-yarn/lib
> > > > /j
> > > > er
> > > > sey-server-1.8.jar\:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.
> > > > ja
> > > > r\:/usr/lib/hadoop-yarn/lib/paranamer-2.3.jar\:/usr/lib/hadoop-yar
> > > > n/
> > > > li
> > > >
> b/jersey-guice-1.8.jar\:/usr/lib/hadoop-yarn/lib/protobuf-java-2.4.0a.
> > > > jar\:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar\:/usr/lib/hado
> > > > op
> > > > -y
> > > >
> arn/lib/aopalliance-1.0.jar\:/usr/lib/hadoop-yarn/lib/snappy-java-1.0.
> > > > 4.1.jar\:/usr/lib/hadoop-yarn/lib/avro-1.7.1.cloudera.2.jar\:/usr/
> > > > li
> > > > b/
> > > > hadoop-yarn/lib/guice-3.0.jar\:/usr/lib/hadoop-yarn/lib/jackson-ma
> > > > pp
> > > > er
> > > > -asl-1.8.8.jar\:/usr/lib/hadoop-yarn/lib/commons-io-2.1.jar\:/usr/
> > > > li
> > > > b/
> > > >
> hadoop-yarn/lib/asm-3.2.jar\:/usr/lib/hadoop-yarn/lib/jersey-core-1.8.
> > > > jar\:job.jar/\:job.jar/classes/\:job.jar/lib/*\:/var/lib/hadoop-ya
> > > > rn
> > > > /c
> > > > ache/yarn/nm-local-dir/usercache/cloudera/appcache/application_136
> > > > 06
> > > > 41
> > > > 830618_0011/container_1360641830618_0011_01_000002/zookeeper.jar\:
> > > > /v
> > > > ar
> > > > /lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcac
> > > > he
> > > > /a
> > > > pplication_1360641830618_0011/container_1360641830618_0011_01_0000
> > > > 02
> > > > /j
> > > > son-simple-1.1.jar\:/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/u
> > > > se
> > > > rc
> > > > ache/cloudera/appcache/application_1360641830618_0011/container_13
> > > > 60
> > > > 64
> > > > 1830618_0011_01_000002/java-launcher.jar\:/var/lib/hadoop-yarn/cac
> > > > he
> > > > /y
> > > > arn/nm-local-dir/usercache/cloudera/appcache/application_136064183
> > > > 06
> > > > 18
> > > > _0011/container_1360641830618_0011_01_000002/cr-1.0-SNAPSHOT.jar\:
> > > > /v
> > > > ar
> > > > /lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcac
> > > > he
> > > > /a
> > > > pplication_1360641830618_0011/container_1360641830618_0011_01_0000
> > > > 02
> > > > /h
> > > > base.jar
> > > > user.name=yarn
> > > >
> > > > oozie.action.newId.properties=/var/lib/hadoop-yarn/cache/yarn/nm-l
> > > > oc
> > > > al
> > > > -dir/usercache/cloudera/appcache/application_1360641830618_0011/co
> > > > nt ai ner_1360641830618_0011_01_000002/newId.properties
> > > > java.vm.specification.version=1.0
> > > > sun.java.command=org.apache.hadoop.mapred.YarnChild 192.168.77.82
> > > > 37020
> > > > attempt_1360641830618_0011_m_000000_0 2
> > > > java.home=/usr/java/jdk1.6.0_31/jre
> > > > sun.arch.data.model=64
> > > > user.language=en
> > > > java.specification.vendor=Sun Microsystems Inc.
> > > > java.vm.info=mixed mode
> > > > java.version=1.6.0_31
> > > > java.ext.dirs=/usr/java/jdk1.6.0_31/jre/lib/ext\:/usr/java/package
> > > > s/
> > > > li
> > > > b/ext
> > > >
> > > > sun.boot.class.path=/usr/java/jdk1.6.0_31/jre/lib/resources.jar\:/
> > > > us
> > > > r/
> > > > java/jdk1.6.0_31/jre/lib/rt.jar\:/usr/java/jdk1.6.0_31/jre/lib/sun
> > > > rs
> > > > as
> > > > ign.jar\:/usr/java/jdk1.6.0_31/jre/lib/jsse.jar\:/usr/java/jdk1.6.
> > > > 0_
> > > > 31
> > > > /jre/lib/jce.jar\:/usr/java/jdk1.6.0_31/jre/lib/charsets.jar\:/usr
> > > > /j
> > > > av
> > > > a/jdk1.6.0_31/jre/lib/modules/jdk.boot.jar\:/usr/java/jdk1.6.0_31/
> > > > jr
> > > > e/
> > > > classes
> > > > java.vendor=Sun Microsystems Inc.
> > > > file.separator=/
> > > > oozie.launcher.job.id=job_1360641830618_0011
> > > >
> > > > oozie.action.stats.properties=/var/lib/hadoop-yarn/cache/yarn/nm-l
> > > > oc
> > > > al
> > > > -dir/usercache/cloudera/appcache/application_1360641830618_0011/co
> > > > nt ai ner_1360641830618_0011_01_000002/stats.properties
> > > > java.vendor.url.bug=http\://java.sun.com/cgi-bin/bugreport.cgi
> > > > sun.io.unicode.encoding=UnicodeLittle
> > > > sun.cpu.endian=little
> > > > sun.cpu.isalist=
> > > >
> > > > And here is the stacktrace:
> > > >
> > > > 2013-02-12 00:59:15,743 INFO [main]
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled
> > > > snapshot period at 10 second(s).
> > > > 2013-02-12 00:59:15,743 INFO [main]
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MRAppMaster
> > > > metrics system started
> > > > 2013-02-12 00:59:15,762 INFO [main]
> > > > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Adding job
> > > > token for
> > > > job_1360641830618_0015 to jobTokenSecretManager
> > > > 2013-02-12 00:59:15,967 FATAL [main]
> > > > org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
> > > > java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/client/HTableInterface
> > > >         at java.lang.Class.forName0(Native Method)
> > > >         at java.lang.Class.forName(Class.java:169)
> > > >         at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.isChainJob(JobImpl.java:918)
> > > >         at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.makeUberDecision(JobImpl.java:849)
> > > >         at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.access$2600(JobImpl.java:114)
> > > >         at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl$InitTransition.transition(JobImpl.java:995)
> > > >         at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl$InitTransition.transition(JobImpl.java:938)
> > > >         at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:380)
> > > >         at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:298)
> > > >         at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:43)
> > > >         at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:443)
> > > >         at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.handle(JobImpl.java:666)
> > > >         at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.handle(JobImpl.java:113)
> > > >         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobEventDispatcher.handle(MRAppMaster.java:890)
> > > >         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.start(MRAppMaster.java:840)
> > > >         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.run(MRAppMaster.java:1050)
> > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > >         at javax.security.auth.Subject.doAs(Subject.java:396)
> > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> > > >         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1046)
> > > >         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1010)
> > > > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.HTableInterface
> > > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > >         ... 21 more
> > > > 2013-02-12 00:59:15,972 INFO [Thread-1]
> > > > org.apache.hadoop.mapreduce.v2.app.MRAppMaster: MRAppMaster received a signal. Signaling RMCommunicator and JobHistoryEventHandler.
> > > >
> > > >
> > >
> > >
> >
> >
>
>

RE: NoClassDefFoundError: org/apache/hadoop/hbase/client/HTableInterface

Posted by Frank Luo <jl...@merkleinc.com>.
Ok, I have found the cause. 

The method TableMapReduceUtil.addDependencyJars(job) doesn't guarantee that the HBase jar (the one containing HTableInterface) is added to the Map/Reduce task classpath, hence the error, although its javadoc suggests it does. 

Here is the line I added to take care of it:
            TableMapReduceUtil.addDependencyJars(job.getConfiguration(), HTableInterface.class);

Thanks to Rohini and the others for spending time on this issue.
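For anyone hitting the same thing, here is a minimal sketch of what the job setup ends up looking like with that extra call. This is not the actual code from my workflow: the class name JobSetup and the job name "cr-aggregator" are placeholders, and only the HBase/Hadoop API calls are real.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTableInterface;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class JobSetup {
    public static Job createJob() throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "cr-aggregator");

        // Adds the jars of the classes the job itself references
        // (mapper, reducer, input/output formats). As discussed above,
        // this alone did not get the HBase jar onto the task classpath.
        TableMapReduceUtil.addDependencyJars(job);

        // Explicitly ship the jar that contains HTableInterface (and any
        // other class whose containing jar must reach the tasks).
        TableMapReduceUtil.addDependencyJars(job.getConfiguration(),
                HTableInterface.class);

        return job;
    }
}
```

The second overload takes a list of classes and adds each class's containing jar to the job's distributed cache, which is what finally made the reducer see HTableInterface.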


-----Original Message-----
From: Rohini Palaniswamy [mailto:rohini.aditya@gmail.com] 
Sent: Friday, March 01, 2013 6:18 AM
To: user@oozie.apache.org
Subject: Re: NoClassDefFoundError: org/apache/hadoop/hbase/client/HTableInterface

Did you check if hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/hbase-shared/hbase.jar is a zero byte file?

Regards,
Rohini


On Tue, Feb 26, 2013 at 9:18 AM, Jinye Luo <jl...@merkleinc.com> wrote:

> Hmm, found something fishy.
>
> Looking at a mapper's log, I found that all the jar files I am
> interested in are referenced in the classpath like below.
>
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0183/container_1360641830618_0183_01_000002/
>
> Then I looked at that directory: all the xml/shell files in it are
> normal files, but all the jar files are soft links. Then I looked at the
> actual files they link to, and guess what: they are all of size 0,
> except for the application jar! That explains why I have to include
> the hbase jar in my application jar. But WHY?
>
> Here is what I have seen:
>
> ls -l container_1360641830618_0183_01_000002
>
> lrwxrwxrwx 1 yarn yarn   100 Feb 26 11:30 hbase-0.92.1-cdh4.1.1.jar -> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/3126140985408287414/hbase-0.92.1-cdh4.1.1.jar
> lrwxrwxrwx 1 yarn yarn    89 Feb 26 11:30 zookeeper.jar -> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5716822313188611937/zookeeper.jar
> ...
>
> ls -l /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5156709486956032280/java-launcher.jar
> -r-xr-xr-x 1 yarn yarn 0 Feb 26 11:46 /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5156709486956032280/java-launcher.jar
>
> ls -l /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5716822313188611937/zookeeper.jar
> -r-xr-xr-x 1 yarn yarn 0 Feb 26 11:46 /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/-5716822313188611937/zookeeper.jar
>
> ls -l /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/3126140985408287414/hbase-0.92.1-cdh4.1.1.jar
> -r-xr-xr-x 1 yarn yarn 0 Feb 26 11:46 /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/3126140985408287414/hbase-0.92.1-cdh4.1.1.jar
>
>
> -----Original Message-----
> From: Rohini Palaniswamy [mailto:rohini.aditya@gmail.com]
> Sent: Tuesday, February 19, 2013 2:12 PM
> To: user@oozie.apache.org
> Subject: Re: NoClassDefFoundError:
> org/apache/hadoop/hbase/client/HTableInterface
>
> Jinye,
>      The most likely possibility at this point is that the HTableInterface class is actually not in the jar hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/hbase-shared/hbase.jar. Can you copy the jar back to local disk, do a jar -tvf, and confirm that it actually contains the class?
>
>   Another possibility is that if there is some unsubstituted variable in the classpath set by YARN, the jars will not get included. In that case, you need to go to the node manager that launched the AM, dig out the launcher_container.sh script, and check that the export CLASSPATH statement in it does not throw an error. This is most likely not the case for you, as I don't see that happening with oozie.
>
> Regards,
> Rohini
>
>
> On Tue, Feb 19, 2013 at 10:36 AM, Jinye Luo <jl...@merkleinc.com> wrote:
>
> > Rohini,
> >
> > Thanks for pointing out where the Job's configuration is and I did 
> > find it. Learned something today!
> >
> > However, the hbase.jar is actually in the config. I am pasting the 
> > exact values here.
> >
> > mapreduce.job.cache.files:
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/cloudera/oozie-oozi/0000052-130210101321239-oozie-oozi-W/java-aggregator--java/java-launcher.jar,
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/cloudera/ltd/cr-wf2/lib/cr-1.0-SNAPSHOT.jar,
> > /user/cloudera/ltd/cr-wf2/lib/cr-run-config.xml#cr-run-config.xml,
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/hbase-shared/hbase.jar,
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/hbase-shared/zookeeper.jar,
> > hdfs://hqitrtd91.pclc0.merkle.local:8020/user/oozie/share/lib/oozie/json-simple-1.1.jar
> >
> > mapreduce.job.cache.files.filesizes: 19018,147377,1508,3258651,1301515,16046
> > mapreduce.job.cache.files.timestamps: 1361298149677,1361297463000,1360880488031,1360619188779,1360619228172,1358357320213
> > mapreduce.job.cache.files.visibilities: true,true,true,true,true,true
> >
> > ===============================
> > And here is the error:
> >
> > Error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
> >         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:128)
> >         at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:621)
> >         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:396)
> >         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
> > Caused by: java.lang.reflect.InvocationTargetException
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> >         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:126)
> >         ... 7 more
> > Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/client/HTableInterface
> >         at com.merkleinc.cr.etl.aggregator.OmnitureHitReducerListener.<init>(OmnitureHitReducerListener.java:53)
> >         at com.merkleinc.cr.etl.hadoop.ReduceDispatcher.<init>(ReduceDispatcher.java:18)
> >         at com.merkleinc.cr.etl.hadoop.CRReducer.<init>(CRReducer.java:42)
> >         ... 12 more
> > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.HTableInterface
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >
> >
> > -----Original Message-----
> > From: Rohini Palaniswamy [mailto:rohini.aditya@gmail.com]
> > Sent: Monday, February 18, 2013 8:15 PM
> > To: user@oozie.apache.org
> > Subject: Re: NoClassDefFoundError:
> > org/apache/hadoop/hbase/client/HTableInterface
> >
> > Jinye,
> >      The shared jars of oozie action are added to the hadoop's 
> > DistributedCache so that it is available for the map/reduce tasks in 
> > their classpath. You can verify if the jars are in the distributed 
> > cache of a job by looking at that Job's Configuration
>  mapreduce.job.cache.files property.
> > In the yarn RM page, go to the job link. On the left side you will 
> > see a Job pane with the following.
> >
> >   Job
> >
> > Overview
> > Counters
> > Configuration
> > Map tasks
> > Reduce tasks
> >
> > Click on the  Configuration link. You can then search for cache in 
> > the key section.
> >
> > Regards,
> > Rohini
> >
> >
> > On Sat, Feb 16, 2013 at 9:46 PM, Jinye Luo <jl...@merkleinc.com> wrote:
> >
> > > Rohini,
> > >
> > > Thanks for your reply.
> > >
> > > I got the java.class.path from the MR driver log. From what I see,
> > > hbase.jar is visible to the driver class, because
> > > "HBaseConfiguration.create()" called from the driver was successful.
> > > But somehow the jar is not visible to the Reducer running on a
> > > different machine.
> > >
> > > Can you give me a bit detail how to find "configuration tab of the 
> > > job"? I looked everywhere in "oozie console" and "yarn 
> > > ResourceManager" and not able to find it.
> > >
> > > Also, do you know if there is any documentation on how to use
> > > shared jars from an oozie action? I got my information from a blog,
> > > which might not be complete.
> > >
> > > Thx.
> > >
> > > Jinye
> > >
> > > -----Original Message-----
> > > From: Rohini Palaniswamy [mailto:rohini.aditya@gmail.com]
> > > Sent: Saturday, February 16, 2013 10:13 PM
> > > To: user@oozie.apache.org
> > > Subject: Re: NoClassDefFoundError:
> > > org/apache/hadoop/hbase/client/HTableInterface
> > >
> > > Jinye,
> > >    This is happening in ApplicationMaster code and you will be 
> > > seeing the error in AM logs.
> > >
> > >
> > > http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.0.1-alpha/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/src/main/java/org/apache/hadoop/mapreduce/v2/app/job/impl/JobImpl.java?revision=1439715&view=markup
> > >
> > >    It tries to load the reducer to see if it is a ChainReducer. It 
> > > is able to find your reducer class, but while trying to load it, 
> > > it failed with java.lang.NoClassDefFoundError as 
> > > org/apache/hadoop/hbase/client/HTableInterface
> > > is not in its classpath.
> > >
> > > One thing I can say for sure is that hbase.jar is not in the 
> > > classpath. In which log do you see the java.class.path system 
> > > property printed? Can you check if hbase.jar is in the 
> > > mapreduce.job.cache.files setting of the failed job (Search in the
> > Configuration tab of the job).
> > >
> > > Regards,
> > > Rohini
> > >
> > >
> > >
> > > On Mon, Feb 11, 2013 at 10:18 PM, Jinye Luo <jl...@merkleinc.com>
> wrote:
> > >
> > > > I figured someone smart out there must have done it, hence asking.
> > > >
> > > > I have a M/R job accessing HBase, and it is launched by oozie.
> > > > The job is successful if I put all dependencies in a big jar;
> > > > however, everyone understands it is not a good idea to deploy
> > > > hadoop/hbase files along with application jars. Hence I am
> > > > trying to configure it to use the jars in some shared fashion, but with no luck.
> > > >
> > > > First I put hbase.jar and zookeeper.jar in 
> > > > /user/oozie/share/lib, but from the log, it is clear that 
> > > > hbase.jar is just not visible in the
> > > classpath.
> > > >
> > > > Then I tried a second option to take advantage of libpath by 
> > > > creating a directory "/user/oozie/hbase-shared" in hdfs and 
> > > > putting the two jars there. And I pointed oozie.libpath to the 
> > > > dir as "oozie.libpath=${nameNode}/user/oozie/hbase-shared" in 
> > > > job.properties
> > > file.
> > > > It is doing better because the call to "HBaseConfiguration.create()"
> > > > from my driver was successful and HBaseConfiguration is a class 
> > > > in hbase.jar, so I know hbase.jar is indeed visible to the driver class.
> > > > However, somewhere down the line I still received the exception
> > > > "NoClassDefFoundError:
> > > > org/apache/hadoop/hbase/client/HTableInterface", and I only use
> > > > HTableInterface in the reducer.
> > > >
> > > > So any idea please? Also, please let me know if there is a 
> > > > better way to do it.
> > > >
> > > > And here are the system properties I found in the log, which
> > > > clearly show hbase.jar is in the classpath.
> > > >
> > > >
> > > > #Tue Feb 12 00:22:20 EST 2013
> > > > java.runtime.name=Java(TM) SE Runtime Environment
> > > >
> > > > oozie.action.externalChildIDs.properties=/var/lib/hadoop-yarn/ca
> > > > ch
> > > > e/
> > > > ya
> > > > rn/nm-local-dir/usercache/cloudera/appcache/application_13606418
> > > > 30
> > > > 61
> > > > 8_
> > > > 0011/container_1360641830618_0011_01_000002/externalChildIds.pro
> > > > pe
> > > > rt
> > > > ie
> > > > s
> > > > sun.boot.library.path=/usr/java/jdk1.6.0_31/jre/lib/amd64
> > > > java.vm.version=20.6-b01
> > > > hadoop.root.logger=INFO,CLA
> > > > java.vm.vendor=Sun Microsystems Inc.
> > > > java.vendor.url=http\://java.sun.com/
> > > > path.separator=\:
> > > > java.vm.name=Java HotSpot(TM) 64-Bit Server VM 
> > > > file.encoding.pkg=sun.io sun.java.launcher=SUN_STANDARD 
> > > > user.country=US sun.os.patch.level=unknown 
> > > > java.vm.specification.name=Java Virtual Machine Specification
> > > >
> > > > user.dir=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/
> > > > cl
> > > > ou
> > > > de
> > > > ra/appcache/application_1360641830618_0011/container_13606418306
> > > > 18
> > > > _0
> > > > 01
> > > > 1_01_000002
> > > > java.runtime.version=1.6.0_31-b04 
> > > > java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment
> > > > java.endorsed.dirs=/usr/java/jdk1.6.0_31/jre/lib/endorsed
> > > > os.arch=amd64
> > > > oozie.job.id=0000015-130210101321239-oozie-oozi-W
> > > > oozie.action.id=0000015-130210101321239-oozie-oozi-W@java-aggreg
> > > > at
> > > > or
> > > >
> > > > java.io.tmpdir=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/user
> > > > ca
> > > > ch
> > > > e/
> > > > cloudera/appcache/application_1360641830618_0011/container_13606
> > > > 41
> > > > 83
> > > > 06
> > > > 18_0011_01_000002/tmp
> > > > line.separator=\n
> > > >
> > > > oozie.action.output.properties=/var/lib/hadoop-yarn/cache/yarn/n
> > > > m-
> > > > lo
> > > > ca
> > > > l-dir/usercache/cloudera/appcache/application_1360641830618_0011
> > > > /c on ta iner_1360641830618_0011_01_000002/output.properties
> > > > java.vm.specification.vendor=Sun Microsystems Inc.
> > > > os.name=Linux
> > > > log4j.configuration=container-log4j.properties
> > > > sun.jnu.encoding=ANSI_X3.4-1968
> > > >
> > > > java.library.path=/usr/java/jdk1.6.0_31/jre/lib/amd64/server\:/u
> > > > sr
> > > > /j
> > > > av
> > > > a/jdk1.6.0_31/jre/lib/amd64\:/usr/java/jdk1.6.0_31/jre/../lib/am
> > > > d6
> > > > 4\
> > > > :/
> > > > var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/a
> > > > pp
> > > > ca
> > > > ch
> > > > e/application_1360641830618_0011/container_1360641830618_0011_01
> > > > _0
> > > > 00
> > > > 00
> > > > 2\:/usr/lib/hadoop/lib/native\:/usr/java/packages/lib/amd64\:/us
> > > > r/
> > > > li
> > > > b6
> > > > 4\:/lib64\:/lib\:/usr/lib
> > > > yarn.app.mapreduce.container.log.filesize=0
> > > >
> > > > oozie.action.conf.xml=/var/lib/hadoop-yarn/cache/yarn/nm-local-d
> > > > ir
> > > > /u
> > > > se
> > > > rcache/cloudera/appcache/application_1360641830618_0011/containe
> > > > r_
> > > > 13
> > > > 60
> > > > 641830618_0011_01_000002/action.xml
> > > > hadoop.metrics.log.level=WARN
> > > > java.specification.name=Java Platform API Specification
> > > > java.class.version=50.0
> > > > java.net.preferIPv4Stack=true
> > > > sun.management.compiler=HotSpot 64-Bit Tiered Compilers
> > > > os.version=2.6.32-220.23.1.el6.x86_64
> > > >
> > > > yarn.app.mapreduce.container.log.dir=/var/log/hadoop-yarn/contai
> > > > ne
> > > > rs
> > > > /a
> > > > pplication_1360641830618_0011/container_1360641830618_0011_01_00
> > > > 00
> > > > 02
> > > > user.home=/var/lib/hadoop-yarn
> > > > user.timezone=America/New_York
> > > > java.awt.printerjob=sun.print.PSPrinterJob
> > > > file.encoding=ANSI_X3.4-1968
> > > > java.specification.version=1.6
> > > >
> > > > java.class.path=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/use
> > > > rc
> > > > ac
> > > > he
> > > > /cloudera/appcache/application_1360641830618_0011/container_1360
> > > > 64
> > > > 18
> > > > 30
> > > > 618_0011_01_000002\:/etc/hadoop/conf\:/usr/lib/hadoop/hadoop-ann
> > > > ot
> > > > at
> > > > io
> > > > ns-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop/hadoop-auth.jar\:/usr/lib
> > > > /h
> > > > ad
> > > > oo
> > > > p/hadoop-common-2.0.0-cdh4.1.1-tests.jar\:/usr/lib/hadoop/hadoop
> > > > -a
> > > > ut
> > > > h-
> > > >
> 2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop/hadoop-common-2.0.0-cdh4.1.1.jar\:
> > > >
> /usr/lib/hadoop/hadoop-annotations.jar\:/usr/lib/hadoop/hadoop-common.
> > > > jar\:/usr/lib/hadoop/lib/log4j-1.2.17.jar\:/usr/lib/hadoop/lib/c
> > > > om
> > > > mo
> > > > ns
> > > > -logging-1.1.1.jar\:/usr/lib/hadoop/lib/servlet-api-2.5.jar\:/us
> > > > r/
> > > > li
> > > > b/
> > > > hadoop/lib/jline-0.9.94.jar\:/usr/lib/hadoop/lib/commons-configu
> > > > ra
> > > > ti
> > > > on
> > > > -1.6.jar\:/usr/lib/hadoop/lib/jersey-json-1.8.jar\:/usr/lib/hado
> > > > op
> > > > /l
> > > > ib
> > > > /commons-httpclient-3.1.jar\:/usr/lib/hadoop/lib/jaxb-api-2.2.2.
> > > > ja
> > > > r\
> > > > :/
> > > > usr/lib/hadoop/lib/jetty-6.1.26.cloudera.2.jar\:/usr/lib/hadoop/
> > > > li
> > > > b/
> > > > zo
> > > > okeeper-3.4.3-cdh4.1.1.jar\:/usr/lib/hadoop/lib/commons-cli-1.2.
> > > > ja
> > > > r\
> > > > :/
> > > > usr/lib/hadoop/lib/jersey-server-1.8.jar\:/usr/lib/hadoop/lib/hu
> > > > e-
> > > > pl
> > > > ug
> > > > ins-2.1.0-cdh4.1.1.jar\:/usr/lib/hadoop/lib/commons-el-1.0.jar\:
> > > > /u
> > > > sr
> > > > /l
> > > > ib/hadoop/lib/jackson-core-asl-1.8.8.jar\:/usr/lib/hadoop/lib/js
> > > > p-
> > > > ap
> > > > i-
> > > > 2.1.jar\:/usr/lib/hadoop/lib/commons-net-3.1.jar\:/usr/lib/hadoo
> > > > p/
> > > > li
> > > > b/
> > > > jasper-compiler-5.5.23.jar\:/usr/lib/hadoop/lib/jettison-1.1.jar\:
> > > > /u
> > > > sr
> > > > /lib/hadoop/lib/jsr305-1.3.9.jar\:/usr/lib/hadoop/lib/paranamer-2.3.
> > > > ja
> > > > r\:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar\:/usr/lib/hadoop/li
> > > > b/
> > > > co
> > > > mm
> > > > ons-beanutils-1.7.0.jar\:/usr/lib/hadoop/lib/stax-api-1.0.1.jar\
> > > > :/
> > > > us
> > > > r/
> > > >
> lib/hadoop/lib/jets3t-0.6.1.jar\:/usr/lib/hadoop/lib/jasper-runtime-5.
> > > > 5.23.jar\:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar\:/usr/lib/ha
> > > > do
> > > > op
> > > > /l
> > > > ib/protobuf-java-2.4.0a.jar\:/usr/lib/hadoop/lib/jackson-xc-1.8.8.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop/lib/commons-digester-1.8.jar\:/usr/lib/hadoop/l
> > > > ib
> > > > /j
> > > > et
> > > > ty-util-6.1.26.cloudera.2.jar\:/usr/lib/hadoop/lib/activation-1.1.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar\:/usr/lib/hadoop/li
> > > > b/
> > > > sl
> > > > f4
> > > > j-api-1.6.1.jar\:/usr/lib/hadoop/lib/avro-1.7.1.cloudera.2.jar\:
> > > > /u
> > > > sr
> > > > /l
> > > > ib/hadoop/lib/commons-beanutils-core-1.8.0.jar\:/usr/lib/hadoop/
> > > > li
> > > > b/
> > > > ja
> > > > ckson-mapper-asl-1.8.8.jar\:/usr/lib/hadoop/lib/jsch-0.1.42.jar\
> > > > :/
> > > > us
> > > > r/
> > > >
> lib/hadoop/lib/kfs-0.3.jar\:/usr/lib/hadoop/lib/commons-math-2.1.jar\:
> > > > /usr/lib/hadoop/lib/xmlenc-0.52.jar\:/usr/lib/hadoop/lib/commons
> > > > -c
> > > > ol
> > > > le
> > > > ctions-3.2.1.jar\:/usr/lib/hadoop/lib/guava-11.0.2.jar\:/usr/lib
> > > > /h
> > > > ad
> > > > oo
> > > > p/lib/commons-io-2.1.jar\:/usr/lib/hadoop/lib/junit-4.8.2.jar\:/
> > > > us
> > > > r/
> > > > li
> > > > b/hadoop/lib/jackson-jaxrs-1.8.8.jar\:/usr/lib/hadoop/lib/asm-3.2.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop/lib/commons-codec-1.4.jar\:/usr/lib/hadoop/lib/
> > > > co
> > > > mm
> > > > on
> > > > s-lang-2.5.jar\:/usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar\:/us
> > > > r/
> > > > li
> > > > b/
> > > >
> hadoop/lib/jersey-core-1.8.jar\:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar\:
> > > > /usr/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.1.1-tests.jar\:/usr/
> > > > li
> > > > b/
> > > > ha
> > > > doop-hdfs/hadoop-hdfs-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-hdfs/l
> > > > ib
> > > > /l
> > > > og
> > > > 4j-1.2.17.jar\:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.1.ja
> > > > r\
> > > > :/
> > > > us
> > > > r/lib/hadoop-hdfs/lib/servlet-api-2.5.jar\:/usr/lib/hadoop-hdfs/
> > > > li
> > > > b/
> > > > jl
> > > >
> ine-0.9.94.jar\:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.2.jar\:
> > > > /usr/lib/hadoop-hdfs/lib/zookeeper-3.4.3-cdh4.1.1.jar\:/usr/lib/
> > > > ha
> > > > do
> > > > op
> > > > -hdfs/lib/commons-daemon-1.0.3.jar\:/usr/lib/hadoop-hdfs/lib/com
> > > > mo
> > > > ns
> > > > -c
> > > > li-1.2.jar\:/usr/lib/hadoop-hdfs/lib/jersey-server-1.8.jar\:/usr
> > > > /l
> > > > ib
> > > > /h
> > > > adoop-hdfs/lib/commons-el-1.0.jar\:/usr/lib/hadoop-hdfs/lib/jack
> > > > so
> > > > n-
> > > > co
> > > > re-asl-1.8.8.jar\:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar\:/usr
> > > > /l
> > > > ib
> > > > /h
> > > > adoop-hdfs/lib/jsr305-1.3.9.jar\:/usr/lib/hadoop-hdfs/lib/jasper
> > > > -r
> > > > un
> > > > ti
> > > > me-5.5.23.jar\:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.4.0a.jar\:
> > > > /u
> > > > sr
> > > > /lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.2.jar\:/usr/lib/
> > > > ha
> > > > do
> > > > op
> > > > -hdfs/lib/jackson-mapper-asl-1.8.8.jar\:/usr/lib/hadoop-hdfs/lib
> > > > /x
> > > > ml
> > > > en
> > > > c-0.52.jar\:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar\:/usr/lib/
> > > > ha
> > > > do
> > > > op
> > > > -hdfs/lib/commons-io-2.1.jar\:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar\:
> > > > /u
> > > > sr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar\:/usr/lib/hadoop-hd
> > > > fs
> > > > /l
> > > > ib
> > > > /commons-lang-2.5.jar\:/usr/lib/hadoop-hdfs/lib/jersey-core-1.8.
> > > > ja
> > > > r\
> > > > :/
> > > > usr/lib/hadoop-mapreduce/hadoop-extras-2.0.0-cdh4.1.1.jar\:/usr/
> > > > li
> > > > b/
> > > > ha
> > > > doop-mapreduce/hadoop-distcp-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop
> > > > -m
> > > > ap
> > > > re
> > > > duce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.1.1-tests.jar\
> > > > :/
> > > > us
> > > > r/
> > > > lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar\:/usr/lib/had
> > > > oo
> > > > p-
> > > > ma
> > > > preduce/hadoop-streaming-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-map
> > > > re
> > > > du
> > > > ce
> > > > /hadoop-mapreduce-client-app.jar\:/usr/lib/hadoop-mapreduce/hado
> > > > op
> > > > -d
> > > > at
> > > > ajoin-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapreduce/hadoop-mapre
> > > > du
> > > > ce
> > > > -c
> > > > lient-shuffle.jar\:/usr/lib/hadoop-mapreduce/hadoop-extras.jar\:
> > > > /u
> > > > sr
> > > > /l
> > > > ib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4
> > > > .1
> > > > .1
> > > > .j
> > > > ar\:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar\:/usr/lib/hadoop
> > > > -m
> > > > ap
> > > > re
> > > > duce/hadoop-mapreduce-client-shuffle-2.0.0-cdh4.1.1.jar\:/usr/li
> > > > b/
> > > > ha
> > > > do
> > > > op-mapreduce/hadoop-rumen.jar\:/usr/lib/hadoop-mapreduce/hadoop-
> > > > ma
> > > > pr
> > > > ed
> > > > uce-examples-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapreduce/hadoo
> > > > p-
> > > > ar
> > > > ch
> > > > ives.jar\:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.0.0-cdh4.1.1.
> > > > ja
> > > > r\
> > > > :/
> > > > usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar\:/usr/
> > > > li
> > > > b/
> > > > ha
> > > > doop-mapreduce/hadoop-gridmix.jar\:/usr/lib/hadoop-mapreduce/had
> > > > oo
> > > > p-
> > > > ma
> > > > preduce-client-app-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapreduce
> > > > /h
> > > > ad
> > > > oo
> > > > p-mapreduce-client-common.jar\:/usr/lib/hadoop-mapreduce/hadoop-
> > > > ar
> > > > ch
> > > > iv
> > > >
> es-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar\:
> > > > /usr/lib/hadoop-mapreduce/hadoop-streaming.jar\:/usr/lib/hadoop-
> > > > ma
> > > > pr
> > > > ed
> > > > uce/hadoop-mapreduce-client-common-2.0.0-cdh4.1.1.jar\:/usr/lib/
> > > > ha
> > > > do
> > > > op
> > > > -mapreduce/hadoop-gridmix-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-ma
> > > > pr
> > > > ed
> > > > uc
> > > > e/hadoop-mapreduce-client-core-2.0.0-cdh4.1.1.jar\:/usr/lib/hado
> > > > op
> > > > -m
> > > > ap
> > > > reduce/hadoop-mapreduce-client-hs-2.0.0-cdh4.1.1.jar\:/usr/lib/h
> > > > ad
> > > > oo
> > > > p-
> > > > mapreduce/hadoop-mapreduce-client-hs.jar\:/usr/lib/hadoop-mapred
> > > > uc
> > > > e/
> > > > ha
> > > > doop-mapreduce-client-jobclient.jar\:/usr/lib/hadoop-mapreduce/l
> > > > ib
> > > > /l
> > > > og
> > > > 4j-1.2.17.jar\:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar\
> > > > :/
> > > > us
> > > > r/
> > > > lib/hadoop-mapreduce/lib/netty-3.2.4.Final.jar\:/usr/lib/hadoop-
> > > > ma
> > > > pr
> > > > ed
> > > > uce/lib/jersey-server-1.8.jar\:/usr/lib/hadoop-mapreduce/lib/jac
> > > > ks
> > > > on
> > > > -c
> > > > ore-asl-1.8.8.jar\:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar\:
> > > > /u
> > > > sr/lib/hadoop-mapreduce/lib/jersey-guice-1.8.jar\:/usr/lib/hadoo
> > > > p-
> > > > ma
> > > > pr
> > > > educe/lib/protobuf-java-2.4.0a.jar\:/usr/lib/hadoop-mapreduce/li
> > > > b/
> > > > gu
> > > > ic
> > > >
> e-servlet-3.0.jar\:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar\:
> > > > /usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar\:/usr/lib/
> > > > ha
> > > > do
> > > > op
> > > > -mapreduce/lib/avro-1.7.1.cloudera.2.jar\:/usr/lib/hadoop-mapred
> > > > uc
> > > > e/
> > > > li
> > > >
> b/guice-3.0.jar\:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.
> > > > 8.jar\:/usr/lib/hadoop-mapreduce/lib/commons-io-2.1.jar\:/usr/li
> > > > b/
> > > > ha
> > > > do
> > > > op-mapreduce/lib/asm-3.2.jar\:/usr/lib/hadoop-mapreduce/lib/jers
> > > > ey
> > > > -c
> > > > or
> > > >
> e-1.8.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.0.0-cdh4.1.
> > > >
> 1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.0.0-cdh4.
> > > >
> 1.1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar\:
> > > > /usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar\:
> > > > /u
> > > > sr/lib/hadoop-yarn/hadoop-yarn-api.jar\:/usr/lib/hadoop-yarn/had
> > > > oo
> > > > p-
> > > > ya
> > > > rn-api-2.0.0-cdh4.1.1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-serv
> > > > er
> > > > -t
> > > > es
> > > > ts-2.0.0-cdh4.1.1-tests.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-si
> > > > te
> > > > -2
> > > > .0
> > > > .0-cdh4.1.1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-pro
> > > > xy
> > > > -2
> > > > .0
> > > > .0-cdh4.1.1.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodeman
> > > > ag
> > > > er
> > > > .j
> > > > ar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.0.0-cdh4.1.1.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar\:/usr/lib
> > > > /h
> > > > ad
> > > > oo
> > > > p-yarn/hadoop-yarn-applications-distributedshell-2.0.0-cdh4.1.1.
> > > > ja
> > > > r\
> > > > :/
> > > >
> usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.0.0-cdh4.1.1.
> > > > jar\:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.0.0-cdh4.1.1.jar\
> > > > :/
> > > > us
> > > > r/
> > > > lib/hadoop-yarn/hadoop-yarn-site.jar\:/usr/lib/hadoop-yarn/hadoo
> > > > p-
> > > > ya
> > > > rn
> > > > -server-tests.jar\:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.
> > > > ja
> > > > r\
> > > > :/usr/lib/hadoop-yarn/hadoop-yarn-common.jar\:/usr/lib/hadoop-ya
> > > > rn
> > > > /l
> > > > ib
> > > > /log4j-1.2.17.jar\:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar\:
> > > > /u
> > > > sr/lib/hadoop-yarn/lib/netty-3.2.4.Final.jar\:
> > > > /usr/lib/hadoop-yarn/lib/jersey-server-1.8.jar\:
> > > > /usr/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.jar\:
> > > > /usr/lib/hadoop-yarn/lib/paranamer-2.3.jar\:
> > > > /usr/lib/hadoop-yarn/lib/jersey-guice-1.8.jar\:
> > > > /usr/lib/hadoop-yarn/lib/protobuf-java-2.4.0a.jar\:
> > > > /usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar\:
> > > > /usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar\:
> > > > /usr/lib/hadoop-yarn/lib/snappy-java-1.0.4.1.jar\:
> > > > /usr/lib/hadoop-yarn/lib/avro-1.7.1.cloudera.2.jar\:
> > > > /usr/lib/hadoop-yarn/lib/guice-3.0.jar\:
> > > > /usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar\:
> > > > /usr/lib/hadoop-yarn/lib/commons-io-2.1.jar\:
> > > > /usr/lib/hadoop-yarn/lib/asm-3.2.jar\:
> > > > /usr/lib/hadoop-yarn/lib/jersey-core-1.8.jar\:
> > > > job.jar/\:job.jar/classes/\:job.jar/lib/*\:
> > > > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0011/container_1360641830618_0011_01_000002/zookeeper.jar\:
> > > > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0011/container_1360641830618_0011_01_000002/json-simple-1.1.jar\:
> > > > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0011/container_1360641830618_0011_01_000002/java-launcher.jar\:
> > > > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0011/container_1360641830618_0011_01_000002/cr-1.0-SNAPSHOT.jar\:
> > > > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0011/container_1360641830618_0011_01_000002/hbase.jar
> > > > user.name=yarn
> > > >
> > > > oozie.action.newId.properties=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0011/container_1360641830618_0011_01_000002/newId.properties
> > > > java.vm.specification.version=1.0 
> > > > sun.java.command=org.apache.hadoop.mapred.YarnChild 192.168.77.82 37020 attempt_1360641830618_0011_m_000000_0 2
> > > > java.home=/usr/java/jdk1.6.0_31/jre
> > > > sun.arch.data.model=64
> > > > user.language=en
> > > > java.specification.vendor=Sun Microsystems Inc.
> > > > java.vm.info=mixed mode
> > > > java.version=1.6.0_31
> > > > java.ext.dirs=/usr/java/jdk1.6.0_31/jre/lib/ext\:/usr/java/packages/lib/ext
> > > >
> > > > sun.boot.class.path=/usr/java/jdk1.6.0_31/jre/lib/resources.jar\:/usr/java/jdk1.6.0_31/jre/lib/rt.jar\:/usr/java/jdk1.6.0_31/jre/lib/sunrsasign.jar\:/usr/java/jdk1.6.0_31/jre/lib/jsse.jar\:/usr/java/jdk1.6.0_31/jre/lib/jce.jar\:/usr/java/jdk1.6.0_31/jre/lib/charsets.jar\:/usr/java/jdk1.6.0_31/jre/lib/modules/jdk.boot.jar\:/usr/java/jdk1.6.0_31/jre/classes
> > > > java.vendor=Sun Microsystems Inc.
> > > > file.separator=/
> > > > oozie.launcher.job.id=job_1360641830618_0011
> > > >
> > > > oozie.action.stats.properties=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/cloudera/appcache/application_1360641830618_0011/container_1360641830618_0011_01_000002/stats.properties
> > > > java.vendor.url.bug=http\://java.sun.com/cgi-bin/bugreport.cgi
> > > > sun.io.unicode.encoding=UnicodeLittle
> > > > sun.cpu.endian=little
> > > > sun.cpu.isalist=
> > > >
> > > > And here is the stacktrace:
> > > >
> > > > 2013-02-12 00:59:15,743 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
> > > > 2013-02-12 00:59:15,743 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MRAppMaster metrics system started
> > > > 2013-02-12 00:59:15,762 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Adding job token for job_1360641830618_0015 to jobTokenSecretManager
> > > > 2013-02-12 00:59:15,967 FATAL [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
> > > > java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/client/HTableInterface
> > > >                 at java.lang.Class.forName0(Native Method)
> > > >                 at java.lang.Class.forName(Class.java:169)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.isChainJob(JobImpl.java:918)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.makeUberDecision(JobImpl.java:849)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.access$2600(JobImpl.java:114)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl$InitTransition.transition(JobImpl.java:995)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl$InitTransition.transition(JobImpl.java:938)
> > > >                 at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:380)
> > > >                 at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:298)
> > > >                 at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:43)
> > > >                 at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:443)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.handle(JobImpl.java:666)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.handle(JobImpl.java:113)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobEventDispatcher.handle(MRAppMaster.java:890)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.start(MRAppMaster.java:840)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.run(MRAppMaster.java:1050)
> > > >                 at java.security.AccessController.doPrivileged(Native Method)
> > > >                 at javax.security.auth.Subject.doAs(Subject.java:396)
> > > >                 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1046)
> > > >                 at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1010)
> > > > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.HTableInterface
> > > >                 at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > >                 at java.security.AccessController.doPrivileged(Native Method)
> > > >                 at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > >                 at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > >                 at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > >                 at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > >                 ... 21 more
> > > > 2013-02-12 00:59:15,972 INFO [Thread-1] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: MRAppMaster received a signal. Signaling RMCommunicator and JobHistoryEventHandler.
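
The zero-byte filecache entries seen in the `ls -l` output earlier in this thread are easy to scan for in bulk. A minimal sketch, assuming a NodeManager filecache layout like the one in the classpath above (the helper name `find_zero_byte_jars` and the scan approach are illustrative, not part of Oozie or YARN):

```python
import os

def find_zero_byte_jars(root):
    """Walk a directory tree and return paths of *.jar files whose size
    is 0 bytes. os.path.getsize follows symlinks, so a link pointing at
    an empty localized file is reported too."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(".jar"):
                path = os.path.join(dirpath, name)
                if os.path.getsize(path) == 0:
                    hits.append(path)
    return hits

# Hypothetical usage against the cache path from this thread:
# find_zero_byte_jars("/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache")
```

Any hit here would explain the empty symlink targets in the container directory.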