Posted to user@kylin.apache.org by Veli Kerim Celik <vk...@gmail.com> on 2015/12/07 13:54:19 UTC

Kylin does not start correctly (details and logs included)

Hello

I have downloaded Hortonworks HDP Sandbox version 2.2.4.2 for VirtualBox
(filename "Sandbox_HDP_2.2.4.2_VirtualBox.ova") and imported it into
VirtualBox.

I have assigned 4 CPU cores and 12 gigabytes of RAM to the virtual machine.

After booting it up I log in to Ambari at http://localhost:8080/ (from the
host machine) and start HBase. HBase starts up without any problems.

I then ssh into the virtual machine using the following command: "ssh -L
7070:localhost:7070 root@127.0.0.1 -p 2222"

I then download the Kylin binary release from "
https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz"
and extract it into the directory /root/bin.
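
For reference, the download and extraction look roughly like this (a sketch
of what I ran; the exact commands may have differed slightly):

mkdir -p /root/bin
cd /root/bin
# fetch the binary release from the Apache dist server
wget https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz
# unpack it in place, producing /root/bin/apache-kylin-1.1.1-incubating-bin
tar -xzf apache-kylin-1.1.1-incubating-bin.tar.gz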

I then change .bash_profile so it looks like this:
############################ /root/.bash_profile
###############################
# .bash_profile

# Get the aliases and functions
if [ -f ~/.bashrc ]; then
        . ~/.bashrc
fi

# User specific environment and startup programs

KYLIN_HOME=$HOME/bin/apache-kylin-1.1.1-incubating-bin
export KYLIN_HOME

PATH=$PATH:$HOME/bin:$KYLIN_HOME/bin

export PATH
##############################################################################
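
After editing I start a new login shell (or source the file) so the settings
take effect, and verify them roughly like this (a small sketch; it only
echoes the variables set above):

source ~/.bash_profile
# confirm the variables point at the extracted Kylin directory
echo $KYLIN_HOME
which kylin.sh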

I then start Kylin using the command "kylin.sh start". I then try to access
Kylin through http://localhost:7070/kylin (from the host machine) and get a
blank page (i.e. not a 404).
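
To rule out the SSH tunnel as the cause, I can also check the page from
inside the VM itself, roughly like this (a small sketch, assuming curl is
available in the sandbox; it just prints the HTTP status code):

# query the Kylin web app directly on the sandbox, bypassing the tunnel
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:7070/kylin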

I get the following output from "kylin.sh start" and from the Tomcat log:

########################## kylin.sh start output
###################################
[root@sandbox ~]# kylin.sh start
KYLIN_HOME is set to /root/bin/apache-kylin-1.1.1-incubating-bin
15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name hive.heapsize does
not exist
15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name
hive.server2.enable.impersonation does not exist

Logging initialized using configuration in
file:/etc/hive/conf/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive dependency:
/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2
.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar
hbase dependency: /usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar
KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
KYLIN_DEBUG_SETTINGS is not set, will not enable remote debuging
KYLIN_LD_LIBRARY_SETTINGS is not set, lzo compression at MR and hbase might
not work
A new Kylin instance is started by root, stop it using "kylin.sh stop"
Please visit http://<your_sandbox_ip>:7070/kylin to play with the cubes!
(Useranme: ADMIN, Password: KYLIN)
You can check the log at
/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/logs/kylin.log"
##############################################################################

################################ tomcat/logs/kylin.log
###########################
usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ] [
-nonaming ]  { -help | start | stop }
Dec 07, 2015 11:10:49 AM org.apache.catalina.core.AprLifecycleListener
lifecycleEvent
INFO: The APR based Apache Tomcat Native library which allows optimal
performance in production environments was not found on the
java.library.path:
:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-bio-7070"]
Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 847 ms
Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardService
startInternal
INFO: Starting service Catalina
Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardEngine
startInternal
INFO: Starting Servlet Engine: Apache Tomcat/7.0.59
Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive
/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war
Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner scan
WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar] from
classloader hierarchy
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at
org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
at
org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
at
org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
at
org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
at
org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner scan
WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
from classloader hierarchy
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at
org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
at
org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
at
org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
at
org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
at
org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner scan
WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
from classloader hierarchy
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at
org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
at
org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
at
org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
at
org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
at
org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner scan
WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar] from
classloader hierarchy
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at
org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
at
org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
at
org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
at
org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
at
org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:10:53 AM org.apache.catalina.startup.ContextConfig
processAnnotationsJar
SEVERE: contextConfig.jarFile
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at
org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
at
org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
at
org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
at
org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
at
org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
at
org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:10:56 AM org.apache.catalina.startup.ContextConfig
processAnnotationsJar
SEVERE: contextConfig.jarFile
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at
org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
at
org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
at
org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
at
org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
at
org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
at
org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:11:00 AM org.apache.catalina.startup.ContextConfig
processAnnotationsJar
SEVERE: contextConfig.jarFile
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at
org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
at
org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
at
org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
at
org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
at
org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
at
org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
processResourceJARs
SEVERE: Failed to processes JAR found at URL
[jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for static resources
to be included in context with name
[jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/]
Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
processResourceJARs
SEVERE: Failed to processes JAR found at URL
[jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for static resources
to be included in context with name
[jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/]
Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
processResourceJARs
SEVERE: Failed to processes JAR found at URL
[jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for static resources to
be included in context with name
[jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/]
Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig tldScanJar
WARNING: Failed to process JAR
[jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for TLD files
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
at
org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig tldScanJar
WARNING: Failed to process JAR
[jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
at
org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig tldScanJar
WARNING: Failed to process JAR
[jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
at
org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig tldScanJar
WARNING: Failed to process JAR
[jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for TLD files
java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:215)
at java.util.zip.ZipFile.<init>(ZipFile.java:145)
at java.util.jar.JarFile.<init>(JarFile.java:154)
at java.util.jar.JarFile.<init>(JarFile.java:91)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
at
org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at
org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
[localhost-startStop-1]:[2015-12-07
11:11:04,251][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
- KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
[localhost-startStop-1]:[2015-12-07
11:11:04,297][INFO][org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:177)]
- Loading properties file from resource loaded through InputStream
[localhost-startStop-1]:[2015-12-07
11:11:04,430][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
- KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
Client environment:zookeeper.version=3.4.6-2--1, built on 03/31/2015 19:31
GMT
2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
Client environment:host.name=sandbox.hortonworks.com
2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
Client environment:java.version=1.7.0_79
2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
Client environment:java.vendor=Oracle Corporation
2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
Client
environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
Client
environment:java.class.path=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/bootstrap.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/tomcat-juli.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-jdbc.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-tribes.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/annotations-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jsp-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-coyote.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper-el.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat7-websocket.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-fr.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/el-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-dbcp.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ha.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/ecj-4.4.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-es.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-util.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/servlet-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-ja.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ant.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/websocket-api.jar::/usr/hdp/2.2.4.2-2/hbase/conf:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/lib/tools.jar:/usr/hdp/2.2.4.2-2/hbase:/usr/hdp/2.2.4.2-2/hbase/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/asm-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-codec-1.7.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/
lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift.jar:/usr/hdp/2.2.4.2-2/hbase/lib/high-scale-lib-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpcore-4.1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-core-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-json-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-server-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jettison-1.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-2.1-6.1.14.jar:/u
sr/hdp/2.2.4.2-2/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/netty-3.6.6.Final.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/phoenix-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-hbase-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/slf4j-api-1.6.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/zookeeper.jar:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace
-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/./:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-
cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/
lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduc
e/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-job
client-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//joda-time-2.7.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//aws-java-sdk-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.2.4.2-2/h
adoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpcore-4.2.5.jar::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/hadoop-mapreduce-client/jsp-api-2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/activation-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/junit-4.11.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-io-2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang3-3.3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-recipes-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-server-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jsr305-1.3.9.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-framework-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack.jar:/usr/hdp/current/hadoo
p-mapreduce-client/hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-httpclient-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-net-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/current/hadoop-mapreduce-client/jsch-0.1.42.jar:/usr/hdp/current/hadoop-mapreduce-client/xz-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-el-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/servlet-api-2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-runtime-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-json-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jettison-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix.jar:/usr/hdp/current/hadoop-mapreduce-client/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/paranamer-2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-core-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-mapreduce-client/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app.jar:/usr/hdp/current/hadoop-mapreduce-client/htrace-core-3.0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/httpclient-4.2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-compiler-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/joda-time-2.7.jar:/usr/hdp/current/hadoop-mapreduce-client/avro-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-mapreduce-client/xmlenc-0.52.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-digester-1.8.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-cli-1.2.jar:/usr/hdp/current/hadoop-mapreduce-client/aws-java-sdk-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/gson-2.2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-client-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-collections-3.2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-mapreduce-client/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-d
atajoin.jar:/usr/hdp/current/hadoop-mapreduce-client/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen.jar:/usr/hdp/current/hadoop-mapreduce-client/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-archives.jar:/usr/hdp/current/hadoop-mapreduce-client/asm-3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-codec-1.4.jar:/usr/hdp/current/hadoop-mapreduce-client/log4j-1.2.17.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang-2.6.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar:/usr/hdp/current/hadoop-mapreduce-client/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/guava-11.0.2.jar:/usr/hdp/current/hadoop-mapreduce-client/httpcore-4.2.5.jar:/usr/hdp/current/tez-client/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-examples-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-io-2.4.jar:/usr/hdp/current/tez-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections4-4.0.jar:/usr/hdp/current/tez-client/lib/servlet-api-2.5.jar:/usr/hdp/current/tez-client/lib/jsr305-2.0.3.jar:/usr/hdp/current/tez-client/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/jettison-1.3.4.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-cli-1.2.jar:/usr/hdp/current/tez-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/tez-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections-3.2.1.jar:/usr/hdp/current/tez-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/tez-client/lib/commons-codec-1.4.jar:/usr/hdp/current/tez-client/lib/log4j-1.2.17.jar:/usr/hdp/current/tez-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/tez-client/lib/commons-lang-2.6.jar:/usr/hdp/current/tez-client/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/guava-11.0.2.jar:/etc/tez/conf/:/usr/hdp/2.2.4.2-2/tez/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-examples-0.5.2
.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections4-4.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/guava-11.0.2.jar:/etc/tez/conf:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/co
mmons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpcore-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jsoup-1.7.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-settings-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/nekoh
tml-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-profile-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/netty-3.7.0.Final.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-model-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-io-2.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-codec-1.6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpclient-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-project-2.2.1.jar:/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api
-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2
/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar:
2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.library.path=:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/temp
2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:os.name=Linux
2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:os.arch=amd64
2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:os.version=2.6.32-504.16.2.el6.x86_64
2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:user.name=root
2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:user.home=/root
2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:user.dir=/root
2015-12-07 11:11:04,924 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Initiating client connection, connectString=sandbox.hortonworks.com:2181 sessionTimeout=30000 watcher=hconnection-0x27eeefd2, quorum=sandbox.hortonworks.com:2181, baseZNode=/hbase-unsecure
2015-12-07 11:11:04,949 INFO  [localhost-startStop-1] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x27eeefd2 connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
2015-12-07 11:11:04,976 INFO  [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Opening socket connection to server sandbox.hortonworks.com/10.0.2.15:2181. Will not attempt to authenticate using SASL (unknown error)
2015-12-07 11:11:04,993 INFO  [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Socket connection established to sandbox.hortonworks.com/10.0.2.15:2181, initiating session
2015-12-07 11:11:05,000 INFO  [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Session establishment complete on server sandbox.hortonworks.com/10.0.2.15:2181, sessionid = 0x1517c12f0f0000b, negotiated timeout = 30000
2015-12-07 11:11:05,699 DEBUG [localhost-startStop-1] ipc.RpcClient: Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@784cfcf2, compressor=null, tcpKeepAlive=true, tcpNoDelay=true, minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind address=null
2015-12-07 11:11:05,840 DEBUG [localhost-startStop-1] ipc.RpcClient: Use SIMPLE authentication for service MasterService, sasl=false
2015-12-07 11:11:05,851 DEBUG [localhost-startStop-1] ipc.RpcClient: Connecting to sandbox.hortonworks.com/10.0.2.15:60000
[localhost-startStop-1]:[2015-12-07 11:11:05,880][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)] - Context initialization failed
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
BeanPostProcessor before instantiation of bean failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'org.springframework.cache.config.internalCacheAdvisor':
Cannot resolve reference to bean
'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
while setting bean property 'cacheOperationSource'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
BeanPostProcessor before instantiation of bean failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
resolve reference to bean
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
Cannot create inner bean '(inner bean)' of type
[org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
while setting constructor argument with key [0]; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
type
[org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot resolve reference to bean
'expressionHandler' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5016)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5524)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.cache.config.internalCacheAdvisor': Cannot resolve
reference to bean
'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
while setting bean property 'cacheOperationSource'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
BeanPostProcessor before instantiation of bean failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
resolve reference to bean
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
Cannot create inner bean '(inner bean)' of type
[org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
while setting constructor argument with key [0]; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
type
[org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot resolve reference to bean
'expressionHandler' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
at org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
at org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
... 23 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
BeanPostProcessor before instantiation of bean failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
resolve reference to bean
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
Cannot create inner bean '(inner bean)' of type
[org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
while setting constructor argument with key [0]; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
type
[org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot resolve reference to bean
'expressionHandler' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
at
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
... 40 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
resolve reference to bean
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
Cannot create inner bean '(inner bean)' of type
[org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
while setting constructor argument with key [0]; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
type
[org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot resolve reference to bean
'expressionHandler' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
at
org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
at
org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at
org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
at
org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
at
org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
at
org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
at
org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
... 45 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
Cannot create inner bean '(inner bean)' of type
[org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
while setting constructor argument with key [0]; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
type
[org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot resolve reference to bean
'expressionHandler' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:353)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:153)
at
org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
at
org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
... 64 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name '(inner bean)': Cannot create inner bean '(inner
bean)' of type
[org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot resolve reference to bean
'expressionHandler' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
at
org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
at
org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
... 78 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name '(inner bean)': Cannot resolve reference to bean
'expressionHandler' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
at
org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
at
org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
... 86 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
... 94 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'permissionEvaluator' defined in class path
resource [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService'
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
at
org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
at
org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
... 104 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:997)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:943)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at
org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
... 116 more
Caused by: org.springframework.beans.BeanInstantiationException: Could not
instantiate bean class [org.apache.kylin.rest.service.AclService]:
Constructor threw exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
at
org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
... 124 more
Caused by: org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at
org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at
org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
at
org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:230)
at
org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:244)
at
org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
at
org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3390)
at
org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:408)
at
org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:95)
at
org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:86)
at org.apache.kylin.rest.service.AclService.<init>(AclService.java:127)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
... 126 more
Caused by:
org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException):
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
at
com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
at
org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
at
org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at
org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1538)
at
org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1724)
at
org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1777)
at
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.getTableDescriptors(MasterProtos.java:42525)
at
org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$5.getTableDescriptors(ConnectionManager.java:2165)
at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:414)
at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:409)
at
org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
... 136 more
Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
startInternal
SEVERE: Error listenerStart
Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
startInternal
SEVERE: Context [/kylin] startup failed due to previous errors
Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
clearReferencesThreads
SEVERE: The web application [/kylin] appears to have started a thread named
[localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)] but has
failed to stop it. This is very likely to create a memory leak.
Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
clearReferencesThreads
SEVERE: The web application [/kylin] appears to have started a thread named
[localhost-startStop-1-EventThread] but has failed to stop it. This is very
likely to create a memory leak.
Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
clearReferencesThreads
SEVERE: The web application [/kylin] appears to have started a thread named
[Thread-6] but has failed to stop it. This is very likely to create a
memory leak.
Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
clearReferencesThreads
SEVERE: The web application [/kylin] appears to have started a thread named
[IPC Client (514096504) connection to
sandbox.hortonworks.com/10.0.2.15:60000 from root] but has failed to stop
it. This is very likely to create a memory leak.
Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deployment of web application archive
/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war has
finished in 15,925 ms
Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["http-bio-7070"]
Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["ajp-bio-9009"]
Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.Catalina start
INFO: Server startup in 15987 ms
##############################################################################

What am I missing?

Kind regards
Veli K. Celik

Re: Kylin does not start correctly (detailed and logs included)

Posted by Veli Kerim Celik <vk...@gmail.com>.
Hi Li,

Thanks for the reply. I don't know how or why, but the Kylin web page
suddenly shows up now. I did not change anything besides a restart of the
operating system, HBase and Kylin.

Thanks again for the reply

2015-12-10 9:46 GMT+01:00 Li Yang <li...@apache.org>:

> Actually the exceptions about ScanJar not being able to process ojdbc6.jar
> are not a problem at all. Just ignore them; they don't prevent Kylin from
> working properly.
>
> As to the GUI issue, could you look at the browser's dev console and find
> any failed requests? For Chrome and Firefox, press F12 to bring up the dev
> console, refresh the page, then look under the "Network" tab and see if any
> request shows an error.
>
> On Tue, Dec 8, 2015 at 4:38 PM, Veli Kerim Celik <vk...@gmail.com>
> wrote:
>
>> I resolved the insufficient-HBase-permissions problem by enabling
>> authorization (setting hbase.security.authorization to true) and adding
>> root as a superuser (appending ",root" to hbase.superuser).
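>>
>> (For reference, a quick way to confirm the change took effect on the
>> sandbox; the /etc/hbase/conf path is an assumption based on HDP defaults:)
>>
>> ############## check the two HBase settings (sketch) ############
>> # Run inside the VM after saving in Ambari and restarting HBase.
>> grep -A 1 -E 'hbase.security.authorization|hbase.superuser' /etc/hbase/conf/hbase-site.xml
>> # Expected, roughly:
>> #   <name>hbase.security.authorization</name>
>> #   <value>true</value>
>> #   <name>hbase.superuser</name>
>> #   <value>hbase,root</value>
>> ##################################################################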
>>
>> Now Kylin seems to start fine (i.e. no exceptions in
>> tomcat/logs/kylin.log), but I have run into another problem. The Kylin
>> webpage is found, but still blank. curl confirms this:
>>
>> ############## Kylin HTTP header ############
>> veli@cdev ~ $ curl -I http://localhost:7070/kylin
>> HTTP/1.1 302 Found
>> Server: Apache-Coyote/1.1
>> Location: http://localhost:7070/kylin/
>> Transfer-Encoding: chunked
>> Date: Tue, 08 Dec 2015 08:29:50 GMT
>> ###########################################
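>>
>> (The 302 is just Tomcat redirecting /kylin to the trailing-slash URL
>> /kylin/; a hedged way to check the page itself is to follow the redirect
>> and fetch the body:)
>>
>> ############## follow the redirect (sketch) ############
>> # -L follows the Location header; -I keeps it to HEAD requests
>> curl -I -L http://localhost:7070/kylin
>> # fetch the first lines of the served page, if any:
>> curl -s http://localhost:7070/kylin/ | head -n 20
>> ##########################################################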
>>
>> What could be the problem?
>>
>>
>> 2015-12-08 9:07 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>>
>>> And by the way I have run the script bin/check-env.sh. It did not give
>>> any errors.
>>>
>>> I also checked whether port forwarding works by running the following
>>> command on my host machine. It works fine.
>>>
>>> ############## check port forwarding ############
>>> veli@cdev ~ $ curl -I http://localhost:7070/kylin
>>> HTTP/1.1 404 Not Found
>>> Server: Apache-Coyote/1.1
>>> Content-Length: 0
>>> Date: Tue, 08 Dec 2015 07:48:19 GMT
>>> ############################################
>>>
>>> At the moment both hbase.security.authorization and dfs.permissions.enabled
>>> are set to false (in Ambari). But HBase still says "Insufficient
>>> permissions for user 'root (auth:SIMPLE)'..." in the log file
>>> (bin/tomcat/logs/kylin.log).
>>>
>>>
>>>
>>> 2015-12-07 17:53 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>>>
>>>> I furthermore tried to disable HBase authorization
>>>> (setting hbase.security.authorization to false) and restarted HBase and
>>>> Kylin. It did not get rid of the exception.
>>>>
>>>> 2015-12-07 17:28 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>>>>
>>>>> I am doing port forwarding through ssh (i.e. "ssh -L
>>>>> 7070:localhost:7070 root@127.0.0.1 -p 2222"). It seems to be working.
>>>>>
>>>>> I have downloaded the file ojdbc6.jar (from
>>>>> http://www.oracle.com/technetwork/database/enterprise-edition/jdbc-112010-090769.html)
>>>>> and put it at the path "/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar". The
>>>>> lines with java.io.FileNotFoundException are gone. Nice. Thanks.
>>>>>
>>>>> But now I am getting some new exceptions like:
>>>>>
>>>>> ########################## exception start
>>>>> #################################
>>>>> [localhost-startStop-1]:[2015-12-07
>>>>> 15:49:14,751][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>>>>> - Context initialization failed
>>>>> org.springframework.beans.factory.BeanCreationException: Error
>>>>> creating bean with name
>>>>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>>>>> Cannot resolve reference to bean
>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>> resolve reference to bean
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>> Cannot create inner bean '(inner bean)' of type
>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>> while setting constructor argument with key [0]; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>> type
>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>
>>>>> ########################################################################
>>>>>
>>>>> I disabled HDFS permissions (set dfs.permissions.enabled to false),
>>>>> and restarted HDFS and Kylin, but it did not get rid of the
>>>>> exception.
>>>>>
>>>>> Kind regards, Veli
>>>>>
>>>>>
>>>>> 2015-12-07 15:29 GMT+01:00 Sudeep Dey <sd...@zaloni.com>:
>>>>>
>>>>>> Hi Veli,
>>>>>>
>>>>>> You need to put the downloaded ojdbc6.jar into this location:
>>>>>> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar. Moreover, to access Kylin
>>>>>> from the host machine you need to set up port forwarding for port 7070.
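>>>>>>
>>>>>> (For instance, besides the ssh tunnel used earlier in this thread, a
>>>>>> VirtualBox NAT rule can forward the port; "<vm-name>" below is a
>>>>>> placeholder, and the VM should be powered off when running modifyvm:)
>>>>>>
>>>>>> ############## VirtualBox port forwarding (sketch) ############
>>>>>> # maps host port 7070 to guest port 7070 on the NAT adapter
>>>>>> VBoxManage modifyvm "<vm-name>" --natpf1 "kylin,tcp,,7070,,7070"
>>>>>> ################################################################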
>>>>>>
>>>>>> Regards
>>>>>>
>>>>>> Sudeep
>>>>>>
>>>>>> On Mon, Dec 7, 2015 at 7:54 AM, Veli Kerim Celik <vk...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hello
>>>>>>>
>>>>>>> I have downloaded Hortonworks HDP Sandbox version 2.2.4.2 for
>>>>>>> VirtualBox (filename "Sandbox_HDP_2.2.4.2_VirtualBox.ova") and imported it
>>>>>>> into VirtualBox.
>>>>>>>
>>>>>>> I have assinged 4 CPU cores and 12 gigabyte of RAM to the virtual
>>>>>>> machine.
>>>>>>>
>>>>>>> After booting it up I login to Ambari at http://localhost:8080/
>>>>>>> (from host machine) and start up HBase. HBase starts up without any
>>>>>>> problems.
>>>>>>>
>>>>>>> I then ssh into the virtual machine using the following command:
>>>>>>> "ssh -L 7070:localhost:7070 root@127.0.0.1 -p 2222"
>>>>>>>
>>>>>>> I then download Kylin binary release from "
>>>>>>> https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz"
>>>>>>> and extract into the directory /root/bin.
>>>>>>>
>>>>>>> I then change .bash_profile so it looks like this:
>>>>>>> ############################ /root/.bash_profile
>>>>>>> ###############################
>>>>>>> # .bash_profile
>>>>>>>
>>>>>>> # Get the aliases and functions
>>>>>>> if [ -f ~/.bashrc ]; then
>>>>>>>         . ~/.bashrc
>>>>>>> fi
>>>>>>>
>>>>>>> # User specific environment and startup programs
>>>>>>>
>>>>>>> KYLIN_HOME=$HOME/bin/apache-kylin-1.1.1-incubating-bin
>>>>>>> export KYLIN_HOME
>>>>>>>
>>>>>>> PATH=$PATH:$HOME/bin:$KYLIN_HOME/bin
>>>>>>>
>>>>>>> export PATH
>>>>>>>
>>>>>>> ##############################################################################
>>>>>>>
>>>>>>> I then start Kylin up by using the command: "kylin.sh start". I then
>>>>>>> try to access Kylin through http://localhost:7070/kylin (from host
>>>>>>> machine) and get a blank page (eg. not 404).
>>>>>>>
>>>>>>> I get the following output from kylin.sh start and tomcat log:
>>>>>>>
>>>>>>> ########################## kylin.sh start output
>>>>>>> ###################################
>>>>>>> root@sandbox ~]# kylin.sh start
>>>>>>> KYLIN_HOME is set to /root/bin/apache-kylin-1.1.1-incubating-bin
>>>>>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name hive.heapsize
>>>>>>> does not exist
>>>>>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name
>>>>>>> hive.server2.enable.impersonation does not exist
>>>>>>>
>>>>>>> Logging initialized using configuration in
>>>>>>> file:/etc/hive/conf/hive-log4j.properties
>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>>>> explanation.
>>>>>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>>>>>> hive dependency:
>>>>>>> /etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr
/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar
>>>>>>> hbase dependency: /usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar
>>>>>>> KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
>>>>>>> KYLIN_DEBUG_SETTINGS is not set, will not enable remote debugging
>>>>>>> KYLIN_LD_LIBRARY_SETTINGS is not set, lzo compression at MR and
>>>>>>> hbase might not work
>>>>>>> A new Kylin instance is started by root, stop it using "kylin.sh
>>>>>>> stop"
>>>>>>> Please visit http://<your_sandbox_ip>:7070/kylin to play with the
>>>>>>> cubes! (Username: ADMIN, Password: KYLIN)
>>>>>>> You can check the log at
>>>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/logs/kylin.log
>>>>>>>
>>>>>>> ##############################################################################
>>>>>>>
>>>>>>> ################################ tomcat/logs/kylin.log
>>>>>>> ###########################
>>>>>>> usage: java org.apache.catalina.startup.Catalina [ -config
>>>>>>> {pathname} ] [ -nonaming ]  { -help | start | stop }
>>>>>>> Dec 07, 2015 11:10:49 AM
>>>>>>> org.apache.catalina.core.AprLifecycleListener lifecycleEvent
>>>>>>> INFO: The APR based Apache Tomcat Native library which allows
>>>>>>> optimal performance in production environments was not found on the
>>>>>>> java.library.path:
>>>>>>> :/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>>>> explanation.
>>>>>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>>>>>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>>>>>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>>>>>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>>>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.Catalina load
>>>>>>> INFO: Initialization processed in 847 ms
>>>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardService
>>>>>>> startInternal
>>>>>>> INFO: Starting service Catalina
>>>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardEngine
>>>>>>> startInternal
>>>>>>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.59
>>>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.HostConfig
>>>>>>> deployWAR
>>>>>>> INFO: Deploying web application archive
>>>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war
>>>>>>> Dec 07, 2015 11:10:50 AM
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>>>> WARNING: Failed to scan
>>>>>>> [file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar] from classloader hierarchy
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:10:50 AM
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>>>> WARNING: Failed to scan
>>>>>>> [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar] from classloader hierarchy
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:10:50 AM
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>>>> WARNING: Failed to scan
>>>>>>> [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar] from classloader hierarchy
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:10:50 AM
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>>>> WARNING: Failed to scan
>>>>>>> [file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar] from classloader hierarchy
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:10:53 AM org.apache.catalina.startup.ContextConfig
>>>>>>> processAnnotationsJar
>>>>>>> SEVERE: contextConfig.jarFile
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:10:56 AM org.apache.catalina.startup.ContextConfig
>>>>>>> processAnnotationsJar
>>>>>>> SEVERE: contextConfig.jarFile
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:11:00 AM org.apache.catalina.startup.ContextConfig
>>>>>>> processAnnotationsJar
>>>>>>> SEVERE: contextConfig.jarFile
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>>>> processResourceJARs
>>>>>>> SEVERE: Failed to processes JAR found at URL
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for static resources
>>>>>>> to be included in context with name
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/]
>>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>>>> processResourceJARs
>>>>>>> SEVERE: Failed to processes JAR found at URL
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for static resources
>>>>>>> to be included in context with name
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/]
>>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>>>> processResourceJARs
>>>>>>> SEVERE: Failed to processes JAR found at URL
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for static resources to
>>>>>>> be included in context with name
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/]
>>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig
>>>>>>> tldScanJar
>>>>>>> WARNING: Failed to process JAR
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for TLD files
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TaglibUriRule
>>>>>>> body
>>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already
>>>>>>> defined
>>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig
>>>>>>> tldScanJar
>>>>>>> WARNING: Failed to process JAR
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>>> body
>>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already
>>>>>>> defined
>>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>>> body
>>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already
>>>>>>> defined
>>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>>> body
>>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already
>>>>>>> defined
>>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>>> body
>>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already
>>>>>>> defined
>>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>>> body
>>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already
>>>>>>> defined
>>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>>> body
>>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already
>>>>>>> defined
>>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig
>>>>>>> tldScanJar
>>>>>>> WARNING: Failed to process JAR
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig
>>>>>>> tldScanJar
>>>>>>> WARNING: Failed to process JAR
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for TLD files
>>>>>>> java.io.FileNotFoundException:
>>>>>>> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No such file or directory)
>>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>>> at
>>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>>>> explanation.
>>>>>>> [localhost-startStop-1]:[2015-12-07
>>>>>>> 11:11:04,251][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>>>>>>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>>>>>>> [localhost-startStop-1]:[2015-12-07
>>>>>>> 11:11:04,297][INFO][org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:177)]
>>>>>>> - Loading properties file from resource loaded through InputStream
>>>>>>> [localhost-startStop-1]:[2015-12-07
>>>>>>> 11:11:04,430][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>>>>>>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-2--1, built
>>>>>>> on 03/31/2015 19:31 GMT
>>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:host.name=
>>>>>>> sandbox.hortonworks.com
>>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:java.version=1.7.0_79
>>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
>>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client
>>>>>>> environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
>>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client
>>>>>>> environment:java.class.path=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/bootstrap.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/tomcat-juli.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-jdbc.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-tribes.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/annotations-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jsp-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-coyote.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper-el.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat7-websocket.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-fr.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/el-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-dbcp.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ha.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/ecj-4.4.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-es.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-util.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/servlet-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-ja.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ant.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/websocket-api.jar::/usr/hdp/2.2.4.2-2/hbase/conf:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/lib/tools.jar:/usr/hdp/2.2.4.2-2/hbase:/usr/hdp/2.2.4.2-2/hbase/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/asm-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-codec-1.7.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-
2/hbase/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift.jar:/usr/hdp/2.2.4.2-2/hbase/lib/high-scale-lib-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpcore-4.1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-core-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-json-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-server-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jettison-1.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-2.1-6.1.1
4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/netty-3.6.6.Final.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/phoenix-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-hbase-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/slf4j-api-1.6.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/zookeeper.jar:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/li
b/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/./:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/
commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hado
op-yarn/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-
mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-cl
ient-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//joda-time-2.7.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//aws-java-sdk-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.2
.4.2-2/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpcore-4.2.5.jar::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/hadoop-mapreduce-client/jsp-api-2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/activation-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/junit-4.11.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-io-2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang3-3.3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-recipes-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-server-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jsr305-1.3.9.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-framework-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack.jar:/usr/hdp/curre
nt/hadoop-mapreduce-client/hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-httpclient-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-net-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/current/hadoop-mapreduce-client/jsch-0.1.42.jar:/usr/hdp/current/hadoop-mapreduce-client/xz-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-el-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/servlet-api-2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-runtime-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-json-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jettison-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix.jar:/usr/hdp/current/hadoop-mapreduce-client/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/paranamer-2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-core-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-mapreduce-client/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app.jar:/usr/hdp/current/hadoop-mapreduce-client/htrace-core-3.0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/httpclient-4.2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-compiler-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/joda-time-2.7.jar:/usr/hdp/current/hadoop-mapreduce-client/avro-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-mapreduce-client/xmlenc-0.52.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-digester-1.8.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-cli-1.2.jar:/usr/hdp/current/hadoop-mapreduce-client/aws-java-sdk-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/gson-2.2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-client-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-collections-3.2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-mapreduce-client/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-mapreduce-client/
hadoop-datajoin.jar:/usr/hdp/current/hadoop-mapreduce-client/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen.jar:/usr/hdp/current/hadoop-mapreduce-client/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-archives.jar:/usr/hdp/current/hadoop-mapreduce-client/asm-3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-codec-1.4.jar:/usr/hdp/current/hadoop-mapreduce-client/log4j-1.2.17.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang-2.6.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar:/usr/hdp/current/hadoop-mapreduce-client/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/guava-11.0.2.jar:/usr/hdp/current/hadoop-mapreduce-client/httpcore-4.2.5.jar:/usr/hdp/current/tez-client/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-examples-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-io-2.4.jar:/usr/hdp/current/tez-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections4-4.0.jar:/usr/hdp/current/tez-client/lib/servlet-api-2.5.jar:/usr/hdp/current/tez-client/lib/jsr305-2.0.3.jar:/usr/hdp/current/tez-client/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/jettison-1.3.4.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-cli-1.2.jar:/usr/hdp/current/tez-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/tez-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections-3.2.1.jar:/usr/hdp/current/tez-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/tez-client/lib/commons-codec-1.4.jar:/usr/hdp/current/tez-client/lib/log4j-1.2.17.jar:/usr/hdp/current/tez-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/tez-client/lib/commons-lang-2.6.jar:/usr/hdp/current/tez-client/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/guava-11.0.2.jar:/etc/tez/conf/:/usr/hdp/2.2.4.2-2/tez/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-exampl
es-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections4-4.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/guava-11.0.2.jar:/etc/tez/conf:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoo
p/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpcore-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jsoup-1.7.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-settings-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/l
ib/nekohtml-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-profile-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/netty-3.7.0.Final.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-model-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-io-2.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-codec-1.6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpclient-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-project-2.2.1.jar:/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/ser
vlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2
.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar:
>>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client
>>>>>>> environment:java.library.path=:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client
>>>>>>> environment:java.io.tmpdir=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/temp
>>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:os.name=Linux
>>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:os.arch=amd64
>>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client
>>>>>>> environment:os.version=2.6.32-504.16.2.el6.x86_64
>>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:user.name=root
>>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:user.home=/root
>>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Client environment:user.dir=/root
>>>>>>> 2015-12-07 11:11:04,924 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.ZooKeeper: Initiating client connection, connectString=
>>>>>>> sandbox.hortonworks.com:2181 sessionTimeout=30000
>>>>>>> watcher=hconnection-0x27eeefd2, quorum=sandbox.hortonworks.com:2181,
>>>>>>> baseZNode=/hbase-unsecure
>>>>>>> 2015-12-07 11:11:04,949 INFO  [localhost-startStop-1]
>>>>>>> zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x27eeefd2
>>>>>>> connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
>>>>>>> 2015-12-07 11:11:04,976 INFO  [localhost-startStop-1-SendThread(
>>>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Opening socket
>>>>>>> connection to server sandbox.hortonworks.com/10.0.2.15:2181. Will
>>>>>>> not attempt to authenticate using SASL (unknown error)
>>>>>>> 2015-12-07 11:11:04,993 INFO  [localhost-startStop-1-SendThread(
>>>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Socket
>>>>>>> connection established to sandbox.hortonworks.com/10.0.2.15:2181,
>>>>>>> initiating session
>>>>>>> 2015-12-07 11:11:05,000 INFO  [localhost-startStop-1-SendThread(
>>>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Session
>>>>>>> establishment complete on server
>>>>>>> sandbox.hortonworks.com/10.0.2.15:2181, sessionid =
>>>>>>> 0x1517c12f0f0000b, negotiated timeout = 30000
>>>>>>> 2015-12-07 11:11:05,699 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>>>> Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@784cfcf2,
>>>>>>> compressor=null, tcpKeepAlive=true, tcpNoDelay=true,
>>>>>>> minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind
>>>>>>> address=null
>>>>>>> 2015-12-07 11:11:05,840 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>>>> Use SIMPLE authentication for service MasterService, sasl=false
>>>>>>> 2015-12-07 11:11:05,851 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>>>> Connecting to sandbox.hortonworks.com/10.0.2.15:60000
>>>>>>> [localhost-startStop-1]:[2015-12-07
>>>>>>> 11:11:05,880][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>>>>>>> - Context initialization failed
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error
>>>>>>> creating bean with name
>>>>>>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>>>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>>>>>>> Cannot resolve reference to bean
>>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>>>> resolve reference to bean
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>>> type
>>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>>> setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
>>>>>>> at
>>>>>>> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
>>>>>>> at
>>>>>>> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
>>>>>>> at
>>>>>>> org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
>>>>>>> at
>>>>>>> org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
>>>>>>> at
>>>>>>> org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5016)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5524)
>>>>>>> at
>>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>>> at
>>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name
>>>>>>> 'org.springframework.cache.config.internalCacheAdvisor': Cannot resolve
>>>>>>> reference to bean
>>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>>>> resolve reference to bean
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>>> type
>>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>>> setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>>>>>> at
>>>>>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>>>>>> at
>>>>>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>>>>>> ... 23 more
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name
>>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>>>> resolve reference to bean
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>>> type
>>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>>> setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>>> ... 40 more
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name
>>>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>>>> resolve reference to bean
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>>> type
>>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>>> setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>>>>>> at
>>>>>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>>>>>> at
>>>>>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>>>>>> ... 45 more
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name
>>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>>> type
>>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>>> setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:353)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:153)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>>> ... 64 more
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name '(inner bean)': Cannot create inner bean
>>>>>>> '(inner bean)' of type
>>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>>> setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>>>>>> ... 78 more
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name '(inner bean)': Cannot resolve reference to
>>>>>>> bean 'expressionHandler' while setting constructor argument; nested
>>>>>>> exception is org.springframework.beans.factory.BeanCreationException: Error
>>>>>>> creating bean with name 'expressionHandler' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>>> setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>>>>>> ... 86 more
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name 'expressionHandler' defined in class path
>>>>>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean
>>>>>>> 'permissionEvaluator' while setting bean property 'permissionEvaluator';
>>>>>>> nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>>> setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>>> ... 94 more
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name 'permissionEvaluator' defined in class path
>>>>>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService'
>>>>>>> while setting constructor argument; nested exception is
>>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>>> bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>>> ... 104 more
>>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>>> Error creating bean with name 'aclService' defined in file
>>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>>> Instantiation of bean failed; nested exception is
>>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>>> exception; nested exception is
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:997)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:943)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>>> ... 116 more
>>>>>>> Caused by: org.springframework.beans.BeanInstantiationException:
>>>>>>> Could not instantiate bean class
>>>>>>> [org.apache.kylin.rest.service.AclService]: Constructor threw exception;
>>>>>>> nested exception is org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at
>>>>>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
>>>>>>> at
>>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
>>>>>>> ... 124 more
>>>>>>> Caused by: org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>>>> Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>>>> at
>>>>>>> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>>>>>> at
>>>>>>> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:230)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:244)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3390)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:408)
>>>>>>> at
>>>>>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:95)
>>>>>>> at
>>>>>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:86)
>>>>>>> at
>>>>>>> org.apache.kylin.rest.service.AclService.<init>(AclService.java:127)
>>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>>>> Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>>>> at
>>>>>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
>>>>>>> ... 126 more
>>>>>>> Caused by:
>>>>>>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException):
>>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>>> at
>>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>> at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1538)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1724)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1777)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.getTableDescriptors(MasterProtos.java:42525)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$5.getTableDescriptors(ConnectionManager.java:2165)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:414)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:409)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>>>>>>> ... 136 more
>>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>>>>>> startInternal
>>>>>>> SEVERE: Error listenerStart
>>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>>>>>> startInternal
>>>>>>> SEVERE: Context [/kylin] startup failed due to previous errors
>>>>>>> Dec 07, 2015 11:11:05 AM
>>>>>>> org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
>>>>>>> SEVERE: The web application [/kylin] appears to have started a
>>>>>>> thread named [localhost-startStop-1-SendThread(
>>>>>>> sandbox.hortonworks.com:2181)] but has failed to stop it. This is
>>>>>>> very likely to create a memory leak.
>>>>>>> Dec 07, 2015 11:11:05 AM
>>>>>>> org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
>>>>>>> SEVERE: The web application [/kylin] appears to have started a
>>>>>>> thread named [localhost-startStop-1-EventThread] but has failed to stop it.
>>>>>>> This is very likely to create a memory leak.
>>>>>>> Dec 07, 2015 11:11:05 AM
>>>>>>> org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
>>>>>>> SEVERE: The web application [/kylin] appears to have started a
>>>>>>> thread named [Thread-6] but has failed to stop it. This is very likely to
>>>>>>> create a memory leak.
>>>>>>> Dec 07, 2015 11:11:05 AM
>>>>>>> org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
>>>>>>> SEVERE: The web application [/kylin] appears to have started a
>>>>>>> thread named [IPC Client (514096504) connection to
>>>>>>> sandbox.hortonworks.com/10.0.2.15:60000 from root] but has failed
>>>>>>> to stop it. This is very likely to create a memory leak.
>>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.HostConfig
>>>>>>> deployWAR
>>>>>>> INFO: Deployment of web application archive
>>>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war has
>>>>>>> finished in 15,925 ms
>>>>>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>>>>>> INFO: Starting ProtocolHandler ["http-bio-7070"]
>>>>>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>>>>>> INFO: Starting ProtocolHandler ["ajp-bio-9009"]
>>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.Catalina start
>>>>>>> INFO: Server startup in 15987 ms"
>>>>>>>
>>>>>>> ##############################################################################
>>>>>>>
>>>>>>> What am I missing?
>>>>>>>
>>>>>>> Kind regards
>>>>>>> Veli K. Celik
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Thanks and Regards
>>>>>>
>>>>>> Sudeep Dey
>>>>>> Zaloni,Inc. | www.zaloni.com
>>>>>> 633 Davis Drive, Suite 450
>>>>>> Durham, NC 27713
>>>>>> e: s <jb...@zaloni.com>dey@zaloni.com
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Med venlig hilsen
>>>>> Veli K. Celik
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Med venlig hilsen
>>>> Veli K. Celik
>>>>
>>>
>>>
>>>
>>> --
>>> Med venlig hilsen
>>> Veli K. Celik
>>>
>>
>>
>>
>> --
>> Med venlig hilsen
>> Veli K. Celik
>>
>
>


-- 
Med venlig hilsen
Veli K. Celik

Re: Kylin does not start correctly (detailed and logs included)

Posted by Li Yang <li...@apache.org>.
Actually, the exceptions about ScanJar not being able to process ojdbc6.jar are
not a problem at all. Just ignore them; they do not stop Kylin from working
properly.

As to the GUI issue, could you look at the browser's dev console and check for
any failing requests? In Chrome and Firefox, press F12 to bring up the dev
console, refresh the page, then look under the "Network" tab and see whether
any request returns an error.
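
You could also do a quick check from the command line; this is just a sketch
(adjust host and port to your forwarding setup), following the redirect with
curl and printing the final status:

# -L follows the 302 to /kylin/, -w prints the final status code and URL
curl -s -L -o /dev/null -w '%{http_code} %{url_effective}\n' http://localhost:7070/kylin/

A 200 here while the browser page stays blank would point at the static
resources (js/css) rather than the web context itself.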

On Tue, Dec 8, 2015 at 4:38 PM, Veli Kerim Celik <vk...@gmail.com> wrote:

> I resolved the problem with insufficient HBase permissions by enabling
> authorization (set hbase.security.authorization to true) and adding root as a
> superuser (append ",root" to hbase.superuser).
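>
> For reference, a minimal way to double-check that the new values actually
> reached the HBase client config on the sandbox (assuming Ambari pushes the
> client config to /etc/hbase/conf, which may differ on other setups):
>
> ############## check effective HBase config ############
> # show the two auth-related properties and the line after each match
> grep -A1 -E 'hbase.superuser|hbase.security.authorization' \
>     /etc/hbase/conf/hbase-site.xml
> ########################################################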
>
> Now Kylin seems to start fine (i.e. no exceptions in
> tomcat/logs/kylin.log), but I have run into another problem. The Kylin
> webpage is found, but still blank. Curl confirms this:
>
> ############## Kylin HTTP header ############
> veli@cdev ~ $ curl -I http://localhost:7070/kylin
> HTTP/1.1 302 Found
> Server: Apache-Coyote/1.1
> Location: http://localhost:7070/kylin/
> Transfer-Encoding: chunked
> Date: Tue, 08 Dec 2015 08:29:50 GMT
> ###########################################
>
> What could be the problem?
>
>
> 2015-12-08 9:07 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>
>> And by the way I have run the script bin/check-env.sh. It did not give
>> any errors.
>>
>> I also checked whether port forwarding works by running the following
>> command on my host machine. It works fine.
>>
>> ############## check port forwarding ############
>> veli@cdev ~ $ curl -I http://localhost:7070/kylin
>> HTTP/1.1 404 Not Found
>> Server: Apache-Coyote/1.1
>> Content-Length: 0
>> Date: Tue, 08 Dec 2015 07:48:19 GMT
>> ############################################
>>
>> At the moment both hbase.security.authorization and dfs.permissions.enabled
>> are set to false (in Ambari). But HBase still says "Insufficient
>> permissions for user 'root (auth:SIMPLE)'..." in the log file
>> (bin/tomcat/logs/kylin.log).
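>>
>> One thing I have not tried yet (noting it here only as an untested sketch):
>> granting root full rights directly from the HBase shell. Since the stack
>> trace shows the XaSecure coprocessor doing the check, the policy may need to
>> be changed on that side instead, but the plain HBase grant would look like:
>>
>> ############## hbase shell grant (untested sketch) ############
>> # global grant for user 'root': Read, Write, eXec, Create, Admin
>> hbase shell <<'EOF'
>> grant 'root', 'RWXCA'
>> exit
>> EOF
>> ###############################################################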
>>
>>
>>
>> 2015-12-07 17:53 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>>
>>> I furthermore tried to disable HBase authorization
>>> (set hbase.security.authorization to false) and restarted HBase and Kylin.
>>> It did not get rid of the exception.
>>>
>>> 2015-12-07 17:28 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>>>
>>>> I am doing port forwarding through ssh (i.e. "ssh -L
>>>> 7070:localhost:7070 root@127.0.0.1 -p 2222"). It seems to be working.
>>>>
>>>> I have downloaded the file ojdbc6.jar (from
>>>> http://www.oracle.com/technetwork/database/enterprise-edition/jdbc-112010-090769.html)
>>>> and put it at the path "/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar". The
>>>> lines with java.io.FileNotFoundException are gone. Nice. Thanks.
>>>>
>>>> But now I am getting some new exceptions like:
>>>>
>>>> ########################## exception start
>>>> #################################
>>>> [localhost-startStop-1]:[2015-12-07
>>>> 15:49:14,751][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>>>> - Context initialization failed
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>>>> Cannot resolve reference to bean
>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>> resolve reference to bean
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>> Cannot create inner bean '(inner bean)' of type
>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>> while setting constructor argument with key [0]; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>> type
>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'expressionHandler' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> ########################################################################
>>>>
>>>> I disabled HDFS permissions (set dfs.permissions.enabled to false), and
>>>> restarted HDFS and Kylin, but it did not get rid of the exception.
>>>>
>>>> Kind regards Veli
>>>>
>>>>
>>>> 2015-12-07 15:29 GMT+01:00 Sudeep Dey <sd...@zaloni.com>:
>>>>
>>>>> Hi Veli,
>>>>>
>>>>> You need to put the downloaded ojdbc6.jar at this location:
>>>>> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar. Moreover, to access Kylin
>>>>> from the host machine you need to set up port forwarding for port 7070.
>>>>>
>>>>> Regards
>>>>>
>>>>> Sudeep
>>>>>
>>>>> On Mon, Dec 7, 2015 at 7:54 AM, Veli Kerim Celik <vk...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hello
>>>>>>
>>>>>> I have downloaded Hortonworks HDP Sandbox version 2.2.4.2 for
>>>>>> VirtualBox (filename "Sandbox_HDP_2.2.4.2_VirtualBox.ova") and imported it
>>>>>> into VirtualBox.
>>>>>>
>>>>>> I have assinged 4 CPU cores and 12 gigabyte of RAM to the virtual
>>>>>> machine.
>>>>>>
>>>>>> After booting it up I login to Ambari at http://localhost:8080/
>>>>>> (from host machine) and start up HBase. HBase starts up without any
>>>>>> problems.
>>>>>>
>>>>>> I then ssh into the virtual machine using the following command: "ssh
>>>>>> -L 7070:localhost:7070 root@127.0.0.1 -p 2222"
>>>>>>
>>>>>> I then download Kylin binary release from "
>>>>>> https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz"
>>>>>> and extract into the directory /root/bin.
>>>>>>
>>>>>> I then change .bash_profile so it looks like this:
>>>>>> ############################ /root/.bash_profile
>>>>>> ###############################
>>>>>> # .bash_profile
>>>>>>
>>>>>> # Get the aliases and functions
>>>>>> if [ -f ~/.bashrc ]; then
>>>>>>         . ~/.bashrc
>>>>>> fi
>>>>>>
>>>>>> # User specific environment and startup programs
>>>>>>
>>>>>> KYLIN_HOME=$HOME/bin/apache-kylin-1.1.1-incubating-bin
>>>>>> export KYLIN_HOME
>>>>>>
>>>>>> PATH=$PATH:$HOME/bin:$KYLIN_HOME/bin
>>>>>>
>>>>>> export PATH
>>>>>>
>>>>>> ##############################################################################
>>>>>>
>>>>>> I then start Kylin up by using the command: "kylin.sh start". I then
>>>>>> try to access Kylin through http://localhost:7070/kylin (from host
>>>>>> machine) and get a blank page (eg. not 404).
>>>>>>
>>>>>> I get the following output from kylin.sh start and tomcat log:
>>>>>>
>>>>>> ########################## kylin.sh start output
>>>>>> ###################################
>>>>>> root@sandbox ~]# kylin.sh start
>>>>>> KYLIN_HOME is set to /root/bin/apache-kylin-1.1.1-incubating-bin
>>>>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name hive.heapsize
>>>>>> does not exist
>>>>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name
>>>>>> hive.server2.enable.impersonation does not exist
>>>>>>
>>>>>> Logging initialized using configuration in
>>>>>> file:/etc/hive/conf/hive-log4j.properties
>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>>> explanation.
>>>>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>>>>> hive dependency:
>>>>>> /etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/
hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar
>>>>>> hbase dependency: /usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar
>>>>>> KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
>>>>>> KYLIN_DEBUG_SETTINGS is not set, will not enable remote debuging
>>>>>> KYLIN_LD_LIBRARY_SETTINGS is not set, lzo compression at MR and hbase
>>>>>> might not work
>>>>>> A new Kylin instance is started by root, stop it using "kylin.sh stop"
>>>>>> Please visit http://<your_sandbox_ip>:7070/kylin to play with the
>>>>>> cubes! (Useranme: ADMIN, Password: KYLIN)
>>>>>> You can check the log at
>>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/logs/kylin.log"
>>>>>>
>>>>>> ##############################################################################
>>>>>>
>>>>>> ################################ tomcat/logs/kylin.log
>>>>>> ###########################
>>>>>> usage: java org.apache.catalina.startup.Catalina [ -config {pathname}
>>>>>> ] [ -nonaming ]  { -help | start | stop }
>>>>>> Dec 07, 2015 11:10:49 AM
>>>>>> org.apache.catalina.core.AprLifecycleListener lifecycleEvent
>>>>>> INFO: The APR based Apache Tomcat Native library which allows optimal
>>>>>> performance in production environments was not found on the
>>>>>> java.library.path:
>>>>>> :/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>>> explanation.
>>>>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>>>>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>>>>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>>>>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.Catalina load
>>>>>> INFO: Initialization processed in 847 ms
>>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardService
>>>>>> startInternal
>>>>>> INFO: Starting service Catalina
>>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardEngine
>>>>>> startInternal
>>>>>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.59
>>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.HostConfig
>>>>>> deployWAR
>>>>>> INFO: Deploying web application archive
>>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war
>>>>>> Dec 07, 2015 11:10:50 AM
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>>> WARNING: Failed to scan
>>>>>> [file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar] from classloader hierarchy
>>>>>> java.io.FileNotFoundException:
>>>>>> /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:10:50 AM
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>>> WARNING: Failed to scan
>>>>>> [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar] from classloader hierarchy
>>>>>> java.io.FileNotFoundException:
>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:10:50 AM
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>>> WARNING: Failed to scan
>>>>>> [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar] from classloader hierarchy
>>>>>> java.io.FileNotFoundException:
>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:10:50 AM
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar]
>>>>>> from classloader hierarchy
>>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>>>> (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:10:53 AM org.apache.catalina.startup.ContextConfig
>>>>>> processAnnotationsJar
>>>>>> SEVERE: contextConfig.jarFile
>>>>>> java.io.FileNotFoundException:
>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:10:56 AM org.apache.catalina.startup.ContextConfig
>>>>>> processAnnotationsJar
>>>>>> SEVERE: contextConfig.jarFile
>>>>>> java.io.FileNotFoundException:
>>>>>> /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:11:00 AM org.apache.catalina.startup.ContextConfig
>>>>>> processAnnotationsJar
>>>>>> SEVERE: contextConfig.jarFile
>>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>>>> (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>>> at
>>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>>> processResourceJARs
>>>>>> SEVERE: Failed to processes JAR found at URL
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for static resources
>>>>>> to be included in context with name
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/]
>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>>> processResourceJARs
>>>>>> SEVERE: Failed to processes JAR found at URL
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for static resources
>>>>>> to be included in context with name
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/]
>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>>> processResourceJARs
>>>>>> SEVERE: Failed to processes JAR found at URL
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for static resources to
>>>>>> be included in context with name
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/]
>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig
>>>>>> tldScanJar
>>>>>> WARNING: Failed to process JAR
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for TLD files
>>>>>> java.io.FileNotFoundException:
>>>>>> /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TaglibUriRule
>>>>>> body
>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig
>>>>>> tldScanJar
>>>>>> WARNING: Failed to process JAR
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>>>>> java.io.FileNotFoundException:
>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>> body
>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>> body
>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>> body
>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>> body
>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>> body
>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule
>>>>>> body
>>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig
>>>>>> tldScanJar
>>>>>> WARNING: Failed to process JAR
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>>>>> java.io.FileNotFoundException:
>>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig
>>>>>> tldScanJar
>>>>>> WARNING: Failed to process JAR
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for TLD files
>>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>>>> (No such file or directory)
>>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>>> at
>>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>>> at
>>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>>> at
>>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>>> [jar:file:/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>>> explanation.
>>>>>> [localhost-startStop-1]:[2015-12-07
>>>>>> 11:11:04,251][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>>>>>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>>>>>> [localhost-startStop-1]:[2015-12-07
>>>>>> 11:11:04,297][INFO][org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:177)]
>>>>>> - Loading properties file from resource loaded through InputStream
>>>>>> [localhost-startStop-1]:[2015-12-07
>>>>>> 11:11:04,430][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>>>>>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-2--1, built
>>>>>> on 03/31/2015 19:31 GMT
>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:host.name=
>>>>>> sandbox.hortonworks.com
>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:java.version=1.7.0_79
>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client
>>>>>> environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client
>>>>>> environment:java.class.path=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/bootstrap.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/tomcat-juli.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-jdbc.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-tribes.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/annotations-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jsp-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-coyote.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper-el.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat7-websocket.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-fr.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/el-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-dbcp.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ha.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/ecj-4.4.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-es.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-util.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/servlet-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-ja.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ant.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/websocket-api.jar::/usr/hdp/2.2.4.2-2/hbase/conf:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/lib/tools.jar:/usr/hdp/2.2.4.2-2/hbase:/usr/hdp/2.2.4.2-2/hbase/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/asm-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-codec-1.7.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2
/hbase/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift.jar:/usr/hdp/2.2.4.2-2/hbase/lib/high-scale-lib-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpcore-4.1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-core-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-json-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-server-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jettison-1.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-2.1-6.1.14
.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/netty-3.6.6.Final.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/phoenix-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-hbase-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/slf4j-api-1.6.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/zookeeper.jar:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib
/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/./:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/c
ommons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoo
p-yarn/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-m
apreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-cli
ent-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//joda-time-2.7.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//aws-java-sdk-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.2.
4.2-2/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpcore-4.2.5.jar::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/hadoop-mapreduce-client/jsp-api-2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/activation-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/junit-4.11.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-io-2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang3-3.3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-recipes-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-server-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jsr305-1.3.9.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-framework-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack.jar:/usr/hdp/curren
t/hadoop-mapreduce-client/hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-httpclient-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-net-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/current/hadoop-mapreduce-client/jsch-0.1.42.jar:/usr/hdp/current/hadoop-mapreduce-client/xz-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-el-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/servlet-api-2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-runtime-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-json-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jettison-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix.jar:/usr/hdp/current/hadoop-mapreduce-client/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/paranamer-2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-core-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-mapreduce-client/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app.jar:/usr/hdp/current/hadoop-mapreduce-client/htrace-core-3.0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/httpclient-4.2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-compiler-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/joda-time-2.7.jar:/usr/hdp/current/hadoop-mapreduce-client/avro-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-mapreduce-client/xmlenc-0.52.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-digester-1.8.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-cli-1.2.jar:/usr/hdp/current/hadoop-mapreduce-client/aws-java-sdk-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/gson-2.2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-client-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-collections-3.2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-mapreduce-client/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-mapreduce-client/h
adoop-datajoin.jar:/usr/hdp/current/hadoop-mapreduce-client/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen.jar:/usr/hdp/current/hadoop-mapreduce-client/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-archives.jar:/usr/hdp/current/hadoop-mapreduce-client/asm-3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-codec-1.4.jar:/usr/hdp/current/hadoop-mapreduce-client/log4j-1.2.17.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang-2.6.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar:/usr/hdp/current/hadoop-mapreduce-client/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/guava-11.0.2.jar:/usr/hdp/current/hadoop-mapreduce-client/httpcore-4.2.5.jar:/usr/hdp/current/tez-client/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-examples-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-io-2.4.jar:/usr/hdp/current/tez-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections4-4.0.jar:/usr/hdp/current/tez-client/lib/servlet-api-2.5.jar:/usr/hdp/current/tez-client/lib/jsr305-2.0.3.jar:/usr/hdp/current/tez-client/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/jettison-1.3.4.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-cli-1.2.jar:/usr/hdp/current/tez-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/tez-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections-3.2.1.jar:/usr/hdp/current/tez-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/tez-client/lib/commons-codec-1.4.jar:/usr/hdp/current/tez-client/lib/log4j-1.2.17.jar:/usr/hdp/current/tez-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/tez-client/lib/commons-lang-2.6.jar:/usr/hdp/current/tez-client/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/guava-11.0.2.jar:/etc/tez/conf/:/usr/hdp/2.2.4.2-2/tez/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-example
s-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections4-4.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/guava-11.0.2.jar:/etc/tez/conf:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop
/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpcore-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jsoup-1.7.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-settings-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/li
b/nekohtml-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-profile-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/netty-3.7.0.Final.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-model-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-io-2.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-codec-1.6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpclient-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-project-2.2.1.jar:/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/serv
let-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.
2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar:
>>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client
>>>>>> environment:java.library.path=:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client
>>>>>> environment:java.io.tmpdir=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/temp
>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:os.name=Linux
>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:os.arch=amd64
>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client
>>>>>> environment:os.version=2.6.32-504.16.2.el6.x86_64
>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:user.name=root
>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:user.home=/root
>>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Client environment:user.dir=/root
>>>>>> 2015-12-07 11:11:04,924 INFO  [localhost-startStop-1]
>>>>>> zookeeper.ZooKeeper: Initiating client connection, connectString=
>>>>>> sandbox.hortonworks.com:2181 sessionTimeout=30000
>>>>>> watcher=hconnection-0x27eeefd2, quorum=sandbox.hortonworks.com:2181,
>>>>>> baseZNode=/hbase-unsecure
>>>>>> 2015-12-07 11:11:04,949 INFO  [localhost-startStop-1]
>>>>>> zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x27eeefd2
>>>>>> connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
>>>>>> 2015-12-07 11:11:04,976 INFO  [localhost-startStop-1-SendThread(
>>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Opening socket
>>>>>> connection to server sandbox.hortonworks.com/10.0.2.15:2181. Will
>>>>>> not attempt to authenticate using SASL (unknown error)
>>>>>> 2015-12-07 11:11:04,993 INFO  [localhost-startStop-1-SendThread(
>>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Socket
>>>>>> connection established to sandbox.hortonworks.com/10.0.2.15:2181,
>>>>>> initiating session
>>>>>> 2015-12-07 11:11:05,000 INFO  [localhost-startStop-1-SendThread(
>>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Session
>>>>>> establishment complete on server
>>>>>> sandbox.hortonworks.com/10.0.2.15:2181, sessionid =
>>>>>> 0x1517c12f0f0000b, negotiated timeout = 30000
>>>>>> 2015-12-07 11:11:05,699 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>>> Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@784cfcf2,
>>>>>> compressor=null, tcpKeepAlive=true, tcpNoDelay=true,
>>>>>> minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind
>>>>>> address=null
>>>>>> 2015-12-07 11:11:05,840 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>>> Use SIMPLE authentication for service MasterService, sasl=false
>>>>>> 2015-12-07 11:11:05,851 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>>> Connecting to sandbox.hortonworks.com/10.0.2.15:60000
>>>>>> [localhost-startStop-1]:[2015-12-07
>>>>>> 11:11:05,880][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>>>>>> - Context initialization failed
>>>>>> org.springframework.beans.factory.BeanCreationException: Error
>>>>>> creating bean with name
>>>>>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>>>>>> Cannot resolve reference to bean
>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>>> resolve reference to bean
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>> type
>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>> setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
>>>>>> at
>>>>>> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
>>>>>> at
>>>>>> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
>>>>>> at
>>>>>> org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
>>>>>> at
>>>>>> org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
>>>>>> at
>>>>>> org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5016)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5524)
>>>>>> at
>>>>>> org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>>> at
>>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>>> at
>>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name
>>>>>> 'org.springframework.cache.config.internalCacheAdvisor': Cannot resolve
>>>>>> reference to bean
>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>>> resolve reference to bean
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>> type
>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>> setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>>>>> at
>>>>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>>>>> at
>>>>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>>>>> at
>>>>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>>>>> at
>>>>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>>>>> at
>>>>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>>>>> ... 23 more
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name
>>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>>> resolve reference to bean
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>> type
>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>> setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>> ... 40 more
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name
>>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>>> resolve reference to bean
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>> type
>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>> setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>>>>> at
>>>>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>>>>> at
>>>>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>>>>> at
>>>>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>>>>> at
>>>>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>>>>> at
>>>>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>>>>> ... 45 more
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name
>>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>>> Cannot create inner bean '(inner bean)' of type
>>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>>> while setting constructor argument with key [0]; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>>> type
>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>> setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:353)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:153)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>> ... 64 more
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name '(inner bean)': Cannot create inner bean
>>>>>> '(inner bean)' of type
>>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>> setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>>>>> ... 78 more
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name '(inner bean)': Cannot resolve reference to
>>>>>> bean 'expressionHandler' while setting constructor argument; nested
>>>>>> exception is org.springframework.beans.factory.BeanCreationException: Error
>>>>>> creating bean with name 'expressionHandler' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>> setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>>>>> ... 86 more
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name 'expressionHandler' defined in class path
>>>>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean
>>>>>> 'permissionEvaluator' while setting bean property 'permissionEvaluator';
>>>>>> nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>>> setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>> ... 94 more
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name 'permissionEvaluator' defined in class path
>>>>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService'
>>>>>> while setting constructor argument; nested exception is
>>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>>> bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>> ... 104 more
>>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>>> Error creating bean with name 'aclService' defined in file
>>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>>> Instantiation of bean failed; nested exception is
>>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>>> exception; nested exception is
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:997)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:943)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>>> ... 116 more
>>>>>> Caused by: org.springframework.beans.BeanInstantiationException:
>>>>>> Could not instantiate bean class
>>>>>> [org.apache.kylin.rest.service.AclService]: Constructor threw exception;
>>>>>> nested exception is org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at
>>>>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
>>>>>> at
>>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
>>>>>> ... 124 more
>>>>>> Caused by: org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>>> Method)
>>>>>> at
>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>>> at
>>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>>> at
>>>>>> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>>>>> at
>>>>>> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:230)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:244)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3390)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:408)
>>>>>> at
>>>>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:95)
>>>>>> at
>>>>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:86)
>>>>>> at
>>>>>> org.apache.kylin.rest.service.AclService.<init>(AclService.java:127)
>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>>> Method)
>>>>>> at
>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>>> at
>>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>>> at
>>>>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
>>>>>> ... 126 more
>>>>>> Caused by:
>>>>>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException):
>>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>>> at
>>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>> at
>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>
>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1538)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1724)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1777)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.getTableDescriptors(MasterProtos.java:42525)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$5.getTableDescriptors(ConnectionManager.java:2165)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:414)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:409)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>>>>>> ... 136 more
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>>>>> startInternal
>>>>>> SEVERE: Error listenerStart
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>>>>> startInternal
>>>>>> SEVERE: Context [/kylin] startup failed due to previous errors
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>>>> clearReferencesThreads
>>>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>>>> named [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)]
>>>>>> but has failed to stop it. This is very likely to create a memory leak.
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>>>> clearReferencesThreads
>>>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>>>> named [localhost-startStop-1-EventThread] but has failed to stop it. This
>>>>>> is very likely to create a memory leak.
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>>>> clearReferencesThreads
>>>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>>>> named [Thread-6] but has failed to stop it. This is very likely to create a
>>>>>> memory leak.
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>>>> clearReferencesThreads
>>>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>>>> named [IPC Client (514096504) connection to
>>>>>> sandbox.hortonworks.com/10.0.2.15:60000 from root] but has failed to
>>>>>> stop it. This is very likely to create a memory leak.
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.HostConfig
>>>>>> deployWAR
>>>>>> INFO: Deployment of web application archive
>>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war has
>>>>>> finished in 15,925 ms
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>>>>> INFO: Starting ProtocolHandler ["http-bio-7070"]
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>>>>> INFO: Starting ProtocolHandler ["ajp-bio-9009"]
>>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.Catalina start
>>>>>> INFO: Server startup in 15987 ms"
>>>>>>
>>>>>> ##############################################################################
>>>>>>
>>>>>> What am I missing?
>>>>>>
>>>>>> Kind regards
>>>>>> Veli K. Celik
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Thanks and Regards
>>>>>
>>>>> Sudeep Dey
>>>>> Zaloni,Inc. | www.zaloni.com
>>>>> 633 Davis Drive, Suite 450
>>>>> Durham, NC 27713
>>>>> e: s <jb...@zaloni.com>dey@zaloni.com
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Med venlig hilsen
>>>> Veli K. Celik
>>>>
>>>
>>>
>>>
>>> --
>>> Med venlig hilsen
>>> Veli K. Celik
>>>
>>
>>
>>
>> --
>> Med venlig hilsen
>> Veli K. Celik
>>
>
>
>
> --
> Med venlig hilsen
> Veli K. Celik
>

Re: Kylin does not start correctly (detailed and logs included)

Posted by Veli Kerim Celik <vk...@gmail.com>.
I resolved the problem with insufficient HBase permissions by enabling HBase
authorization (setting hbase.security.authorization to true) and adding root
as a superuser (appending ",root" to hbase.superuser).
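
For reference, a quick way to confirm that both settings actually reached the
sandbox's HBase config (a sketch; the /etc/hbase/conf path assumes the standard
HDP layout):

############## verify HBase security settings ############
grep -A 1 -E "hbase.security.authorization|hbase.superuser" \
    /etc/hbase/conf/hbase-site.xml
# should show <value>true</value> for authorization and a superuser
# list ending in ",root" (for example <value>hbase,root</value>)
###########################################################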

Now Kylin seems to start fine (i.e. no exceptions in tomcat/logs/kylin.log),
but I have run into another problem: the Kylin webpage is found, but still
blank. Curl confirms this:

############## Kylin HTTP header ############
veli@cdev ~ $ curl -I http://localhost:7070/kylin
HTTP/1.1 302 Found
Server: Apache-Coyote/1.1
Location: http://localhost:7070/kylin/
Transfer-Encoding: chunked
Date: Tue, 08 Dec 2015 08:29:50 GMT
###########################################

What could be the problem?
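
As a follow-up check (a sketch; only the URL above is assumed), following the
302 shows whether /kylin/ itself answers with a 200 and some content or with
an error status:

############## follow the redirect ############
# -L follows the redirect to /kylin/; -w prints the final HTTP status code
curl -sL -o /dev/null -w "%{http_code}\n" http://localhost:7070/kylin/
###############################################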


2015-12-08 9:07 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:

> And by the way I have run the script bin/check-env.sh. It did not give any
> errors.
>
> I also checked whether port forwarding works by running the following
> command on my host machine. It works fine.
>
> ############## check port forwarding ############
> veli@cdev ~ $ curl -I http://localhost:7070/kylin
> HTTP/1.1 404 Not Found
> Server: Apache-Coyote/1.1
> Content-Length: 0
> Date: Tue, 08 Dec 2015 07:48:19 GMT
> ############################################
>
> At the moment both hbase.security.authorization and dfs.permissions.enabled
> are set to false (in Ambari). But HBase still says "Insufficient
> permissions for user 'root (auth:SIMPLE)'..." in the log file
> (bin/tomcat/logs/kylin.log).
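>
> For reference, a quick way to see exactly which action and table are being
> denied (a sketch; the log path follows from KYLIN_HOME as set in this thread):
>
> ############## grep the ACL errors ############
> grep -n "AccessDeniedException" $KYLIN_HOME/tomcat/logs/kylin.log | tail -n 5
> ################################################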
>
>
>
> 2015-12-07 17:53 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>
>> I furthermore tried to disable HBase authorization
>> (set hbase.security.authorization to false), and restarted HBase and Kylin.
>> It did not get rid of the exception.
>>
>> 2015-12-07 17:28 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>>
>>> I am doing port forwarding through ssh (i.e. "ssh -L
>>> 7070:localhost:7070 root@127.0.0.1 -p 2222"). It seems to be working.
>>>
>>> I have downloaded the file ojdbc6.jar (from
>>> http://www.oracle.com/technetwork/database/enterprise-edition/jdbc-112010-090769.html)
>>> and put it at the path "/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar". The
>>> lines with java.io.FileNotFoundException are gone. Nice. Thanks.
>>>
>>> But now I am getting some new exceptions like:
>>>
>>> ########################## exception start
>>> #################################
>>> [localhost-startStop-1]:[2015-12-07
>>> 15:49:14,751][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>>> - Context initialization failed
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>>> Cannot resolve reference to bean
>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>> while setting bean property 'cacheOperationSource'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>> resolve reference to bean
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>> Cannot create inner bean '(inner bean)' of type
>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>> while setting constructor argument with key [0]; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>> type
>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>> 'expressionHandler' while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'expressionHandler' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>> while setting bean property 'permissionEvaluator'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> ########################################################################
>>>
>>> I disabled HDFS permissions (set dfs.permissions.enabled to false), and
>>> restarted HDFS and Kylin, but it did not get rid of the exception.
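>>>
>>> For completeness, a sketch for double-checking that setting on the sandbox
>>> (the /etc/hadoop/conf path assumes the standard HDP layout):
>>>
>>> ############## verify HDFS permission setting ############
>>> grep -A 1 dfs.permissions.enabled /etc/hadoop/conf/hdfs-site.xml
>>> ###########################################################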
>>>
>>> Kind regards Veli
>>>
>>>
>>> 2015-12-07 15:29 GMT+01:00 Sudeep Dey <sd...@zaloni.com>:
>>>
>>>> Hi Veli,
>>>>
>>>> You need to put the downloaded ojdbc6.jar into this location:
>>>> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar. Moreover, to access Kylin from the
>>>> host machine you need to do port forwarding for port 7070.
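>>>>
>>>> For reference, the two steps as commands (a sketch; the source path of the
>>>> jar is hypothetical, and the ssh command is the one already used earlier in
>>>> this thread):
>>>>
>>>> ############## place the jar and forward the port ############
>>>> cp /root/ojdbc6.jar /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>> # on the host machine, forward guest port 7070:
>>>> ssh -L 7070:localhost:7070 root@127.0.0.1 -p 2222
>>>> ###############################################################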
>>>>
>>>> Regards
>>>>
>>>> Sudeep
>>>>
>>>> On Mon, Dec 7, 2015 at 7:54 AM, Veli Kerim Celik <vk...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hello
>>>>>
>>>>> I have downloaded Hortonworks HDP Sandbox version 2.2.4.2 for
>>>>> VirtualBox (filename "Sandbox_HDP_2.2.4.2_VirtualBox.ova") and imported it
>>>>> into VirtualBox.
>>>>>
>>>>> I have assinged 4 CPU cores and 12 gigabyte of RAM to the virtual
>>>>> machine.
>>>>>
>>>>> After booting it up I login to Ambari at http://localhost:8080/ (from
>>>>> host machine) and start up HBase. HBase starts up without any problems.
>>>>>
>>>>> I then ssh into the virtual machine using the following command: "ssh
>>>>> -L 7070:localhost:7070 root@127.0.0.1 -p 2222"
>>>>>
>>>>> I then download Kylin binary release from "
>>>>> https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz"
>>>>> and extract into the directory /root/bin.
>>>>>
>>>>> I then change .bash_profile so it looks like this:
>>>>> ############################ /root/.bash_profile
>>>>> ###############################
>>>>> # .bash_profile
>>>>>
>>>>> # Get the aliases and functions
>>>>> if [ -f ~/.bashrc ]; then
>>>>>         . ~/.bashrc
>>>>> fi
>>>>>
>>>>> # User specific environment and startup programs
>>>>>
>>>>> KYLIN_HOME=$HOME/bin/apache-kylin-1.1.1-incubating-bin
>>>>> export KYLIN_HOME
>>>>>
>>>>> PATH=$PATH:$HOME/bin:$KYLIN_HOME/bin
>>>>>
>>>>> export PATH
>>>>>
>>>>> ##############################################################################
>>>>>
>>>>> I then start Kylin up by using the command: "kylin.sh start". I then
>>>>> try to access Kylin through http://localhost:7070/kylin (from host
>>>>> machine) and get a blank page (eg. not 404).
>>>>>
>>>>> I get the following output from kylin.sh start and tomcat log:
>>>>>
>>>>> ########################## kylin.sh start output
>>>>> ###################################
>>>>> root@sandbox ~]# kylin.sh start
>>>>> KYLIN_HOME is set to /root/bin/apache-kylin-1.1.1-incubating-bin
>>>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name hive.heapsize
>>>>> does not exist
>>>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name
>>>>> hive.server2.enable.impersonation does not exist
>>>>>
>>>>> Logging initialized using configuration in
>>>>> file:/etc/hive/conf/hive-log4j.properties
>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>> explanation.
>>>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>>>> hive dependency:
>>>>> /etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/h
dp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar
>>>>> hbase dependency: /usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar
>>>>> KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
>>>>> KYLIN_DEBUG_SETTINGS is not set, will not enable remote debuging
>>>>> KYLIN_LD_LIBRARY_SETTINGS is not set, lzo compression at MR and hbase
>>>>> might not work
>>>>> A new Kylin instance is started by root, stop it using "kylin.sh stop"
>>>>> Please visit http://<your_sandbox_ip>:7070/kylin to play with the
>>>>> cubes! (Useranme: ADMIN, Password: KYLIN)
>>>>> You can check the log at
>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/logs/kylin.log"
>>>>>
>>>>> ##############################################################################
>>>>>
>>>>> ################################ tomcat/logs/kylin.log
>>>>> ###########################
>>>>> usage: java org.apache.catalina.startup.Catalina [ -config {pathname}
>>>>> ] [ -nonaming ]  { -help | start | stop }
>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.AprLifecycleListener
>>>>> lifecycleEvent
>>>>> INFO: The APR based Apache Tomcat Native library which allows optimal
>>>>> performance in production environments was not found on the
>>>>> java.library.path:
>>>>> :/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>> explanation.
>>>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>>>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>>>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>>>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.Catalina load
>>>>> INFO: Initialization processed in 847 ms
>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardService
>>>>> startInternal
>>>>> INFO: Starting service Catalina
>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardEngine
>>>>> startInternal
>>>>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.59
>>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.HostConfig
>>>>> deployWAR
>>>>> INFO: Deploying web application archive
>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war
>>>>> Dec 07, 2015 11:10:50 AM
>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar]
>>>>> from classloader hierarchy
>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>>>> (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:10:50 AM
>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>> WARNING: Failed to scan
>>>>> [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar] from classloader hierarchy
>>>>> java.io.FileNotFoundException:
>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:10:50 AM
>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>> WARNING: Failed to scan
>>>>> [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar] from classloader hierarchy
>>>>> java.io.FileNotFoundException:
>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:10:50 AM
>>>>> org.apache.tomcat.util.scan.StandardJarScanner scan
>>>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar]
>>>>> from classloader hierarchy
>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>>> (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:10:53 AM org.apache.catalina.startup.ContextConfig
>>>>> processAnnotationsJar
>>>>> SEVERE: contextConfig.jarFile
>>>>> java.io.FileNotFoundException:
>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:10:56 AM org.apache.catalina.startup.ContextConfig
>>>>> processAnnotationsJar
>>>>> SEVERE: contextConfig.jarFile
>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>>>> (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:11:00 AM org.apache.catalina.startup.ContextConfig
>>>>> processAnnotationsJar
>>>>> SEVERE: contextConfig.jarFile
>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>>> (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>>> at
>>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>> processResourceJARs
>>>>> SEVERE: Failed to processes JAR found at URL
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for static resources
>>>>> to be included in context with name
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/]
>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>> processResourceJARs
>>>>> SEVERE: Failed to processes JAR found at URL
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for static resources
>>>>> to be included in context with name
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/]
>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>>> processResourceJARs
>>>>> SEVERE: Failed to processes JAR found at URL
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for static resources to
>>>>> be included in context with name
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/]
>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig
>>>>> tldScanJar
>>>>> WARNING: Failed to process JAR
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for TLD files
>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>>>> (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>> at
>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>> at
>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TaglibUriRule body
>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig
>>>>> tldScanJar
>>>>> WARNING: Failed to process JAR
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>>>> java.io.FileNotFoundException:
>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>> at
>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>> at
>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig
>>>>> tldScanJar
>>>>> WARNING: Failed to process JAR
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>>>> java.io.FileNotFoundException:
>>>>> /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>> at
>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>> at
>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig
>>>>> tldScanJar
>>>>> WARNING: Failed to process JAR
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for TLD files
>>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>>> (No such file or directory)
>>>>> at java.util.zip.ZipFile.open(Native Method)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>>> at
>>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>>> at
>>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>>> at
>>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>>> at
>>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>>> at
>>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>>> at
>>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
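(A side note on the two "Failed to process JAR ... ojdbc6.jar" warnings above: they
only affect Tomcat's TLD scan and are very unlikely to be what keeps the UI blank.
On the HDP sandbox the ojdbc6.jar entries under hadoop/lib and hive/lib are usually
symlinks to an Oracle JDBC driver that is not actually installed, which is exactly
what the FileNotFoundException is complaining about. A minimal check, assuming the
2.2.4.2-2 paths from the stack traces:

for jar in /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar \
           /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar; do
  # A dangling symlink still passes -L, but its target fails -e;
  # that mismatch is what makes Tomcat's TLD scanner log the exception.
  if [ -L "$jar" ] && [ ! -e "$jar" ]; then
    echo "dangling symlink: $jar -> $(readlink "$jar")"
  fi
done

If the loop prints both paths, you can either point the symlinks at a real
ojdbc6.jar or simply ignore the warnings; Kylin does not need the Oracle driver
to start.)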
>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>> explanation.
>>>>> [localhost-startStop-1]:[2015-12-07
>>>>> 11:11:04,251][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>>>>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>>>>> [localhost-startStop-1]:[2015-12-07
>>>>> 11:11:04,297][INFO][org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:177)]
>>>>> - Loading properties file from resource loaded through InputStream
>>>>> [localhost-startStop-1]:[2015-12-07
>>>>> 11:11:04,430][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>>>>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
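(The two KYLIN_CONF warnings right above are expected with the binary package: when
the property is not set, Kylin simply falls back to KYLIN_HOME, as the message
itself says. If you want to make the location explicit, a small sketch, assuming
the binary package keeps its configuration under $KYLIN_HOME/conf:

# Optional: point KYLIN_CONF at the bundled conf directory so the
# fallback lookup mentioned in the warning above is skipped.
export KYLIN_CONF="$KYLIN_HOME/conf"
ls "$KYLIN_CONF/kylin.properties"   # the file Kylin should be loading

Either way, this warning on its own would not explain a blank page at /kylin.)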
>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-2--1, built
>>>>> on 03/31/2015 19:31 GMT
>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:host.name=sandbox.hortonworks.com
>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:java.version=1.7.0_79
>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client
>>>>> environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client
>>>>> environment:java.class.path=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/bootstrap.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/tomcat-juli.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-jdbc.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-tribes.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/annotations-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jsp-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-coyote.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper-el.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat7-websocket.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-fr.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/el-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-dbcp.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ha.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/ecj-4.4.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-es.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-util.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/servlet-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-ja.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ant.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/websocket-api.jar::/usr/hdp/2.2.4.2-2/hbase/conf:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/lib/tools.jar:/usr/hdp/2.2.4.2-2/hbase:/usr/hdp/2.2.4.2-2/hbase/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/asm-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-codec-1.7.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/
hbase/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift.jar:/usr/hdp/2.2.4.2-2/hbase/lib/high-scale-lib-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpcore-4.1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-core-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-json-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-server-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jettison-1.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-2.1-6.1.14.
jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/netty-3.6.6.Final.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/phoenix-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-hbase-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/slf4j-api-1.6.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/zookeeper.jar:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/
htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/./:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/co
mmons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop
-yarn/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-ma
preduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-clie
nt-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//joda-time-2.7.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//aws-java-sdk-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.2.4
.2-2/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpcore-4.2.5.jar::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/hadoop-mapreduce-client/jsp-api-2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/activation-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/junit-4.11.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-io-2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang3-3.3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-recipes-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-server-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jsr305-1.3.9.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-framework-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack.jar:/usr/hdp/current
/hadoop-mapreduce-client/hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-httpclient-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-net-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/current/hadoop-mapreduce-client/jsch-0.1.42.jar:/usr/hdp/current/hadoop-mapreduce-client/xz-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-el-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/servlet-api-2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-runtime-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-json-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jettison-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix.jar:/usr/hdp/current/hadoop-mapreduce-client/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/paranamer-2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-core-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-mapreduce-client/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app.jar:/usr/hdp/current/hadoop-mapreduce-client/htrace-core-3.0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/httpclient-4.2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-compiler-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/joda-time-2.7.jar:/usr/hdp/current/hadoop-mapreduce-client/avro-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-mapreduce-client/xmlenc-0.52.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-digester-1.8.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-cli-1.2.jar:/usr/hdp/current/hadoop-mapreduce-client/aws-java-sdk-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/gson-2.2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-client-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-collections-3.2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-mapreduce-client/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-mapreduce-client/ha
doop-datajoin.jar:/usr/hdp/current/hadoop-mapreduce-client/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen.jar:/usr/hdp/current/hadoop-mapreduce-client/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-archives.jar:/usr/hdp/current/hadoop-mapreduce-client/asm-3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-codec-1.4.jar:/usr/hdp/current/hadoop-mapreduce-client/log4j-1.2.17.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang-2.6.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar:/usr/hdp/current/hadoop-mapreduce-client/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/guava-11.0.2.jar:/usr/hdp/current/hadoop-mapreduce-client/httpcore-4.2.5.jar:/usr/hdp/current/tez-client/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-examples-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-io-2.4.jar:/usr/hdp/current/tez-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections4-4.0.jar:/usr/hdp/current/tez-client/lib/servlet-api-2.5.jar:/usr/hdp/current/tez-client/lib/jsr305-2.0.3.jar:/usr/hdp/current/tez-client/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/jettison-1.3.4.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-cli-1.2.jar:/usr/hdp/current/tez-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/tez-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections-3.2.1.jar:/usr/hdp/current/tez-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/tez-client/lib/commons-codec-1.4.jar:/usr/hdp/current/tez-client/lib/log4j-1.2.17.jar:/usr/hdp/current/tez-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/tez-client/lib/commons-lang-2.6.jar:/usr/hdp/current/tez-client/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/guava-11.0.2.jar:/etc/tez/conf/:/usr/hdp/2.2.4.2-2/tez/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-examples
-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections4-4.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/guava-11.0.2.jar:/etc/tez/conf:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/
lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpcore-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jsoup-1.7.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-settings-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib
/nekohtml-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-profile-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/netty-3.7.0.Final.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-model-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-io-2.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-codec-1.6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpclient-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-project-2.2.1.jar:/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servl
et-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2
.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar:
>>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client
>>>>> environment:java.library.path=:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client
>>>>> environment:java.io.tmpdir=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/temp
>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:os.name=Linux
>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:os.arch=amd64
>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client
>>>>> environment:os.version=2.6.32-504.16.2.el6.x86_64
>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:user.name=root
>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:user.home=/root
>>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Client environment:user.dir=/root
>>>>> 2015-12-07 11:11:04,924 INFO  [localhost-startStop-1]
>>>>> zookeeper.ZooKeeper: Initiating client connection, connectString=
>>>>> sandbox.hortonworks.com:2181 sessionTimeout=30000
>>>>> watcher=hconnection-0x27eeefd2, quorum=sandbox.hortonworks.com:2181,
>>>>> baseZNode=/hbase-unsecure
>>>>> 2015-12-07 11:11:04,949 INFO  [localhost-startStop-1]
>>>>> zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x27eeefd2
>>>>> connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
>>>>> 2015-12-07 11:11:04,976 INFO  [localhost-startStop-1-SendThread(
>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Opening socket
>>>>> connection to server sandbox.hortonworks.com/10.0.2.15:2181. Will not
>>>>> attempt to authenticate using SASL (unknown error)
>>>>> 2015-12-07 11:11:04,993 INFO  [localhost-startStop-1-SendThread(
>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Socket
>>>>> connection established to sandbox.hortonworks.com/10.0.2.15:2181,
>>>>> initiating session
>>>>> 2015-12-07 11:11:05,000 INFO  [localhost-startStop-1-SendThread(
>>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Session
>>>>> establishment complete on server
>>>>> sandbox.hortonworks.com/10.0.2.15:2181, sessionid =
>>>>> 0x1517c12f0f0000b, negotiated timeout = 30000
>>>>> 2015-12-07 11:11:05,699 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>> Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@784cfcf2,
>>>>> compressor=null, tcpKeepAlive=true, tcpNoDelay=true,
>>>>> minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind
>>>>> address=null
>>>>> 2015-12-07 11:11:05,840 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>> Use SIMPLE authentication for service MasterService, sasl=false
>>>>> 2015-12-07 11:11:05,851 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>>> Connecting to sandbox.hortonworks.com/10.0.2.15:60000
>>>>> [localhost-startStop-1]:[2015-12-07
>>>>> 11:11:05,880][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>>>>> - Context initialization failed
>>>>> org.springframework.beans.factory.BeanCreationException: Error
>>>>> creating bean with name
>>>>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>>>>> Cannot resolve reference to bean
>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>> resolve reference to bean
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>> Cannot create inner bean '(inner bean)' of type
>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>> while setting constructor argument with key [0]; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>> type
>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
>>>>> at
>>>>> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
>>>>> at
>>>>> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
>>>>> at
>>>>> org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
>>>>> at
>>>>> org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
>>>>> at
>>>>> org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5016)
>>>>> at
>>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5524)
>>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>>> at
>>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>>> at
>>>>> org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>>> at
>>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name
>>>>> 'org.springframework.cache.config.internalCacheAdvisor': Cannot resolve
>>>>> reference to bean
>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>> resolve reference to bean
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>> Cannot create inner bean '(inner bean)' of type
>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>> while setting constructor argument with key [0]; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>> type
>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>>>> at
>>>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>>>> at
>>>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>>>> at
>>>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>>>> at
>>>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>>>> at
>>>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>>>> ... 23 more
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name
>>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>> resolve reference to bean
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>> Cannot create inner bean '(inner bean)' of type
>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>> while setting constructor argument with key [0]; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>> type
>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>> ... 40 more
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name
>>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>>> resolve reference to bean
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>> Cannot create inner bean '(inner bean)' of type
>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>> while setting constructor argument with key [0]; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>> type
>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>>>> at
>>>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>>>> at
>>>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>>>> at
>>>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>>>> at
>>>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>>>> at
>>>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>>>> ... 45 more
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name
>>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>>> Cannot create inner bean '(inner bean)' of type
>>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>>> while setting constructor argument with key [0]; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>>> type
>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:353)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:153)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>> ... 64 more
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name '(inner bean)': Cannot create inner bean
>>>>> '(inner bean)' of type
>>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'expressionHandler' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>>>> ... 78 more
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name '(inner bean)': Cannot resolve reference to
>>>>> bean 'expressionHandler' while setting constructor argument; nested
>>>>> exception is org.springframework.beans.factory.BeanCreationException: Error
>>>>> creating bean with name 'expressionHandler' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>>>> ... 86 more
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name 'expressionHandler' defined in class path
>>>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean
>>>>> 'permissionEvaluator' while setting bean property 'permissionEvaluator';
>>>>> nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>>> setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>> ... 94 more
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name 'permissionEvaluator' defined in class path
>>>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService'
>>>>> while setting constructor argument; nested exception is
>>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>>> bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>>> at
>>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>> ... 104 more
>>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>>> Error creating bean with name 'aclService' defined in file
>>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>>> Instantiation of bean failed; nested exception is
>>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>>> exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:997)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:943)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>>> at
>>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>>> at
>>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>>> ... 116 more
>>>>> Caused by: org.springframework.beans.BeanInstantiationException: Could
>>>>> not instantiate bean class [org.apache.kylin.rest.service.AclService]:
>>>>> Constructor threw exception; nested exception is
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at
>>>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
>>>>> at
>>>>> org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
>>>>> at
>>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
>>>>> ... 124 more
>>>>> Caused by: org.apache.hadoop.hbase.security.AccessDeniedException:
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>> Method)
>>>>> at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>> at
>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>> at
>>>>> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>>>> at
>>>>> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:230)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:244)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3390)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:408)
>>>>> at
>>>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:95)
>>>>> at
>>>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:86)
>>>>> at org.apache.kylin.rest.service.AclService.<init>(AclService.java:127)
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>> Method)
>>>>> at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>> at
>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>> at
>>>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
>>>>> ... 126 more
>>>>> Caused by:
>>>>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException):
>>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>>> at
>>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>>> at
>>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1538)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1724)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1777)
>>>>> at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.getTableDescriptors(MasterProtos.java:42525)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$5.getTableDescriptors(ConnectionManager.java:2165)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:414)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:409)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>>>>> ... 136 more
>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>>>> startInternal
>>>>> SEVERE: Error listenerStart
>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>>>> startInternal
>>>>> SEVERE: Context [/kylin] startup failed due to previous errors
>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>>> clearReferencesThreads
>>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>>> named [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)]
>>>>> but has failed to stop it. This is very likely to create a memory leak.
>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>>> clearReferencesThreads
>>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>>> named [localhost-startStop-1-EventThread] but has failed to stop it. This
>>>>> is very likely to create a memory leak.
>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>>> clearReferencesThreads
>>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>>> named [Thread-6] but has failed to stop it. This is very likely to create a
>>>>> memory leak.
>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>>> clearReferencesThreads
>>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>>> named [IPC Client (514096504) connection to
>>>>> sandbox.hortonworks.com/10.0.2.15:60000 from root] but has failed to
>>>>> stop it. This is very likely to create a memory leak.
>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.HostConfig
>>>>> deployWAR
>>>>> INFO: Deployment of web application archive
>>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war has
>>>>> finished in 15,925 ms
>>>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>>>> INFO: Starting ProtocolHandler ["http-bio-7070"]
>>>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>>>> INFO: Starting ProtocolHandler ["ajp-bio-9009"]
>>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.Catalina start
>>>>> INFO: Server startup in 15987 ms"
>>>>>
>>>>> ##############################################################################
>>>>>
>>>>> What am I missing?
>>>>>
>>>>> Kind regards
>>>>> Veli K. Celik
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Thanks and Regards
>>>>
>>>> Sudeep Dey
>>>> Zaloni,Inc. | www.zaloni.com
>>>> 633 Davis Drive, Suite 450
>>>> Durham, NC 27713
>>>> e: sdey@zaloni.com
>>>>
>>>>
>>>> "This e-mail, including attachments, may include confidential and/or
>>>> proprietary information, and may be used only by the person or entity
>>>> to which it is addressed. If the reader of this e-mail is not the
>>>> intended
>>>> recipient or his or her authorized agent, the reader is hereby notified
>>>> that any dissemination, distribution or copying of this e-mail is
>>>> prohibited. If you have received this e-mail in error, please notify the
>>>> sender by replying to this message and delete this e-mail immediately."
>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> Kind regards
>>> Veli K. Celik
>>>
>>
>>
>>
>> --
>> Kind regards
>> Veli K. Celik
>>
>
>
>
> --
> Kind regards
> Veli K. Celik
>



-- 
Kind regards
Veli K. Celik

Re: Kylin does not start correctly (detailed and logs included)

Posted by Veli Kerim Celik <vk...@gmail.com>.
By the way, I have also run the script bin/check-env.sh; it did not report any
errors.
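
For reference, this is roughly how I invoked it (a minimal sketch; the paths
assume the KYLIN_HOME setup from my first mail):

############## running check-env.sh ############
# run from the Kylin install directory set in .bash_profile
cd $KYLIN_HOME
bin/check-env.sh    # finished without reporting any errors
################################################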

I also checked whether port forwarding works by running the following
command on my host machine. The forwarding itself works fine; the request
reaches Tomcat (note the Apache-Coyote header), even though /kylin returns 404.

############## check port forwarding ############
veli@cdev ~ $ curl -I http://localhost:7070/kylin
HTTP/1.1 404 Not Found
Server: Apache-Coyote/1.1
Content-Length: 0
Date: Tue, 08 Dec 2015 07:48:19 GMT
############################################
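
To see why the context fails to come up, I also keep an eye on the Tomcat log
while restarting Kylin (the log path is the one printed by kylin.sh, quoted
further down):

############## watching the Kylin log ############
# follow the webapp log during "kylin.sh stop" / "kylin.sh start"
tail -f $KYLIN_HOME/tomcat/logs/kylin.log
##################################################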

At the moment both hbase.security.authorization and dfs.permissions.enabled
are set to false (in Ambari). But HBase still says "Insufficient permissions
for user 'root (auth:SIMPLE)'..." in the log file
(bin/tomcat/logs/kylin.log).
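
For what it is worth, one workaround I am considering is to grant 'root'
explicit rights on the Kylin metadata tables from the HBase shell. This is
untested here, and it assumes the Ranger/XaSecure coprocessor from the stack
trace respects native HBase grants:

############## possible workaround (untested) ############
# grant root full rights (Read/Write/eXecute/Create/Admin)
# and list permissions on the table named in the exception
# (the second command only works if the table already exists)
hbase shell <<'EOF'
grant 'root', 'RWXCA'
user_permission 'kylin_metadata_acl'
exit
EOF
###########################################################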



2015-12-07 17:53 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:

> I furthermore tried to disable HBase authorization
> (set hbase.security.authorization to false) and restarted HBase and Kylin.
> It did not get rid of the exception.
>
> 2015-12-07 17:28 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:
>
>> I am doing port forwarding through ssh (i.e. "ssh -L 7070:localhost:7070
>> root@127.0.0.1 -p 2222"). It seems to be working.
>>
>> I have downloaded the file ojdbc6.jar (from
>> http://www.oracle.com/technetwork/database/enterprise-edition/jdbc-112010-090769.html)
>> and put it at the path "/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar". The
>> lines with java.io.FileNotFoundException are gone. Nice. Thanks.
>>
>> But now I am getting some new exceptions like:
>>
>> ########################## exception start
>> #################################
>> [localhost-startStop-1]:[2015-12-07
>> 15:49:14,751][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>> - Context initialization failed
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>> BeanPostProcessor before instantiation of bean failed; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>> Cannot resolve reference to bean
>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>> while setting bean property 'cacheOperationSource'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>> BeanPostProcessor before instantiation of bean failed; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>> resolve reference to bean
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>> Cannot create inner bean '(inner bean)' of type
>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>> while setting constructor argument with key [0]; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>> type
>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot resolve reference to bean
>> 'expressionHandler' while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> ########################################################################
>>
>> I disabled HDFS permissions (set dfs.permissions.enabled to false), and
>> restarted HDFS and Kylin, but it did not get rid of the exception.
>>
>> Kind regards Veli
>>
>>
>> 2015-12-07 15:29 GMT+01:00 Sudeep Dey <sd...@zaloni.com>:
>>
>>> Hi Veli,
>>>
>>> You need to put the downloaded ojdbc6.jar at this location:
>>> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar. Moreover, to access Kylin from the
>>> host machine, you need to set up port forwarding for port 7070.
>>>
>>> Regards
>>>
>>> Sudeep
>>>
>>> On Mon, Dec 7, 2015 at 7:54 AM, Veli Kerim Celik <vk...@gmail.com>
>>> wrote:
>>>
>>>> Hello
>>>>
>>>> I have downloaded Hortonworks HDP Sandbox version 2.2.4.2 for
>>>> VirtualBox (filename "Sandbox_HDP_2.2.4.2_VirtualBox.ova") and imported it
>>>> into VirtualBox.
>>>>
>>>> I have assinged 4 CPU cores and 12 gigabyte of RAM to the virtual
>>>> machine.
>>>>
>>>> After booting it up I login to Ambari at http://localhost:8080/ (from
>>>> host machine) and start up HBase. HBase starts up without any problems.
>>>>
>>>> I then ssh into the virtual machine using the following command: "ssh
>>>> -L 7070:localhost:7070 root@127.0.0.1 -p 2222"
>>>>
>>>> I then download Kylin binary release from "
>>>> https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz"
>>>> and extract into the directory /root/bin.
>>>>
>>>> I then change .bash_profile so it looks like this:
>>>> ############################ /root/.bash_profile
>>>> ###############################
>>>> # .bash_profile
>>>>
>>>> # Get the aliases and functions
>>>> if [ -f ~/.bashrc ]; then
>>>>         . ~/.bashrc
>>>> fi
>>>>
>>>> # User specific environment and startup programs
>>>>
>>>> KYLIN_HOME=$HOME/bin/apache-kylin-1.1.1-incubating-bin
>>>> export KYLIN_HOME
>>>>
>>>> PATH=$PATH:$HOME/bin:$KYLIN_HOME/bin
>>>>
>>>> export PATH
>>>>
>>>> ##############################################################################
>>>>
>>>> I then start Kylin up by using the command: "kylin.sh start". I then
>>>> try to access Kylin through http://localhost:7070/kylin (from host
>>>> machine) and get a blank page (eg. not 404).
>>>>
>>>> I get the following output from kylin.sh start and tomcat log:
>>>>
>>>> ########################## kylin.sh start output
>>>> ###################################
>>>> root@sandbox ~]# kylin.sh start
>>>> KYLIN_HOME is set to /root/bin/apache-kylin-1.1.1-incubating-bin
>>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name hive.heapsize
>>>> does not exist
>>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name
>>>> hive.server2.enable.impersonation does not exist
>>>>
>>>> Logging initialized using configuration in
>>>> file:/etc/hive/conf/hive-log4j.properties
>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>> SLF4J: Found binding in
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>> explanation.
>>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>>> hive dependency:
>>>> /etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hd
p/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar
>>>> hbase dependency: /usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar
>>>> KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
>>>> KYLIN_DEBUG_SETTINGS is not set, will not enable remote debuging
>>>> KYLIN_LD_LIBRARY_SETTINGS is not set, lzo compression at MR and hbase
>>>> might not work
>>>> A new Kylin instance is started by root, stop it using "kylin.sh stop"
>>>> Please visit http://<your_sandbox_ip>:7070/kylin to play with the
>>>> cubes! (Useranme: ADMIN, Password: KYLIN)
>>>> You can check the log at
>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/logs/kylin.log"
>>>>
>>>> ##############################################################################
>>>>
>>>> ################################ tomcat/logs/kylin.log
>>>> ###########################
>>>> usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ]
>>>> [ -nonaming ]  { -help | start | stop }
>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.AprLifecycleListener
>>>> lifecycleEvent
>>>> INFO: The APR based Apache Tomcat Native library which allows optimal
>>>> performance in production environments was not found on the
>>>> java.library.path:
>>>> :/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>> SLF4J: Found binding in
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>> explanation.
>>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.Catalina load
>>>> INFO: Initialization processed in 847 ms
>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardService
>>>> startInternal
>>>> INFO: Starting service Catalina
>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardEngine
>>>> startInternal
>>>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.59
>>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.HostConfig
>>>> deployWAR
>>>> INFO: Deploying web application archive
>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war
>>>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>>>> scan
>>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar]
>>>> from classloader hierarchy
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>>>> scan
>>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
>>>> from classloader hierarchy
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>>>> scan
>>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
>>>> from classloader hierarchy
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>>>> scan
>>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar]
>>>> from classloader hierarchy
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:10:53 AM org.apache.catalina.startup.ContextConfig
>>>> processAnnotationsJar
>>>> SEVERE: contextConfig.jarFile
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:10:56 AM org.apache.catalina.startup.ContextConfig
>>>> processAnnotationsJar
>>>> SEVERE: contextConfig.jarFile
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:11:00 AM org.apache.catalina.startup.ContextConfig
>>>> processAnnotationsJar
>>>> SEVERE: contextConfig.jarFile
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>>> at
>>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>> processResourceJARs
>>>> SEVERE: Failed to processes JAR found at URL
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for static resources
>>>> to be included in context with name
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/]
>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>> processResourceJARs
>>>> SEVERE: Failed to processes JAR found at URL
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for static resources
>>>> to be included in context with name
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/]
>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>>> processResourceJARs
>>>> SEVERE: Failed to processes JAR found at URL
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for static resources to
>>>> be included in context with name
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/]
>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig
>>>> tldScanJar
>>>> WARNING: Failed to process JAR
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for TLD files
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>> at
>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>> at
>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TaglibUriRule body
>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig
>>>> tldScanJar
>>>> WARNING: Failed to process JAR
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> at java.util.zip.ZipFile.open(Native Method)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>>> at
>>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>>> at
>>>> org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>>> at
>>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>>> at
>>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>>> at
>>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>>> at
>>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>>> at
>>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig
>>>> tldScanJar
>>>> WARNING: Failed to process JAR
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>>> [... same java.io.FileNotFoundException and stack trace as above, snipped ...]
>>>>
>>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig
>>>> tldScanJar
>>>> WARNING: Failed to process JAR
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for TLD files
>>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>>> (No such file or directory)
>>>> [... same stack trace as above, snipped ...]
>>>>
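
The TLD warnings above all point at the same thing: Tomcat's jar scanner walks the Hadoop/Hive classpath and hits ojdbc6.jar entries that do not exist on disk (on the sandbox these are typically dangling symlinks, since the Oracle JDBC driver is not bundled). A quick check, assuming the HDP paths shown in the log:

# Hypothetical check: confirm the two jars Tomcat complains about are
# dangling symlinks rather than real files.
ls -l /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar

If they point to a missing target, these WARNINGs are expected noise; they do not, by themselves, stop the Kylin webapp from deploying.
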
>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>> SLF4J: Found binding in
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>>> [jar:file:/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>> explanation.
>>>> [localhost-startStop-1]:[2015-12-07
>>>> 11:11:04,251][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>>>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>>>> [localhost-startStop-1]:[2015-12-07
>>>> 11:11:04,297][INFO][org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:177)]
>>>> - Loading properties file from resource loaded through InputStream
>>>> [localhost-startStop-1]:[2015-12-07
>>>> 11:11:04,430][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>>>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
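
The KYLIN_CONF warning is informational: as the message says, when the property is unset Kylin falls back to the KYLIN_HOME environment variable. If you want to silence it, one option (assuming the default layout of the binary package, where kylin.properties lives under conf/) is to export the property before starting:

# Assumption: conf/ under KYLIN_HOME holds kylin.properties in this package.
export KYLIN_CONF=$KYLIN_HOME/conf

This is optional and unlikely to be related to the startup problem itself.
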
>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-2--1, built
>>>> on 03/31/2015 19:31 GMT
>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:host.name=sandbox.hortonworks.com
>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:java.version=1.7.0_79
>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client
>>>> environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client
>>>> environment:java.class.path=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/bootstrap.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/tomcat-juli.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-jdbc.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-tribes.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/annotations-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jsp-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-coyote.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper-el.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat7-websocket.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-fr.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/el-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-dbcp.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ha.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/ecj-4.4.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-es.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-util.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/servlet-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-ja.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ant.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/websocket-api.jar::/usr/hdp/2.2.4.2-2/hbase/conf:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/lib/tools.jar:/usr/hdp/2.2.4.2-2/hbase:/usr/hdp/2.2.4.2-2/hbase/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/asm-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-codec-1.7.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/h
base/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift.jar:/usr/hdp/2.2.4.2-2/hbase/lib/high-scale-lib-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpcore-4.1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-core-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-json-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-server-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jettison-1.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-2.1-6.1.14.j
ar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/netty-3.6.6.Final.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/phoenix-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-hbase-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/slf4j-api-1.6.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/zookeeper.jar:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/h
trace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/./:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/com
mons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-
yarn/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-map
reduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-clien
t-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//joda-time-2.7.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//aws-java-sdk-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.2.4.
2-2/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpcore-4.2.5.jar::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/hadoop-mapreduce-client/jsp-api-2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/activation-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/junit-4.11.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-io-2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang3-3.3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-recipes-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-server-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jsr305-1.3.9.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-framework-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack.jar:/usr/hdp/current/
hadoop-mapreduce-client/hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-httpclient-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-net-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/current/hadoop-mapreduce-client/jsch-0.1.42.jar:/usr/hdp/current/hadoop-mapreduce-client/xz-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-el-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/servlet-api-2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-runtime-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-json-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jettison-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix.jar:/usr/hdp/current/hadoop-mapreduce-client/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/paranamer-2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-core-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-mapreduce-client/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app.jar:/usr/hdp/current/hadoop-mapreduce-client/htrace-core-3.0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/httpclient-4.2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-compiler-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/joda-time-2.7.jar:/usr/hdp/current/hadoop-mapreduce-client/avro-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-mapreduce-client/xmlenc-0.52.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-digester-1.8.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-cli-1.2.jar:/usr/hdp/current/hadoop-mapreduce-client/aws-java-sdk-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/gson-2.2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-client-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-collections-3.2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-mapreduce-client/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-mapreduce-client/had
oop-datajoin.jar:/usr/hdp/current/hadoop-mapreduce-client/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen.jar:/usr/hdp/current/hadoop-mapreduce-client/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-archives.jar:/usr/hdp/current/hadoop-mapreduce-client/asm-3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-codec-1.4.jar:/usr/hdp/current/hadoop-mapreduce-client/log4j-1.2.17.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang-2.6.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar:/usr/hdp/current/hadoop-mapreduce-client/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/guava-11.0.2.jar:/usr/hdp/current/hadoop-mapreduce-client/httpcore-4.2.5.jar:/usr/hdp/current/tez-client/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-examples-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-io-2.4.jar:/usr/hdp/current/tez-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections4-4.0.jar:/usr/hdp/current/tez-client/lib/servlet-api-2.5.jar:/usr/hdp/current/tez-client/lib/jsr305-2.0.3.jar:/usr/hdp/current/tez-client/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/jettison-1.3.4.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-cli-1.2.jar:/usr/hdp/current/tez-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/tez-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections-3.2.1.jar:/usr/hdp/current/tez-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/tez-client/lib/commons-codec-1.4.jar:/usr/hdp/current/tez-client/lib/log4j-1.2.17.jar:/usr/hdp/current/tez-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/tez-client/lib/commons-lang-2.6.jar:/usr/hdp/current/tez-client/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/guava-11.0.2.jar:/etc/tez/conf/:/usr/hdp/2.2.4.2-2/tez/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-examples-
0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections4-4.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/guava-11.0.2.jar:/etc/tez/conf:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/l
ib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpcore-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jsoup-1.7.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-settings-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/
nekohtml-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-profile-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/netty-3.7.0.Final.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-model-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-io-2.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-codec-1.6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpclient-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-project-2.2.1.jar:/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servle
t-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.
4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar:
>>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client
>>>> environment:java.library.path=:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client
>>>> environment:java.io.tmpdir=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/temp
>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:os.name=Linux
>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:os.arch=amd64
>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client
>>>> environment:os.version=2.6.32-504.16.2.el6.x86_64
>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:user.name=root
>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:user.home=/root
>>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Client environment:user.dir=/root
>>>> 2015-12-07 11:11:04,924 INFO  [localhost-startStop-1]
>>>> zookeeper.ZooKeeper: Initiating client connection, connectString=
>>>> sandbox.hortonworks.com:2181 sessionTimeout=30000
>>>> watcher=hconnection-0x27eeefd2, quorum=sandbox.hortonworks.com:2181,
>>>> baseZNode=/hbase-unsecure
>>>> 2015-12-07 11:11:04,949 INFO  [localhost-startStop-1]
>>>> zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x27eeefd2
>>>> connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
>>>> 2015-12-07 11:11:04,976 INFO  [localhost-startStop-1-SendThread(
>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Opening socket
>>>> connection to server sandbox.hortonworks.com/10.0.2.15:2181. Will not
>>>> attempt to authenticate using SASL (unknown error)
>>>> 2015-12-07 11:11:04,993 INFO  [localhost-startStop-1-SendThread(
>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Socket connection
>>>> established to sandbox.hortonworks.com/10.0.2.15:2181, initiating
>>>> session
>>>> 2015-12-07 11:11:05,000 INFO  [localhost-startStop-1-SendThread(
>>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Session
>>>> establishment complete on server sandbox.hortonworks.com/10.0.2.15:2181,
>>>> sessionid = 0x1517c12f0f0000b, negotiated timeout = 30000
>>>> 2015-12-07 11:11:05,699 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>> Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@784cfcf2,
>>>> compressor=null, tcpKeepAlive=true, tcpNoDelay=true,
>>>> minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind
>>>> address=null
>>>> 2015-12-07 11:11:05,840 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>> Use SIMPLE authentication for service MasterService, sasl=false
>>>> 2015-12-07 11:11:05,851 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>>> Connecting to sandbox.hortonworks.com/10.0.2.15:60000
>>>> [localhost-startStop-1]:[2015-12-07
>>>> 11:11:05,880][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>>>> - Context initialization failed
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>>>> Cannot resolve reference to bean
>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>> resolve reference to bean
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>> Cannot create inner bean '(inner bean)' of type
>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>> while setting constructor argument with key [0]; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>> type
>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'expressionHandler' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
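The root cause repeated throughout the trace above is an HBase-level denial: the Ranger (XaSecure) authorization coprocessor rejects the ADMIN action on the kylin_metadata_acl table for the user 'root (auth:SIMPLE)' that Kylin is running as. One possible way to unblock this, assuming Ranger authorization is what is refusing the request, might be to give that user full rights on the Kylin metadata tables, either by adding an HBase policy for 'root' in the Ranger admin UI or, as a rough sketch, from the HBase shell:

# Sketch only, not part of the original log: grant the user running Kylin
# ('root' here, taken from the error message) full rights on the table named
# in the exception; other kylin_metadata* tables may need the same treatment.
# With the Ranger coprocessor enforcing access, the equivalent grant may have
# to be expressed as a Ranger HBase policy instead of a shell grant.
hbase shell <<'EOF'
grant 'root', 'RWXCA', 'kylin_metadata_acl'
EOF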
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
>>>> at
>>>> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
>>>> at
>>>> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
>>>> at
>>>> org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
>>>> at
>>>> org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
>>>> at
>>>> org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
>>>> at
>>>> org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5016)
>>>> at
>>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5524)
>>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>>> at
>>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>>> at
>>>> org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>>> at
>>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name
>>>> 'org.springframework.cache.config.internalCacheAdvisor': Cannot resolve
>>>> reference to bean
>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>>> while setting bean property 'cacheOperationSource'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>> resolve reference to bean
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>> Cannot create inner bean '(inner bean)' of type
>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>> while setting constructor argument with key [0]; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>> type
>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'expressionHandler' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>>> at
>>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>>> at
>>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>>> at
>>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>>> at
>>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>>> at
>>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>>> ... 23 more
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name
>>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>> resolve reference to bean
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>> Cannot create inner bean '(inner bean)' of type
>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>> while setting constructor argument with key [0]; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>> type
>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'expressionHandler' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>> ... 40 more
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name
>>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>>> resolve reference to bean
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>> Cannot create inner bean '(inner bean)' of type
>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>> while setting constructor argument with key [0]; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>> type
>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'expressionHandler' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>>> at
>>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>>> at
>>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>>> at
>>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>>> at
>>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>>> at
>>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>>> ... 45 more
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name
>>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>>> Cannot create inner bean '(inner bean)' of type
>>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>>> while setting constructor argument with key [0]; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>>> type
>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'expressionHandler' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:353)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:153)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>> ... 64 more
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name '(inner bean)': Cannot create inner bean
>>>> '(inner bean)' of type
>>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>>> 'expressionHandler' while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'expressionHandler' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>>> ... 78 more
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name '(inner bean)': Cannot resolve reference to
>>>> bean 'expressionHandler' while setting constructor argument; nested
>>>> exception is org.springframework.beans.factory.BeanCreationException: Error
>>>> creating bean with name 'expressionHandler' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>>> while setting bean property 'permissionEvaluator'; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>>> ... 86 more
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name 'expressionHandler' defined in class path
>>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean
>>>> 'permissionEvaluator' while setting bean property 'permissionEvaluator';
>>>> nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'permissionEvaluator' defined in class path resource
>>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>>> setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>> ... 94 more
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name 'permissionEvaluator' defined in class path
>>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService'
>>>> while setting constructor argument; nested exception is
>>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>>> bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>>> at
>>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>> ... 104 more
>>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>>> Error creating bean with name 'aclService' defined in file
>>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>>> Instantiation of bean failed; nested exception is
>>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>>> exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:997)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:943)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>>> at
>>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>>> at
>>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>>> ... 116 more
>>>> Caused by: org.springframework.beans.BeanInstantiationException: Could
>>>> not instantiate bean class [org.apache.kylin.rest.service.AclService]:
>>>> Constructor threw exception; nested exception is
>>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
>>>> at
>>>> org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
>>>> at
>>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
>>>> ... 124 more
>>>> Caused by: org.apache.hadoop.hbase.security.AccessDeniedException:
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>> at
>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>> at
>>>> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>>> at
>>>> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>>> at
>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:230)
>>>> at
>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:244)
>>>> at
>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
>>>> at
>>>> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3390)
>>>> at
>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:408)
>>>> at
>>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:95)
>>>> at
>>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:86)
>>>> at org.apache.kylin.rest.service.AclService.<init>(AclService.java:127)
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>> at
>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>> at
>>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
>>>> ... 126 more
>>>> Caused by:
>>>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException):
>>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>>> tableName:kylin_metadata_acl, family:null,column: null
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>>> at
>>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>> at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1538)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1724)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1777)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.getTableDescriptors(MasterProtos.java:42525)
>>>> at
>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$5.getTableDescriptors(ConnectionManager.java:2165)
>>>> at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:414)
>>>> at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:409)
>>>> at
>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>>>> ... 136 more
>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>>> startInternal
>>>> SEVERE: Error listenerStart
>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>>> startInternal
>>>> SEVERE: Context [/kylin] startup failed due to previous errors
>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>> clearReferencesThreads
>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>> named [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)]
>>>> but has failed to stop it. This is very likely to create a memory leak.
>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>> clearReferencesThreads
>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>> named [localhost-startStop-1-EventThread] but has failed to stop it. This
>>>> is very likely to create a memory leak.
>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>> clearReferencesThreads
>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>> named [Thread-6] but has failed to stop it. This is very likely to create a
>>>> memory leak.
>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>>> clearReferencesThreads
>>>> SEVERE: The web application [/kylin] appears to have started a thread
>>>> named [IPC Client (514096504) connection to
>>>> sandbox.hortonworks.com/10.0.2.15:60000 from root] but has failed to
>>>> stop it. This is very likely to create a memory leak.
>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.HostConfig
>>>> deployWAR
>>>> INFO: Deployment of web application archive
>>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war has
>>>> finished in 15,925 ms
>>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>>> INFO: Starting ProtocolHandler ["http-bio-7070"]
>>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>>> INFO: Starting ProtocolHandler ["ajp-bio-9009"]
>>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.Catalina start
>>>> INFO: Server startup in 15987 ms"
>>>>
>>>> ##############################################################################
>>>>
>>>> What am I missing?
>>>>
>>>> Kind regards
>>>> Veli K. Celik
>>>>
>>>
>>>
>>>
>>> --
>>> Thanks and Regards
>>>
>>> Sudeep Dey
>>> Zaloni,Inc. | www.zaloni.com
>>> 633 Davis Drive, Suite 450
>>> Durham, NC 27713
>>> e: s <jb...@zaloni.com>dey@zaloni.com
>>>
>>>
>>> "This e-mail, including attachments, may include confidential and/or
>>> proprietary information, and may be used only by the person or entity
>>> to which it is addressed. If the reader of this e-mail is not the
>>> intended
>>> recipient or his or her authorized agent, the reader is hereby notified
>>> that any dissemination, distribution or copying of this e-mail is
>>> prohibited. If you have received this e-mail in error, please notify the
>>> sender by replying to this message and delete this e-mail immediately."
>>>
>>>
>>>
>>
>>
>> --
>> Med venlig hilsen
>> Veli K. Celik
>>
>
>
>
> --
> Med venlig hilsen
> Veli K. Celik
>



-- 
Med venlig hilsen
Veli K. Celik

Re: Kylin does not start correctly (detailed and logs included)

Posted by Veli Kerim Celik <vk...@gmail.com>.
I furthermore tried to disable HBase authorization (by setting
hbase.security.authorization to false) and restarted HBase and Kylin.
It did not get rid of the exception.
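
For reference, the flag I changed lives in hbase-site.xml (Ambari: HBase > Configs). Since
the stack trace points at the Ranger/XaSecure coprocessor rather than native HBase ACLs,
another option I have not verified is granting the root user global HBase rights from the
hbase shell; a minimal sketch, assuming native grants are still honored alongside Ranger:

########################## hbase shell grant (sketch) #########################
# Run on the sandbox; RWXCA = read, write, execute, create, admin.
echo "grant 'root', 'RWXCA'" | hbase shell
##############################################################################

If Ranger is the component actually enforcing here, the equivalent step would presumably
be a Ranger policy giving 'root' admin rights on the kylin_metadata_acl table seen in
the trace.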

2015-12-07 17:28 GMT+01:00 Veli Kerim Celik <vk...@gmail.com>:

> I am doing port forwarding through ssh (i.e. "ssh -L 7070:localhost:7070
> root@127.0.0.1 -p 2222"). It seems to be working.
>
> I have downloaded the file ojdbc6.jar (from
> http://www.oracle.com/technetwork/database/enterprise-edition/jdbc-112010-090769.html)
> and put it at the path "/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar". The
> lines with java.io.FileNotFoundException are gone. Nice. Thanks.
>
> But now I am getting some new exceptions like:
>
> ########################## exception start
> #################################
> [localhost-startStop-1]:[2015-12-07
> 15:49:14,751][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
> - Context initialization failed
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
> BeanPostProcessor before instantiation of bean failed; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
> Cannot resolve reference to bean
> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
> while setting bean property 'cacheOperationSource'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
> BeanPostProcessor before instantiation of bean failed; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
> resolve reference to bean
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
> Cannot create inner bean '(inner bean)' of type
> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
> while setting constructor argument with key [0]; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
> type
> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot resolve reference to bean
> 'expressionHandler' while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> ########################################################################
>
> I disabled HDFS permissions (set dfs.permissions.enabled to false), and
> restarted HDFS and Kylin, but it did not get rid of the exception.
>
> Kind regards Veli
>
>
> 2015-12-07 15:29 GMT+01:00 Sudeep Dey <sd...@zaloni.com>:
>
>> Hi Veli,
>>
>> You need to put the downloaded ojdbc6.jar into this location:
>> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar. Moreover, to access Kylin from the
>> host machine you need to do port forwarding for port 7070.
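>>
>> A rough sketch of the copy step (assuming the jar was downloaded to /root; the
>> hadoop and hbase lib paths are the other two locations the Tomcat log complains
>> about, so the same jar can be dropped there as well):
>>
>> ######################## copy ojdbc6.jar (sketch) ###########################
>> cp /root/ojdbc6.jar /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>> cp /root/ojdbc6.jar /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>> cp /root/ojdbc6.jar /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>> ##############################################################################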
>>
>> Regards
>>
>> Sudeep
>>
>> On Mon, Dec 7, 2015 at 7:54 AM, Veli Kerim Celik <vk...@gmail.com>
>> wrote:
>>
>>> Hello
>>>
>>> I have downloaded Hortonworks HDP Sandbox version 2.2.4.2 for VirtualBox
>>> (filename "Sandbox_HDP_2.2.4.2_VirtualBox.ova") and imported it into
>>> VirtualBox.
>>>
>>> I have assinged 4 CPU cores and 12 gigabyte of RAM to the virtual
>>> machine.
>>>
>>> After booting it up I login to Ambari at http://localhost:8080/ (from
>>> host machine) and start up HBase. HBase starts up without any problems.
>>>
>>> I then ssh into the virtual machine using the following command: "ssh -L
>>> 7070:localhost:7070 root@127.0.0.1 -p 2222"
>>>
>>> I then download Kylin binary release from "
>>> https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz"
>>> and extract into the directory /root/bin.
>>>
>>> I then change .bash_profile so it looks like this:
>>> ############################ /root/.bash_profile
>>> ###############################
>>> # .bash_profile
>>>
>>> # Get the aliases and functions
>>> if [ -f ~/.bashrc ]; then
>>>         . ~/.bashrc
>>> fi
>>>
>>> # User specific environment and startup programs
>>>
>>> KYLIN_HOME=$HOME/bin/apache-kylin-1.1.1-incubating-bin
>>> export KYLIN_HOME
>>>
>>> PATH=$PATH:$HOME/bin:$KYLIN_HOME/bin
>>>
>>> export PATH
>>>
>>> ##############################################################################
>>>
>>> I then start Kylin up by using the command: "kylin.sh start". I then try
>>> to access Kylin through http://localhost:7070/kylin (from host machine)
>>> and get a blank page (eg. not 404).
>>>
>>> I get the following output from kylin.sh start and tomcat log:
>>>
>>> ########################## kylin.sh start output
>>> ###################################
>>> root@sandbox ~]# kylin.sh start
>>> KYLIN_HOME is set to /root/bin/apache-kylin-1.1.1-incubating-bin
>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name hive.heapsize
>>> does not exist
>>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name
>>> hive.server2.enable.impersonation does not exist
>>>
>>> Logging initialized using configuration in
>>> file:/etc/hive/conf/hive-log4j.properties
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> hive dependency:
>>> /etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp
/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar
>>> hbase dependency: /usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar
>>> KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
>>> KYLIN_DEBUG_SETTINGS is not set, will not enable remote debuging
>>> KYLIN_LD_LIBRARY_SETTINGS is not set, lzo compression at MR and hbase
>>> might not work
>>> A new Kylin instance is started by root, stop it using "kylin.sh stop"
>>> Please visit http://<your_sandbox_ip>:7070/kylin to play with the
>>> cubes! (Useranme: ADMIN, Password: KYLIN)
>>> You can check the log at
>>> /root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/logs/kylin.log"
>>>
>>> ##############################################################################
>>>
>>> ################################ tomcat/logs/kylin.log
>>> ###########################
>>> usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ]
>>> [ -nonaming ]  { -help | start | stop }
>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.AprLifecycleListener
>>> lifecycleEvent
>>> INFO: The APR based Apache Tomcat Native library which allows optimal
>>> performance in production environments was not found on the
>>> java.library.path:
>>> :/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.Catalina load
>>> INFO: Initialization processed in 847 ms
>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardService
>>> startInternal
>>> INFO: Starting service Catalina
>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardEngine
>>> startInternal
>>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.59
>>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.HostConfig deployWAR
>>> INFO: Deploying web application archive
>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war
>>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>>> scan
>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar]
>>> from classloader hierarchy
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>> (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at
>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>> at
>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>> at
>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>> at
>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>> at
>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at
>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>>> scan
>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
>>> from classloader hierarchy
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>>> (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at
>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>> at
>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>> at
>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>> at
>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>> at
>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at
>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>>> scan
>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
>>> from classloader hierarchy
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>>> (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at
>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>> at
>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>> at
>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>> at
>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>> at
>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at
>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>>> scan
>>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar]
>>> from classloader hierarchy
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>> (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at
>>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>>> at
>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>>> at
>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>> at
>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>> at
>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at
>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:10:53 AM org.apache.catalina.startup.ContextConfig
>>> processAnnotationsJar
>>> SEVERE: contextConfig.jarFile
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>>> (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>> at
>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>> at
>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>> at
>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>> at
>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at
>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:10:56 AM org.apache.catalina.startup.ContextConfig
>>> processAnnotationsJar
>>> SEVERE: contextConfig.jarFile
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>> (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>> at
>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>> at
>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>> at
>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>> at
>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at
>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:11:00 AM org.apache.catalina.startup.ContextConfig
>>> processAnnotationsJar
>>> SEVERE: contextConfig.jarFile
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
>>> (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>>> at
>>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>>> at
>>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>>> at
>>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>>> at
>>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>>> at
>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at
>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>> processResourceJARs
>>> SEVERE: Failed to processes JAR found at URL
>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for static resources
>>> to be included in context with name
>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/]
>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>> processResourceJARs
>>> SEVERE: Failed to processes JAR found at URL
>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for static resources
>>> to be included in context with name
>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/]
>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>>> processResourceJARs
>>> SEVERE: Failed to processes JAR found at URL
>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for static resources to
>>> be included in context with name
>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/]
>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig tldScanJar
>>> WARNING: Failed to process JAR
>>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for TLD files
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>>> (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at
>>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>> at
>>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>> at
>>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>> at
>>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>> at
>>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at
>>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TaglibUriRule body
>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig tldScanJar
>>> WARNING: Failed to process JAR [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>> at org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>> at org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>> at org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>> at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig tldScanJar
>>> WARNING: Failed to process JAR [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>> at org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>> at org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>> at org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>> at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig tldScanJar
>>> WARNING: Failed to process JAR [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for TLD files
>>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No such file or directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>>> at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>>> at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>>> at org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>>> at org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>>> at org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>>> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>>> at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>>> at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>>> at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> [localhost-startStop-1]:[2015-12-07 11:11:04,251][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)] - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>>> [localhost-startStop-1]:[2015-12-07 11:11:04,297][INFO][org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:177)] - Loading properties file from resource loaded through InputStream
>>> [localhost-startStop-1]:[2015-12-07 11:11:04,430][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)] - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-2--1, built on 03/31/2015 19:31 GMT
>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:host.name=sandbox.hortonworks.com
>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.version=1.7.0_79
>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client
>>> environment:java.class.path=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/bootstrap.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/tomcat-juli.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-jdbc.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-tribes.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/annotations-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jsp-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-coyote.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper-el.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat7-websocket.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-fr.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/el-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-dbcp.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ha.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/ecj-4.4.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-es.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-util.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/servlet-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-ja.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ant.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/websocket-api.jar::/usr/hdp/2.2.4.2-2/hbase/conf:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/lib/tools.jar:/usr/hdp/2.2.4.2-2/hbase:/usr/hdp/2.2.4.2-2/hbase/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/asm-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-codec-1.7.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hb
ase/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift.jar:/usr/hdp/2.2.4.2-2/hbase/lib/high-scale-lib-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpcore-4.1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-core-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-json-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-server-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jettison-1.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-2.1-6.1.14.ja
r:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/netty-3.6.6.Final.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/phoenix-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-hbase-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/slf4j-api-1.6.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/zookeeper.jar:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ht
race-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/./:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/comm
ons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-y
arn/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapr
educe/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client
-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//joda-time-2.7.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//aws-java-sdk-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.2.4.2
-2/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpcore-4.2.5.jar::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/hadoop-mapreduce-client/jsp-api-2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/activation-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/junit-4.11.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-io-2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang3-3.3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-recipes-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-server-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jsr305-1.3.9.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-framework-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack.jar:/usr/hdp/current/h
adoop-mapreduce-client/hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-httpclient-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-net-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/current/hadoop-mapreduce-client/jsch-0.1.42.jar:/usr/hdp/current/hadoop-mapreduce-client/xz-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-el-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/servlet-api-2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-runtime-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-json-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jettison-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix.jar:/usr/hdp/current/hadoop-mapreduce-client/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/paranamer-2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-core-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-mapreduce-client/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app.jar:/usr/hdp/current/hadoop-mapreduce-client/htrace-core-3.0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/httpclient-4.2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-compiler-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/joda-time-2.7.jar:/usr/hdp/current/hadoop-mapreduce-client/avro-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-mapreduce-client/xmlenc-0.52.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-digester-1.8.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-cli-1.2.jar:/usr/hdp/current/hadoop-mapreduce-client/aws-java-sdk-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/gson-2.2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-client-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-collections-3.2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-mapreduce-client/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hado
op-datajoin.jar:/usr/hdp/current/hadoop-mapreduce-client/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen.jar:/usr/hdp/current/hadoop-mapreduce-client/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-archives.jar:/usr/hdp/current/hadoop-mapreduce-client/asm-3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-codec-1.4.jar:/usr/hdp/current/hadoop-mapreduce-client/log4j-1.2.17.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang-2.6.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar:/usr/hdp/current/hadoop-mapreduce-client/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/guava-11.0.2.jar:/usr/hdp/current/hadoop-mapreduce-client/httpcore-4.2.5.jar:/usr/hdp/current/tez-client/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-examples-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-io-2.4.jar:/usr/hdp/current/tez-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections4-4.0.jar:/usr/hdp/current/tez-client/lib/servlet-api-2.5.jar:/usr/hdp/current/tez-client/lib/jsr305-2.0.3.jar:/usr/hdp/current/tez-client/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/jettison-1.3.4.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-cli-1.2.jar:/usr/hdp/current/tez-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/tez-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections-3.2.1.jar:/usr/hdp/current/tez-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/tez-client/lib/commons-codec-1.4.jar:/usr/hdp/current/tez-client/lib/log4j-1.2.17.jar:/usr/hdp/current/tez-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/tez-client/lib/commons-lang-2.6.jar:/usr/hdp/current/tez-client/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/guava-11.0.2.jar:/etc/tez/conf/:/usr/hdp/2.2.4.2-2/tez/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-examples-0
.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections4-4.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/guava-11.0.2.jar:/etc/tez/conf:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/li
b/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpcore-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jsoup-1.7.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-settings-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/n
ekohtml-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-profile-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/netty-3.7.0.Final.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-model-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-io-2.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-codec-1.6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpclient-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-project-2.2.1.jar:/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet
-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4
.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar:
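
As far as I can tell from the tomcat log below, the actual root cause is an HBase
AccessDeniedException raised by the Ranger (XaSecure) coprocessor: the 'root' user
is denied the ADMIN action on the kylin_metadata_acl table while Kylin's AclService
bean is being created. I assume this means 'root' would need an HBase grant (or a
matching Ranger policy) on that table. This is only a sketch of what I would try
from the HBase shell; I am not sure whether the Ranger plugin on this sandbox
honors shell grants or requires a policy in the Ranger admin UI instead:

# hypothetical fix attempt; user 'root' and table 'kylin_metadata_acl' are taken
# from the error message below
hbase shell
grant 'root', 'RWXCA'                           # global grant for 'root', or:
grant 'root', 'RWXCA', 'kylin_metadata_acl'     # grant only on the Kylin ACL table

Alternatively, I guess the same effect could be achieved by adding an HBase policy
for 'root' in the Ranger admin console, but I have not verified either approach yet.
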
>>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client
>>> environment:java.library.path=:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client
>>> environment:java.io.tmpdir=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/temp
>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client environment:os.name=Linux
>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client environment:os.arch=amd64
>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client
>>> environment:os.version=2.6.32-504.16.2.el6.x86_64
>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client environment:user.name=root
>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client environment:user.home=/root
>>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Client environment:user.dir=/root
>>> 2015-12-07 11:11:04,924 INFO  [localhost-startStop-1]
>>> zookeeper.ZooKeeper: Initiating client connection, connectString=
>>> sandbox.hortonworks.com:2181 sessionTimeout=30000
>>> watcher=hconnection-0x27eeefd2, quorum=sandbox.hortonworks.com:2181,
>>> baseZNode=/hbase-unsecure
>>> 2015-12-07 11:11:04,949 INFO  [localhost-startStop-1]
>>> zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x27eeefd2
>>> connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
>>> 2015-12-07 11:11:04,976 INFO  [localhost-startStop-1-SendThread(
>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Opening socket
>>> connection to server sandbox.hortonworks.com/10.0.2.15:2181. Will not
>>> attempt to authenticate using SASL (unknown error)
>>> 2015-12-07 11:11:04,993 INFO  [localhost-startStop-1-SendThread(
>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Socket connection
>>> established to sandbox.hortonworks.com/10.0.2.15:2181, initiating
>>> session
>>> 2015-12-07 11:11:05,000 INFO  [localhost-startStop-1-SendThread(
>>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Session
>>> establishment complete on server sandbox.hortonworks.com/10.0.2.15:2181,
>>> sessionid = 0x1517c12f0f0000b, negotiated timeout = 30000
>>> 2015-12-07 11:11:05,699 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>> Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@784cfcf2,
>>> compressor=null, tcpKeepAlive=true, tcpNoDelay=true,
>>> minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind
>>> address=null
>>> 2015-12-07 11:11:05,840 DEBUG [localhost-startStop-1] ipc.RpcClient: Use
>>> SIMPLE authentication for service MasterService, sasl=false
>>> 2015-12-07 11:11:05,851 DEBUG [localhost-startStop-1] ipc.RpcClient:
>>> Connecting to sandbox.hortonworks.com/10.0.2.15:60000
>>> [localhost-startStop-1]:[2015-12-07
>>> 11:11:05,880][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>>> - Context initialization failed
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>>> Cannot resolve reference to bean
>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>> while setting bean property 'cacheOperationSource'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>> resolve reference to bean
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>> Cannot create inner bean '(inner bean)' of type
>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>> while setting constructor argument with key [0]; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>> type
>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>> 'expressionHandler' while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'expressionHandler' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>> while setting bean property 'permissionEvaluator'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>> at
>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>> at
>>> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
>>> at
>>> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
>>> at
>>> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
>>> at
>>> org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
>>> at
>>> org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
>>> at
>>> org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
>>> at
>>> org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5016)
>>> at
>>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5524)
>>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>>> at
>>> org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>>> at
>>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name
>>> 'org.springframework.cache.config.internalCacheAdvisor': Cannot resolve
>>> reference to bean
>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>>> while setting bean property 'cacheOperationSource'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>> resolve reference to bean
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>> Cannot create inner bean '(inner bean)' of type
>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>> while setting constructor argument with key [0]; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>> type
>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>> 'expressionHandler' while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'expressionHandler' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>> while setting bean property 'permissionEvaluator'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>> at
>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>> at
>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>> at
>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>> at
>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>> at
>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>> at
>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>> ... 23 more
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name
>>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>>> BeanPostProcessor before instantiation of bean failed; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>> resolve reference to bean
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>> Cannot create inner bean '(inner bean)' of type
>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>> while setting constructor argument with key [0]; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>> type
>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>> 'expressionHandler' while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'expressionHandler' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>> while setting bean property 'permissionEvaluator'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>> at
>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>> ... 40 more
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name
>>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>>> resolve reference to bean
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>> Cannot create inner bean '(inner bean)' of type
>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>> while setting constructor argument with key [0]; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>> type
>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>> 'expressionHandler' while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'expressionHandler' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>> while setting bean property 'permissionEvaluator'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>> at
>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>>> at
>>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>>> at
>>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>>> at
>>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>>> at
>>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>>> at
>>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>>> ... 45 more
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name
>>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>>> Cannot create inner bean '(inner bean)' of type
>>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>>> while setting constructor argument with key [0]; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>>> type
>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>> 'expressionHandler' while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'expressionHandler' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>> while setting bean property 'permissionEvaluator'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:353)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:153)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>> at
>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>> ... 64 more
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name '(inner bean)': Cannot create inner bean
>>> '(inner bean)' of type
>>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name '(inner bean)': Cannot resolve reference to bean
>>> 'expressionHandler' while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'expressionHandler' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>> while setting bean property 'permissionEvaluator'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>> ... 78 more
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name '(inner bean)': Cannot resolve reference to
>>> bean 'expressionHandler' while setting constructor argument; nested
>>> exception is org.springframework.beans.factory.BeanCreationException: Error
>>> creating bean with name 'expressionHandler' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>>> while setting bean property 'permissionEvaluator'; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>>> ... 86 more
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name 'expressionHandler' defined in class path
>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean
>>> 'permissionEvaluator' while setting bean property 'permissionEvaluator';
>>> nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'permissionEvaluator' defined in class path resource
>>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>>> setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>> at
>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>> ... 94 more
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name 'permissionEvaluator' defined in class path
>>> resource [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService'
>>> while setting constructor argument; nested exception is
>>> org.springframework.beans.factory.BeanCreationException: Error creating
>>> bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>>> at
>>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>> at
>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>> ... 104 more
>>> Caused by: org.springframework.beans.factory.BeanCreationException:
>>> Error creating bean with name 'aclService' defined in file
>>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>>> Instantiation of bean failed; nested exception is
>>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>>> exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:997)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:943)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>>> at
>>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>>> at
>>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>>> at
>>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>>> ... 116 more
>>> Caused by: org.springframework.beans.BeanInstantiationException: Could
>>> not instantiate bean class [org.apache.kylin.rest.service.AclService]:
>>> Constructor threw exception; nested exception is
>>> org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at
>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
>>> at
>>> org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
>>> at
>>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
>>> ... 124 more
>>> Caused by: org.apache.hadoop.hbase.security.AccessDeniedException:
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>> at
>>> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>> at
>>> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>> at
>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:230)
>>> at
>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:244)
>>> at
>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
>>> at
>>> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3390)
>>> at
>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:408)
>>> at
>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:95)
>>> at
>>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:86)
>>> at org.apache.kylin.rest.service.AclService.<init>(AclService.java:127)
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>> at
>>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
>>> ... 126 more
>>> Caused by:
>>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException):
>>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>>> tableName:kylin_metadata_acl, family:null,column: null
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>>> at
>>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>>> at
>>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>>> at
>>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>> at
>>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1538)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1724)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1777)
>>> at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.getTableDescriptors(MasterProtos.java:42525)
>>> at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$5.getTableDescriptors(ConnectionManager.java:2165)
>>> at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:414)
>>> at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:409)
>>> at
>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>>> ... 136 more
>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>> startInternal
>>> SEVERE: Error listenerStart
>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>>> startInternal
>>> SEVERE: Context [/kylin] startup failed due to previous errors
>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>> clearReferencesThreads
>>> SEVERE: The web application [/kylin] appears to have started a thread
>>> named [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)]
>>> but has failed to stop it. This is very likely to create a memory leak.
>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>> clearReferencesThreads
>>> SEVERE: The web application [/kylin] appears to have started a thread
>>> named [localhost-startStop-1-EventThread] but has failed to stop it. This
>>> is very likely to create a memory leak.
>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>> clearReferencesThreads
>>> SEVERE: The web application [/kylin] appears to have started a thread
>>> named [Thread-6] but has failed to stop it. This is very likely to create a
>>> memory leak.
>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>>> clearReferencesThreads
>>> SEVERE: The web application [/kylin] appears to have started a thread
>>> named [IPC Client (514096504) connection to
>>> sandbox.hortonworks.com/10.0.2.15:60000 from root] but has failed to
>>> stop it. This is very likely to create a memory leak.
>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.HostConfig deployWAR
>>> INFO: Deployment of web application archive
>>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war has
>>> finished in 15,925 ms
>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>> INFO: Starting ProtocolHandler ["http-bio-7070"]
>>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>>> INFO: Starting ProtocolHandler ["ajp-bio-9009"]
>>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.Catalina start
>>> INFO: Server startup in 15987 ms"
>>>
>>> ##############################################################################
>>>
>>> What am I missing?
>>>
>>> Kind regards
>>> Veli K. Celik
>>>
>>
>>
>>
>> --
>> Thanks and Regards
>>
>> Sudeep Dey
>> Zaloni,Inc. | www.zaloni.com
>> 633 Davis Drive, Suite 450
>> Durham, NC 27713
>> e: sdey@zaloni.com
>>
>>
>> "This e-mail, including attachments, may include confidential and/or
>> proprietary information, and may be used only by the person or entity
>> to which it is addressed. If the reader of this e-mail is not the intended
>> recipient or his or her authorized agent, the reader is hereby notified
>> that any dissemination, distribution or copying of this e-mail is
>> prohibited. If you have received this e-mail in error, please notify the
>> sender by replying to this message and delete this e-mail immediately."
>>
>>
>>
>
>
> --
> Kind regards
> Veli K. Celik
>



-- 
Kind regards
Veli K. Celik

Re: Kylin does not start correctly (detailed and logs included)

Posted by Veli Kerim Celik <vk...@gmail.com>.
I am doing port forwarding through ssh (i.e. "ssh -L 7070:localhost:7070
root@127.0.0.1 -p 2222"). It seems to be working.
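To double-check that the tunnel itself is fine I can hit the forwarded port
from the host machine (assuming curl is installed there):

curl -I http://localhost:7070/kylin

Getting any HTTP response back from Tomcat would confirm the forwarding works
and that the blank page is a problem inside the Kylin webapp rather than the
connection.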

I have downloaded the file ojdbc6.jar (from
http://www.oracle.com/technetwork/database/enterprise-edition/jdbc-112010-090769.html)
and put it at the path "/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar". The lines
with java.io.FileNotFoundException are gone. Nice. Thanks.
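The same warnings also referenced /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar and
/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar, so if those paths come up again I
assume symlinking the same jar into those directories would work too, for
example:

ln -s /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
ln -s /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar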

But now I am getting some new exceptions like:

########################## exception start #################################
[localhost-startStop-1]:[2015-12-07
15:49:14,751][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
- Context initialization failed
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
BeanPostProcessor before instantiation of bean failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'org.springframework.cache.config.internalCacheAdvisor':
Cannot resolve reference to bean
'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
while setting bean property 'cacheOperationSource'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
BeanPostProcessor before instantiation of bean failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
resolve reference to bean
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name
'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
Cannot create inner bean '(inner bean)' of type
[org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
while setting constructor argument with key [0]; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
type
[org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name '(inner bean)': Cannot resolve reference to bean
'expressionHandler' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'expressionHandler' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
while setting bean property 'permissionEvaluator'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'permissionEvaluator' defined in class path resource
[kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating
bean with name 'aclService' defined in file
[/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
Instantiation of bean failed; nested exception is
org.springframework.beans.BeanInstantiationException: Could not instantiate
bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
exception; nested exception is
org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
permissions for user 'root (auth:SIMPLE)',action: ADMIN,
tableName:kylin_metadata_acl, family:null,column: null
########################################################################

I disabled HDFS permissions (set dfs.permissions.enabled to false), and
restarted HDFS and Kylin, but it did not get rid of the exception.
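From the stack trace the denial comes from the Ranger/XaSecure HBase
coprocessor (action ADMIN on table kylin_metadata_acl for user 'root'), so HDFS
permissions are presumably not what is being checked here. A rough sketch of a
possible next step, assuming the HBase shell is available on the sandbox
(Ranger may ignore shell grants, in which case the HBase policy in the Ranger
admin UI would need to allow 'root' instead):

hbase shell
# inside the shell, grant root global read/write/exec/create/admin rights
grant 'root', 'RWXCA'
exit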

Kind regards, Veli


2015-12-07 15:29 GMT+01:00 Sudeep Dey <sd...@zaloni.com>:

> Hi Veli,
>
> You need to put the downloaded ojdbc6.jar into this location
> /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar and, moreover, to access Kylin from
> the host machine you need to do port forwarding for port 7070.
>
> Regards
>
> Sudeep
>
> On Mon, Dec 7, 2015 at 7:54 AM, Veli Kerim Celik <vk...@gmail.com>
> wrote:
>
>> Hello
>>
>> I have downloaded Hortonworks HDP Sandbox version 2.2.4.2 for VirtualBox
>> (filename "Sandbox_HDP_2.2.4.2_VirtualBox.ova") and imported it into
>> VirtualBox.
>>
>> I have assinged 4 CPU cores and 12 gigabyte of RAM to the virtual
>> machine.
>>
>> After booting it up I login to Ambari at http://localhost:8080/ (from
>> host machine) and start up HBase. HBase starts up without any problems.
>>
>> I then ssh into the virtual machine using the following command: "ssh -L
>> 7070:localhost:7070 root@127.0.0.1 -p 2222"
>>
>> I then download Kylin binary release from "
>> https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz"
>> and extract into the directory /root/bin.
>>
>> I then change .bash_profile so it looks like this:
>> ############################ /root/.bash_profile
>> ###############################
>> # .bash_profile
>>
>> # Get the aliases and functions
>> if [ -f ~/.bashrc ]; then
>>         . ~/.bashrc
>> fi
>>
>> # User specific environment and startup programs
>>
>> KYLIN_HOME=$HOME/bin/apache-kylin-1.1.1-incubating-bin
>> export KYLIN_HOME
>>
>> PATH=$PATH:$HOME/bin:$KYLIN_HOME/bin
>>
>> export PATH
>>
>> ##############################################################################
>>
>> I then start Kylin up by using the command: "kylin.sh start". I then try
>> to access Kylin through http://localhost:7070/kylin (from host machine)
>> and get a blank page (eg. not 404).
>>
>> I get the following output from kylin.sh start and tomcat log:
>>
>> ########################## kylin.sh start output
>> ###################################
>> root@sandbox ~]# kylin.sh start
>> KYLIN_HOME is set to /root/bin/apache-kylin-1.1.1-incubating-bin
>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name hive.heapsize does
>> not exist
>> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name
>> hive.server2.enable.impersonation does not exist
>>
>> Logging initialized using configuration in
>> file:/etc/hive/conf/hive-log4j.properties
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> hive dependency:
>> /etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/
2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar
>> hbase dependency: /usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar
>> KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
>> KYLIN_DEBUG_SETTINGS is not set, will not enable remote debuging
>> KYLIN_LD_LIBRARY_SETTINGS is not set, lzo compression at MR and hbase
>> might not work
>> A new Kylin instance is started by root, stop it using "kylin.sh stop"
>> Please visit http://<your_sandbox_ip>:7070/kylin to play with the cubes!
>> (Useranme: ADMIN, Password: KYLIN)
>> You can check the log at
>> /root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/logs/kylin.log"
>>
>> ##############################################################################
>>
>> ################################ tomcat/logs/kylin.log
>> ###########################
>> usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ] [
>> -nonaming ]  { -help | start | stop }
>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.AprLifecycleListener
>> lifecycleEvent
>> INFO: The APR based Apache Tomcat Native library which allows optimal
>> performance in production environments was not found on the
>> java.library.path:
>> :/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.Catalina load
>> INFO: Initialization processed in 847 ms
>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardService
>> startInternal
>> INFO: Starting service Catalina
>> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardEngine
>> startInternal
>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.59
>> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.HostConfig deployWAR
>> INFO: Deploying web application archive
>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war
>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>> scan
>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar]
>> from classloader hierarchy
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>> (No such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at
>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>> at
>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>> at
>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>> at
>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>> at
>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>> scan
>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
>> from classloader hierarchy
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>> (No such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at
>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>> at
>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>> at
>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>> at
>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>> at
>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>> scan
>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
>> from classloader hierarchy
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>> (No such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at
>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>> at
>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>> at
>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>> at
>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>> at
>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
>> scan
>> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar]
>> from classloader hierarchy
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
>> such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at
>> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>> at
>> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
>> at
>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
>> at
>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>> at
>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:10:53 AM org.apache.catalina.startup.ContextConfig
>> processAnnotationsJar
>> SEVERE: contextConfig.jarFile
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>> (No such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>> at
>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>> at
>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>> at
>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:10:56 AM org.apache.catalina.startup.ContextConfig
>> processAnnotationsJar
>> SEVERE: contextConfig.jarFile
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>> (No such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>> at
>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>> at
>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>> at
>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:11:00 AM org.apache.catalina.startup.ContextConfig
>> processAnnotationsJar
>> SEVERE: contextConfig.jarFile
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
>> such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
>> at
>> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
>> at
>> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
>> at
>> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
>> at
>> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>> processResourceJARs
>> SEVERE: Failed to processes JAR found at URL
>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for static resources
>> to be included in context with name
>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/]
>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>> processResourceJARs
>> SEVERE: Failed to processes JAR found at URL
>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for static resources
>> to be included in context with name
>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/]
>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
>> processResourceJARs
>> SEVERE: Failed to processes JAR found at URL
>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for static resources to
>> be included in context with name
>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/]
>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig tldScanJar
>> WARNING: Failed to process JAR
>> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for TLD files
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
>> (No such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>> at
>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>> at
>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TaglibUriRule body
>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig tldScanJar
>> WARNING: Failed to process JAR
>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>> (No such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at
>> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>> at
>> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>> at
>> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>> at
>> org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>> at
>> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at
>> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
>> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig tldScanJar
>> WARNING: Failed to process JAR
>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
>> (No such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>> at org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>> at org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>> at org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>> at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig tldScanJar
>> WARNING: Failed to process JAR
>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for TLD files
>> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
>> such file or directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
>> at java.util.jar.JarFile.<init>(JarFile.java:154)
>> at java.util.jar.JarFile.<init>(JarFile.java:91)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
>> at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
>> at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
>> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
>> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
>> at org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
>> at org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
>> at org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
>> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
>> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
>> at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
>> at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
>> at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
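The repeated "Failed to process JAR ... ojdbc6.jar" warnings above come from Tomcat's TLD scanner: the Hadoop and Hive lib directories put ojdbc6.jar on the classpath, but on the sandbox those entries are typically dangling symlinks, so the scanner cannot open the file. This is noisy but should not by itself prevent Kylin from starting. A quick check, using the two paths from the log:

# do the ojdbc6.jar classpath entries actually exist, or are they dangling symlinks?
ls -l /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar

If the warnings are unwanted and the Tomcat bundled with Kylin honours the standard jar-scanner property, ojdbc6.jar can be added to the skip list in $KYLIN_HOME/tomcat/conf/catalina.properties (a sketch, not verified on this sandbox):

# $KYLIN_HOME/tomcat/conf/catalina.properties
# keep whatever is already on this line and append ojdbc6.jar to the end
tomcat.util.scan.DefaultJarScanner.jarsToSkip=...,ojdbc6.jar
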
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> [localhost-startStop-1]:[2015-12-07
>> 11:11:04,251][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
>> [localhost-startStop-1]:[2015-12-07
>> 11:11:04,297][INFO][org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:177)]
>> - Loading properties file from resource loaded through InputStream
>> [localhost-startStop-1]:[2015-12-07
>> 11:11:04,430][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
>> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
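The KYLIN_CONF warnings above are informational only: when KYLIN_CONF is not set, Kylin falls back to KYLIN_HOME and still loads kylin.properties from the conf directory of the binary package. If you want to make the location explicit (or silence the warning), KYLIN_CONF can be exported the same way as KYLIN_HOME (a sketch, assuming the default conf/ layout of the binary package):

# optional: point Kylin explicitly at its configuration directory
KYLIN_CONF=$KYLIN_HOME/conf
export KYLIN_CONF
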
>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-2--1, built
>> on 03/31/2015 19:31 GMT
>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:host.name=sandbox.hortonworks.com
>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:java.version=1.7.0_79
>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client
>> environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client
>> environment:java.class.path=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/bootstrap.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/tomcat-juli.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-jdbc.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-tribes.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/annotations-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jsp-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-coyote.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper-el.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat7-websocket.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-fr.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/el-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-dbcp.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ha.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/ecj-4.4.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-es.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-util.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/servlet-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-ja.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ant.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/websocket-api.jar::/usr/hdp/2.2.4.2-2/hbase/conf:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/lib/tools.jar:/usr/hdp/2.2.4.2-2/hbase:/usr/hdp/2.2.4.2-2/hbase/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/asm-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-codec-1.7.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hba
se/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift.jar:/usr/hdp/2.2.4.2-2/hbase/lib/high-scale-lib-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpcore-4.1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-core-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-json-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-server-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jettison-1.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-2.1-6.1.14.jar
:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/netty-3.6.6.Final.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/phoenix-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-hbase-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/slf4j-api-1.6.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/zookeeper.jar:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htr
ace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/./:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commo
ns-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-ya
rn/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapre
duce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-
jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//joda-time-2.7.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//aws-java-sdk-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.2.4.2-
2/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpcore-4.2.5.jar::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/hadoop-mapreduce-client/jsp-api-2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/activation-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/junit-4.11.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-io-2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang3-3.3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-recipes-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-server-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jsr305-1.3.9.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-framework-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack.jar:/usr/hdp/current/ha
doop-mapreduce-client/hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-httpclient-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-net-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/current/hadoop-mapreduce-client/jsch-0.1.42.jar:/usr/hdp/current/hadoop-mapreduce-client/xz-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-el-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/servlet-api-2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-runtime-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-json-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jettison-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix.jar:/usr/hdp/current/hadoop-mapreduce-client/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/paranamer-2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-core-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-mapreduce-client/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app.jar:/usr/hdp/current/hadoop-mapreduce-client/htrace-core-3.0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/httpclient-4.2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-compiler-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/joda-time-2.7.jar:/usr/hdp/current/hadoop-mapreduce-client/avro-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-mapreduce-client/xmlenc-0.52.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-digester-1.8.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-cli-1.2.jar:/usr/hdp/current/hadoop-mapreduce-client/aws-java-sdk-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/gson-2.2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-client-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-collections-3.2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-mapreduce-client/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoo
p-datajoin.jar:/usr/hdp/current/hadoop-mapreduce-client/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen.jar:/usr/hdp/current/hadoop-mapreduce-client/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-archives.jar:/usr/hdp/current/hadoop-mapreduce-client/asm-3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-codec-1.4.jar:/usr/hdp/current/hadoop-mapreduce-client/log4j-1.2.17.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang-2.6.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar:/usr/hdp/current/hadoop-mapreduce-client/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/guava-11.0.2.jar:/usr/hdp/current/hadoop-mapreduce-client/httpcore-4.2.5.jar:/usr/hdp/current/tez-client/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-examples-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-io-2.4.jar:/usr/hdp/current/tez-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections4-4.0.jar:/usr/hdp/current/tez-client/lib/servlet-api-2.5.jar:/usr/hdp/current/tez-client/lib/jsr305-2.0.3.jar:/usr/hdp/current/tez-client/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/jettison-1.3.4.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-cli-1.2.jar:/usr/hdp/current/tez-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/tez-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections-3.2.1.jar:/usr/hdp/current/tez-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/tez-client/lib/commons-codec-1.4.jar:/usr/hdp/current/tez-client/lib/log4j-1.2.17.jar:/usr/hdp/current/tez-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/tez-client/lib/commons-lang-2.6.jar:/usr/hdp/current/tez-client/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/guava-11.0.2.jar:/etc/tez/conf/:/usr/hdp/2.2.4.2-2/tez/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-examples-0.
5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections4-4.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/guava-11.0.2.jar:/etc/tez/conf:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib
/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpcore-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jsoup-1.7.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-settings-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ne
kohtml-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-profile-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/netty-3.7.0.Final.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-model-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-io-2.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-codec-1.6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpclient-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-project-2.2.1.jar:/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-
api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.
2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar:
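
Note that every "Caused by" in the Tomcat trace that follows bottoms out in the same error: the Ranger/XaSecure HBase coprocessor denies user 'root (auth:SIMPLE)' the ADMIN action on table kylin_metadata_acl, so the aclService bean (and with it the whole Kylin web context) fails to initialize. As a rough, unverified sketch only (assuming plain HBase shell grants are honoured on this sandbox; with Ranger enforcing the ACLs the equivalent step would be a Ranger policy for user 'root' covering the Kylin metadata tables), something along these lines could be tried:

hbase shell
# hypothetical workaround, not verified on this sandbox:
# give 'root' read/write/exec/create/admin rights on the table named in the log
grant 'root', 'RWXCA', 'kylin_metadata_acl'
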
>> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client
>> environment:java.library.path=:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client
>> environment:java.io.tmpdir=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/temp
>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:os.name=Linux
>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:os.arch=amd64
>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client
>> environment:os.version=2.6.32-504.16.2.el6.x86_64
>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:user.name=root
>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:user.home=/root
>> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Client environment:user.dir=/root
>> 2015-12-07 11:11:04,924 INFO  [localhost-startStop-1]
>> zookeeper.ZooKeeper: Initiating client connection, connectString=
>> sandbox.hortonworks.com:2181 sessionTimeout=30000
>> watcher=hconnection-0x27eeefd2, quorum=sandbox.hortonworks.com:2181,
>> baseZNode=/hbase-unsecure
>> 2015-12-07 11:11:04,949 INFO  [localhost-startStop-1]
>> zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x27eeefd2
>> connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
>> 2015-12-07 11:11:04,976 INFO  [localhost-startStop-1-SendThread(
>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Opening socket
>> connection to server sandbox.hortonworks.com/10.0.2.15:2181. Will not
>> attempt to authenticate using SASL (unknown error)
>> 2015-12-07 11:11:04,993 INFO  [localhost-startStop-1-SendThread(
>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Socket connection
>> established to sandbox.hortonworks.com/10.0.2.15:2181, initiating session
>> 2015-12-07 11:11:05,000 INFO  [localhost-startStop-1-SendThread(
>> sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Session
>> establishment complete on server sandbox.hortonworks.com/10.0.2.15:2181,
>> sessionid = 0x1517c12f0f0000b, negotiated timeout = 30000
>> 2015-12-07 11:11:05,699 DEBUG [localhost-startStop-1] ipc.RpcClient:
>> Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@784cfcf2,
>> compressor=null, tcpKeepAlive=true, tcpNoDelay=true,
>> minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind
>> address=null
>> 2015-12-07 11:11:05,840 DEBUG [localhost-startStop-1] ipc.RpcClient: Use
>> SIMPLE authentication for service MasterService, sasl=false
>> 2015-12-07 11:11:05,851 DEBUG [localhost-startStop-1] ipc.RpcClient:
>> Connecting to sandbox.hortonworks.com/10.0.2.15:60000
>> [localhost-startStop-1]:[2015-12-07
>> 11:11:05,880][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
>> - Context initialization failed
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
>> BeanPostProcessor before instantiation of bean failed; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
>> Cannot resolve reference to bean
>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>> while setting bean property 'cacheOperationSource'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>> BeanPostProcessor before instantiation of bean failed; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>> resolve reference to bean
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>> Cannot create inner bean '(inner bean)' of type
>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>> while setting constructor argument with key [0]; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>> type
>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot resolve reference to bean
>> 'expressionHandler' while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>> at
>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>> at
>> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
>> at
>> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
>> at
>> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
>> at
>> org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
>> at
>> org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
>> at
>> org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
>> at
>> org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5016)
>> at
>> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5524)
>> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
>> at
>> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
>> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
>> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
>> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
>> at
>> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name
>> 'org.springframework.cache.config.internalCacheAdvisor': Cannot resolve
>> reference to bean
>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
>> while setting bean property 'cacheOperationSource'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>> BeanPostProcessor before instantiation of bean failed; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>> resolve reference to bean
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>> Cannot create inner bean '(inner bean)' of type
>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>> while setting constructor argument with key [0]; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>> type
>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot resolve reference to bean
>> 'expressionHandler' while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>> at
>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>> at
>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>> at
>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>> at
>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>> at
>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>> at
>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>> ... 23 more
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name
>> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
>> BeanPostProcessor before instantiation of bean failed; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>> resolve reference to bean
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>> Cannot create inner bean '(inner bean)' of type
>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>> while setting constructor argument with key [0]; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>> type
>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot resolve reference to bean
>> 'expressionHandler' while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>> at
>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>> ... 40 more
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name
>> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
>> resolve reference to bean
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>> Cannot create inner bean '(inner bean)' of type
>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>> while setting constructor argument with key [0]; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>> type
>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot resolve reference to bean
>> 'expressionHandler' while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>> at
>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>> at
>> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
>> at
>> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
>> at
>> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
>> at
>> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
>> at
>> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
>> ... 45 more
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name
>> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
>> Cannot create inner bean '(inner bean)' of type
>> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
>> while setting constructor argument with key [0]; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
>> type
>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot resolve reference to bean
>> 'expressionHandler' while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:353)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:153)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>> at
>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>> ... 64 more
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name '(inner bean)': Cannot create inner bean '(inner
>> bean)' of type
>> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name '(inner bean)': Cannot resolve reference to bean
>> 'expressionHandler' while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>> ... 78 more
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name '(inner bean)': Cannot resolve reference to bean
>> 'expressionHandler' while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
>> ... 86 more
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name 'expressionHandler' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
>> while setting bean property 'permissionEvaluator'; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'permissionEvaluator' defined in class path resource
>> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
>> setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>> at
>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>> ... 94 more
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name 'permissionEvaluator' defined in class path
>> resource [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService'
>> while setting constructor argument; nested exception is
>> org.springframework.beans.factory.BeanCreationException: Error creating
>> bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
>> at
>> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>> at
>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>> ... 104 more
>> Caused by: org.springframework.beans.factory.BeanCreationException: Error
>> creating bean with name 'aclService' defined in file
>> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
>> Instantiation of bean failed; nested exception is
>> org.springframework.beans.BeanInstantiationException: Could not instantiate
>> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
>> exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:997)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:943)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
>> at
>> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
>> at
>> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
>> at
>> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
>> ... 116 more
>> Caused by: org.springframework.beans.BeanInstantiationException: Could
>> not instantiate bean class [org.apache.kylin.rest.service.AclService]:
>> Constructor threw exception; nested exception is
>> org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at
>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
>> at
>> org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
>> at
>> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
>> ... 124 more
>> Caused by: org.apache.hadoop.hbase.security.AccessDeniedException:
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>> at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>> at
>> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>> at
>> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>> at
>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:230)
>> at
>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:244)
>> at
>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
>> at
>> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3390)
>> at
>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:408)
>> at
>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:95)
>> at
>> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:86)
>> at org.apache.kylin.rest.service.AclService.<init>(AclService.java:127)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>> at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>> at
>> org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
>> ... 126 more
>> Caused by:
>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException):
>> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
>> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
>> tableName:kylin_metadata_acl, family:null,column: null
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
>> at
>> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
>> at
>> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
>> at
>> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>> at
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1538)
>> at
>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1724)
>> at
>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1777)
>> at
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.getTableDescriptors(MasterProtos.java:42525)
>> at
>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$5.getTableDescriptors(ConnectionManager.java:2165)
>> at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:414)
>> at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:409)
>> at
>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>> ... 136 more
>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>> startInternal
>> SEVERE: Error listenerStart
>> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
>> startInternal
>> SEVERE: Context [/kylin] startup failed due to previous errors
>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>> clearReferencesThreads
>> SEVERE: The web application [/kylin] appears to have started a thread
>> named [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)]
>> but has failed to stop it. This is very likely to create a memory leak.
>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>> clearReferencesThreads
>> SEVERE: The web application [/kylin] appears to have started a thread
>> named [localhost-startStop-1-EventThread] but has failed to stop it. This
>> is very likely to create a memory leak.
>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>> clearReferencesThreads
>> SEVERE: The web application [/kylin] appears to have started a thread
>> named [Thread-6] but has failed to stop it. This is very likely to create a
>> memory leak.
>> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
>> clearReferencesThreads
>> SEVERE: The web application [/kylin] appears to have started a thread
>> named [IPC Client (514096504) connection to
>> sandbox.hortonworks.com/10.0.2.15:60000 from root] but has failed to
>> stop it. This is very likely to create a memory leak.
>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.HostConfig deployWAR
>> INFO: Deployment of web application archive
>> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war has
>> finished in 15,925 ms
>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>> INFO: Starting ProtocolHandler ["http-bio-7070"]
>> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
>> INFO: Starting ProtocolHandler ["ajp-bio-9009"]
>> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.Catalina start
>> INFO: Server startup in 15987 ms"
>>
>> ##############################################################################
>>
>> What am I missing?
>>
>> Kind regards
>> Veli K. Celik
>>
>
>
>
> --
> Thanks and Regards
>
> Sudeep Dey
> Zaloni,Inc. | www.zaloni.com
> 633 Davis Drive, Suite 450
> Durham, NC 27713
> e: sdey@zaloni.com
>
>
> "This e-mail, including attachments, may include confidential and/or
> proprietary information, and may be used only by the person or entity
> to which it is addressed. If the reader of this e-mail is not the intended
> recipient or his or her authorized agent, the reader is hereby notified
> that any dissemination, distribution or copying of this e-mail is
> prohibited. If you have received this e-mail in error, please notify the
> sender by replying to this message and delete this e-mail immediately."
>
>
>


-- 
Kind regards
Veli K. Celik

Re: Kylin does not start correctly (detailed and logs included)

Posted by Sudeep Dey <sd...@zaloni.com>.
Hi Veli,

You need to put the downloaded ojdbc6.jar into this location:
/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar. In addition, to access Kylin from the host
machine you need to set up port forwarding for port 7070.
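
A minimal shell sketch of those two steps, assuming the HDP 2.2.4.2-2 paths that
appear in the quoted logs and that an ojdbc6.jar has already been obtained
separately (the /path/to/ojdbc6.jar placeholder below is hypothetical). The reply
names only the Hive lib directory; the hadoop and hbase copies are included here
only because kylin.log also reports those paths as missing:

# Copy the driver into the lib directories Tomcat tries to scan, matching the
# FileNotFoundException paths in tomcat/logs/kylin.log:
cp /path/to/ojdbc6.jar /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar
cp /path/to/ojdbc6.jar /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar
cp /path/to/ojdbc6.jar /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar

# Forward port 7070 from the host to the sandbox. The ssh tunnel from the
# original post already does this:
ssh -L 7070:localhost:7070 root@127.0.0.1 -p 2222
# Alternatively, a VirtualBox NAT rule on the host (the VM name here is an
# assumption; run with the VM powered off):
VBoxManage modifyvm "Hortonworks Sandbox" --natpf1 "kylin,tcp,,7070,,7070"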

Regards

Sudeep
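
Separately, and not part of the reply above: the startup failure itself is the
AccessDeniedException quoted earlier, where user 'root (auth:SIMPLE)' is denied
ADMIN on kylin_metadata_acl by the XaSecure/Ranger HBase coprocessor. One possible
workaround, offered only as a sketch and assuming grants can still be issued from
the HBase shell on this sandbox, is to give root rights on the Kylin metadata
tables; if Ranger manages HBase authorization, the equivalent policy would have to
be created in the Ranger admin UI instead:

# Switch to a user HBase treats as an admin (the hbase user on the sandbox),
# then grant root read/write/exec/create/admin rights on the Kylin ACL table:
su - hbase
echo "grant 'root', 'RWXCA', 'kylin_metadata_acl'" | hbase shell
# A global grant is the blunter alternative:
echo "grant 'root', 'RWXCA'" | hbase shell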

On Mon, Dec 7, 2015 at 7:54 AM, Veli Kerim Celik <vk...@gmail.com> wrote:

> Hello
>
> I have downloaded Hortonworks HDP Sandbox version 2.2.4.2 for VirtualBox
> (filename "Sandbox_HDP_2.2.4.2_VirtualBox.ova") and imported it into
> VirtualBox.
>
> I have assigned 4 CPU cores and 12 gigabytes of RAM to the virtual machine.
>
> After booting it up I login to Ambari at http://localhost:8080/ (from
> host machine) and start up HBase. HBase starts up without any problems.
>
> I then ssh into the virtual machine using the following command: "ssh -L
> 7070:localhost:7070 root@127.0.0.1 -p 2222"
>
> I then download Kylin binary release from "
> https://dist.apache.org/repos/dist/release/kylin/apache-kylin-1.1.1-incubating/apache-kylin-1.1.1-incubating-bin.tar.gz"
> and extract into the directory /root/bin.
>
> I then change .bash_profile so it looks like this:
> ############################ /root/.bash_profile
> ###############################
> # .bash_profile
>
> # Get the aliases and functions
> if [ -f ~/.bashrc ]; then
>         . ~/.bashrc
> fi
>
> # User specific environment and startup programs
>
> KYLIN_HOME=$HOME/bin/apache-kylin-1.1.1-incubating-bin
> export KYLIN_HOME
>
> PATH=$PATH:$HOME/bin:$KYLIN_HOME/bin
>
> export PATH
>
> ##############################################################################
>
> I then start Kylin up by using the command: "kylin.sh start". I then try
> to access Kylin through http://localhost:7070/kylin (from host machine)
> and get a blank page (eg. not 404).
>
> I get the following output from kylin.sh start and tomcat log:
>
> ########################## kylin.sh start output
> ###################################
> root@sandbox ~]# kylin.sh start
> KYLIN_HOME is set to /root/bin/apache-kylin-1.1.1-incubating-bin
> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name hive.heapsize does
> not exist
> 15/12/07 11:10:32 WARN conf.HiveConf: HiveConf of name
> hive.server2.enable.impersonation does not exist
>
> Logging initialized using configuration in
> file:/etc/hive/conf/hive-log4j.properties
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> hive dependency:
> /etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2
.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar
> hbase dependency: /usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar
> KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
> KYLIN_DEBUG_SETTINGS is not set, will not enable remote debuging
> KYLIN_LD_LIBRARY_SETTINGS is not set, lzo compression at MR and hbase
> might not work
> A new Kylin instance is started by root, stop it using "kylin.sh stop"
> Please visit http://<your_sandbox_ip>:7070/kylin to play with the cubes!
> (Useranme: ADMIN, Password: KYLIN)
> You can check the log at
> /root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/logs/kylin.log"
>
> ##############################################################################
>
> ################################ tomcat/logs/kylin.log
> ###########################
> usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ] [
> -nonaming ]  { -help | start | stop }
> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.AprLifecycleListener
> lifecycleEvent
> INFO: The APR based Apache Tomcat Native library which allows optimal
> performance in production environments was not found on the
> java.library.path:
> :/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
> INFO: Initializing ProtocolHandler ["http-bio-7070"]
> Dec 07, 2015 11:10:49 AM org.apache.coyote.AbstractProtocol init
> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.Catalina load
> INFO: Initialization processed in 847 ms
> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardService
> startInternal
> INFO: Starting service Catalina
> Dec 07, 2015 11:10:49 AM org.apache.catalina.core.StandardEngine
> startInternal
> INFO: Starting Servlet Engine: Apache Tomcat/7.0.59
> Dec 07, 2015 11:10:49 AM org.apache.catalina.startup.HostConfig deployWAR
> INFO: Deploying web application archive
> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war
> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
> scan
> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar]
> from classloader hierarchy
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No
> such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at
> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
> at
> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
> at
> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
> at
> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
> at
> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
> scan
> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
> from classloader hierarchy
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
> (No such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at
> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
> at
> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
> at
> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
> at
> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
> at
> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
> scan
> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar]
> from classloader hierarchy
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
> (No such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at
> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
> at
> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
> at
> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
> at
> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
> at
> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:10:50 AM org.apache.tomcat.util.scan.StandardJarScanner
> scan
> WARNING: Failed to scan [file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar] from
> classloader hierarchy
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
> such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at
> org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2647)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
> at
> org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1902)
> at
> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1272)
> at
> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
> at
> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:10:53 AM org.apache.catalina.startup.ContextConfig
> processAnnotationsJar
> SEVERE: contextConfig.jarFile
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
> (No such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
> at
> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
> at
> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
> at
> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:10:56 AM org.apache.catalina.startup.ContextConfig
> processAnnotationsJar
> SEVERE: contextConfig.jarFile
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No
> such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
> at
> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
> at
> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
> at
> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:11:00 AM org.apache.catalina.startup.ContextConfig
> processAnnotationsJar
> SEVERE: contextConfig.jarFile
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
> such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1956)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1931)
> at
> org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1916)
> at
> org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1330)
> at
> org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:889)
> at
> org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:386)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
> processResourceJARs
> SEVERE: Failed to processes JAR found at URL
> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for static resources
> to be included in context with name
> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/]
> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
> processResourceJARs
> SEVERE: Failed to processes JAR found at URL
> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for static resources
> to be included in context with name
> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/]
> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.ContextConfig
> processResourceJARs
> SEVERE: Failed to processes JAR found at URL
> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for static resources to
> be included in context with name
> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/]
> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig tldScanJar
> WARNING: Failed to process JAR
> [jar:file:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar!/] for TLD files
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar (No
> such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
> at
> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TaglibUriRule body
> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
> Dec 07, 2015 11:11:02 AM org.apache.catalina.startup.TldConfig tldScanJar
> WARNING: Failed to process JAR
> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
> (No such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
> at
> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TaglibUriRule body
> INFO: TLD skipped. URI: urn:com:sun:jersey:api:view is already defined
> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig tldScanJar
> WARNING: Failed to process JAR
> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar!/] for TLD files
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar
> (No such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
> at
> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> Dec 07, 2015 11:11:03 AM org.apache.catalina.startup.TldConfig tldScanJar
> WARNING: Failed to process JAR
> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar!/] for TLD files
> java.io.FileNotFoundException: /usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar (No
> such file or directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:215)
> at java.util.zip.ZipFile.<init>(ZipFile.java:145)
> at java.util.jar.JarFile.<init>(JarFile.java:154)
> at java.util.jar.JarFile.<init>(JarFile.java:91)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at
> sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
> at
> sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
> at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
> at
> org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
> at
> org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
> at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
> at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
> at
> org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
> at
> org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> [localhost-startStop-1]:[2015-12-07
> 11:11:04,251][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
> [localhost-startStop-1]:[2015-12-07
> 11:11:04,297][INFO][org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:177)]
> - Loading properties file from resource loaded through InputStream
> [localhost-startStop-1]:[2015-12-07
> 11:11:04,430][WARN][org.apache.kylin.common.KylinConfig.getKylinProperties(KylinConfig.java:576)]
> - KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
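
The two "KYLIN_CONF property was not set, will seek KYLIN_HOME env variable" lines are expected when only KYLIN_HOME is exported: Kylin falls back to the conf directory under KYLIN_HOME to locate kylin.properties. If you prefer to set it explicitly (for example to keep the configuration outside the install directory), a minimal sketch, assuming KYLIN_CONF takes the directory that contains kylin.properties, which is what the fallback message suggests:

# Hypothetical explicit setting; by default Kylin resolves the conf dir under $KYLIN_HOME
export KYLIN_CONF=$KYLIN_HOME/conf

Restart Kylin afterwards so the setting is picked up.
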
> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
> Client environment:zookeeper.version=3.4.6-2--1, built on 03/31/2015 19:31
> GMT
> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
> Client environment:host.name=sandbox.hortonworks.com
> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
> Client environment:java.version=1.7.0_79
> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
> Client environment:java.vendor=Oracle Corporation
> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
> Client
> environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper:
> Client
> environment:java.class.path=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/bootstrap.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/bin/tomcat-juli.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-jdbc.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-tribes.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/annotations-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jsp-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-coyote.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper-el.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat7-websocket.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-fr.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/el-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-dbcp.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ha.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/ecj-4.4.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-es.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-util.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/servlet-api.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/tomcat-i18n-ja.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/catalina-ant.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/jasper.jar:/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/lib/websocket-api.jar::/usr/hdp/2.2.4.2-2/hbase/conf:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/lib/tools.jar:/usr/hdp/2.2.4.2-2/hbase:/usr/hdp/2.2.4.2-2/hbase/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hbase/lib/asm-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-codec-1.7.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hbas
e/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-examples.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-it.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2-tests.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-shell.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-testing-util.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift-0.98.4.2.2.4.2-2-hadoop2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/hbase-thrift.jar:/usr/hdp/2.2.4.2-2/hbase/lib/high-scale-lib-1.1.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/httpcore-4.1.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-core-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-json-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jersey-server-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jettison-1.3.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsp-2.1-6.1.14.jar:
/usr/hdp/2.2.4.2-2/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hbase/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hbase/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/netty-3.6.6.Final.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hbase/lib/phoenix-server.jar:/usr/hdp/2.2.4.2-2/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-hbase-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.2.4.2-2/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hbase/lib/slf4j-api-1.6.4.jar:/usr/hdp/2.2.4.2-2/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hbase/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hbase/lib/zookeeper.jar:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htra
ce-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/./:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/common
s-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop-hdfs/.//hadoop-hdfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yar
n/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-client-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapred
uce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-j
obclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//joda-time-2.7.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//aws-java-sdk-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.2.4.2-2
/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop-mapreduce/.//httpcore-4.2.5.jar::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/hadoop-mapreduce-client/jsp-api-2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/activation-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/junit-4.11.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-io-2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang3-3.3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-recipes-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-server-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jsr305-1.3.9.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-framework-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-openstack.jar:/usr/hdp/current/had
oop-mapreduce-client/hadoop-archives-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-ant-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-httpclient-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-net-3.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/current/hadoop-mapreduce-client/jsch-0.1.42.jar:/usr/hdp/current/hadoop-mapreduce-client/xz-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-el-1.0.jar:/usr/hdp/current/hadoop-mapreduce-client/servlet-api-2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-runtime-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-json-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/jettison-1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-gridmix.jar:/usr/hdp/current/hadoop-mapreduce-client/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-datajoin-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/paranamer-2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle.jar:/usr/hdp/current/hadoop-mapreduce-client/jersey-core-1.9.jar:/usr/hdp/current/hadoop-mapreduce-client/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-mapreduce-client/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app.jar:/usr/hdp/current/hadoop-mapreduce-client/htrace-core-3.0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-auth.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/current/hadoop-mapreduce-client/httpclient-4.2.5.jar:/usr/hdp/current/hadoop-mapreduce-client/jasper-compiler-5.5.23.jar:/usr/hdp/current/hadoop-mapreduce-client/joda-time-2.7.jar:/usr/hdp/current/hadoop-mapreduce-client/avro-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-mapreduce-client/xmlenc-0.52.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-digester-1.8.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-cli-1.2.jar:/usr/hdp/current/hadoop-mapreduce-client/aws-java-sdk-1.7.4.jar:/usr/hdp/current/hadoop-mapreduce-client/gson-2.2.4.jar:/usr/hdp/current/hadoop-mapreduce-client/curator-client-2.6.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-mapreduce-client/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-mapreduce-client/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-collections-3.2.1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-mapreduce-client/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop
-datajoin.jar:/usr/hdp/current/hadoop-mapreduce-client/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-aws.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-rumen.jar:/usr/hdp/current/hadoop-mapreduce-client/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-archives.jar:/usr/hdp/current/hadoop-mapreduce-client/asm-3.2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-codec-1.4.jar:/usr/hdp/current/hadoop-mapreduce-client/log4j-1.2.17.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-extras.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-distcp-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/commons-lang-2.6.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-sls.jar:/usr/hdp/current/hadoop-mapreduce-client/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar:/usr/hdp/current/hadoop-mapreduce-client/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-app-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/hadoop-mapreduce-client/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-mapreduce-client/guava-11.0.2.jar:/usr/hdp/current/hadoop-mapreduce-client/httpcore-4.2.5.jar:/usr/hdp/current/tez-client/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-examples-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-io-2.4.jar:/usr/hdp/current/tez-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections4-4.0.jar:/usr/hdp/current/tez-client/lib/servlet-api-2.5.jar:/usr/hdp/current/tez-client/lib/jsr305-2.0.3.jar:/usr/hdp/current/tez-client/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/jettison-1.3.4.jar:/usr/hdp/current/tez-client/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/commons-cli-1.2.jar:/usr/hdp/current/tez-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/tez-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/tez-client/lib/commons-collections-3.2.1.jar:/usr/hdp/current/tez-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/tez-client/lib/commons-codec-1.4.jar:/usr/hdp/current/tez-client/lib/log4j-1.2.17.jar:/usr/hdp/current/tez-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/tez-client/lib/commons-lang-2.6.jar:/usr/hdp/current/tez-client/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/current/tez-client/lib/guava-11.0.2.jar:/etc/tez/conf/:/usr/hdp/2.2.4.2-2/tez/tez-api-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-tests-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-internals-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-examples-0.5
.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-yarn-timeline-history-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-dag-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mbeans-resource-calculator-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-common-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-runtime-library-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/tez-mapreduce-0.5.2.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-core-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections4-4.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-mapreduce-client-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/tez/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/tez/lib/hadoop-yarn-server-web-proxy-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/tez/lib/guava-11.0.2.jar:/etc/tez/conf:/usr/hdp/2.2.4.2-2/hadoop/conf:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-auth.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs-2.6.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2-tests.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-common.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-azure.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-nfs.jar:/usr/hdp/2.2.4.2-2/hadoop/hadoop-annotations.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang3-3.3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/
commons-el-1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/azure-storage-2.0.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/htrace-core-3.0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-log4j12-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-hdfs-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/asm-3.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper.jar:/usr/hdp/2.2.4.2-2/zookeeper/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpcore-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jsoup-1.7.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/ant-1.8.0.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-logging-1.1.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-settings-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/nek
ohtml-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-profile-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/netty-3.7.0.Final.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-model-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-io-2.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/commons-codec-1.6.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/httpclient-4.2.3.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/2.2.4.2-2/zookeeper/lib/maven-project-2.2.1.jar:/etc/hive/conf:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2-standalone.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/activation-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-httpclient-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/junit-4.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/regexp-1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-configuration-1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jcommander-1.32.jar:/usr/hdp/2.2.4.2-2/hive/lib/avro-1.7.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-io-2.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi.jar:/usr/hdp/2.2.4.2-2/hive/lib/mysql-connector-java.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libthrift-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/derbynet-10.11.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-recipes-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-tree-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.23-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jsr305-1.3.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/derby-10.10.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-2.7.7.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-framework-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compress-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims.jar:/usr/hdp/2.2.4.2-2/hive/lib/velocity-1.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-fate-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline.jar:/usr/hdp/2.2.4.2-2/hive/lib/log4j-1.2.16.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/xz-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/servlet-a
pi-2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/super-csv-2.2.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-math-2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/asm-commons-3.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-cred-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ojdbc6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-start-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/eclipselink-2.5.2-M1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-audit-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/opencsv-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-core-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/paranamer-2.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/snappy-java-1.0.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/quidem-0.1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpclient-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/eigenbase-properties-1.1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hamcrest-core-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jpam-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/antlr-runtime-3.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/netty-3.4.0.Final.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/accumulo-trace-1.6.1.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-digester-1.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-cli-1.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hbase-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-common.jar:/usr/hdp/2.2.4.2-2/hive/lib/stax-api-1.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/pentaho-aggdesigner-algorithm-5.1.3-jhyde.jar:/usr/hdp/2.2.4.2-2/hive/lib/curator-client-2.6.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/stringtemplate-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/groovy-all-2.1.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/jline-0.9.94.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-contrib.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-collections-3.2.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-exec-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-cli.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-service.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-common-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-ant.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-logging-1.1.3.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jta-1.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-plugins-impl-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-serde-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-codec-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/oro-2.0.8.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-testutils-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/javax.persistence-2.1.0.jar:/usr/hdp/2.2.4.2
-2/hive/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/janino-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-lang-2.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-hwi-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/jansi-1.11.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-common-secure-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-accumulo-handler.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-shims-0.20S-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/zookeeper-3.4.6.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/mail-1.4.1.jar:/usr/hdp/2.2.4.2-2/hive/lib/ranger-hive-plugin-0.4.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-beeline-0.14.0.2.2.4.2-2.jar:/usr/hdp/2.2.4.2-2/hive/lib/libfb303-0.9.0.jar:/usr/hdp/2.2.4.2-2/hive/lib/guava-11.0.2.jar:/usr/hdp/2.2.4.2-2/hive/lib/hive-jdbc.jar:/usr/hdp/2.2.4.2-2/hive/lib/linq4j-0.4.jar:/usr/hdp/2.2.4.2-2/hive/lib/httpcore-4.2.5.jar:/usr/hdp/2.2.4.2-2/hive/lib/ST4-4.0.4.jar:/usr/hdp/2.2.4.2-2/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.14.0.2.2.4.2-2.jar:
> 2015-12-07 11:11:04,922 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.library.path=:/usr/hdp/2.2.4.2-2/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.2.4.2-2/hadoop/lib/native
> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/root/bin/apache-kylin-1.1.1-incubating-bin/bin/../tomcat/temp
> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:os.name=Linux
> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:os.arch=amd64
> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:os.version=2.6.32-504.16.2.el6.x86_64
> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:user.name=root
> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:user.home=/root
> 2015-12-07 11:11:04,923 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Client environment:user.dir=/root
> 2015-12-07 11:11:04,924 INFO  [localhost-startStop-1] zookeeper.ZooKeeper: Initiating client connection, connectString=sandbox.hortonworks.com:2181 sessionTimeout=30000 watcher=hconnection-0x27eeefd2, quorum=sandbox.hortonworks.com:2181, baseZNode=/hbase-unsecure
> 2015-12-07 11:11:04,949 INFO  [localhost-startStop-1] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x27eeefd2 connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
> 2015-12-07 11:11:04,976 INFO  [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Opening socket connection to server sandbox.hortonworks.com/10.0.2.15:2181. Will not attempt to authenticate using SASL (unknown error)
> 2015-12-07 11:11:04,993 INFO  [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Socket connection established to sandbox.hortonworks.com/10.0.2.15:2181, initiating session
> 2015-12-07 11:11:05,000 INFO  [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)] zookeeper.ClientCnxn: Session establishment complete on server sandbox.hortonworks.com/10.0.2.15:2181, sessionid = 0x1517c12f0f0000b, negotiated timeout = 30000
> 2015-12-07 11:11:05,699 DEBUG [localhost-startStop-1] ipc.RpcClient: Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@784cfcf2, compressor=null, tcpKeepAlive=true, tcpNoDelay=true, minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind address=null
> 2015-12-07 11:11:05,840 DEBUG [localhost-startStop-1] ipc.RpcClient: Use SIMPLE authentication for service MasterService, sasl=false
> 2015-12-07 11:11:05,851 DEBUG [localhost-startStop-1] ipc.RpcClient: Connecting to sandbox.hortonworks.com/10.0.2.15:60000
> [localhost-startStop-1]:[2015-12-07
> 11:11:05,880][ERROR][org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:307)]
> - Context initialization failed
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping#0':
> BeanPostProcessor before instantiation of bean failed; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'org.springframework.cache.config.internalCacheAdvisor':
> Cannot resolve reference to bean
> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
> while setting bean property 'cacheOperationSource'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
> BeanPostProcessor before instantiation of bean failed; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
> resolve reference to bean
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
> Cannot create inner bean '(inner bean)' of type
> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
> while setting constructor argument with key [0]; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
> type
> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot resolve reference to bean
> 'expressionHandler' while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
> at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
> at
> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
> at
> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
> at
> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
> at
> org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
> at
> org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
> at
> org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
> at
> org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5016)
> at
> org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5524)
> at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
> at
> org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
> at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
> at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
> at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
> at
> org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name
> 'org.springframework.cache.config.internalCacheAdvisor': Cannot resolve
> reference to bean
> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0'
> while setting bean property 'cacheOperationSource'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
> BeanPostProcessor before instantiation of bean failed; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
> resolve reference to bean
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
> Cannot create inner bean '(inner bean)' of type
> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
> while setting constructor argument with key [0]; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
> type
> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot resolve reference to bean
> 'expressionHandler' while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
> at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
> at
> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
> at
> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
> at
> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
> at
> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
> at
> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
> ... 23 more
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name
> 'org.springframework.cache.annotation.AnnotationCacheOperationSource#0':
> BeanPostProcessor before instantiation of bean failed; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
> resolve reference to bean
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
> Cannot create inner bean '(inner bean)' of type
> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
> while setting constructor argument with key [0]; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
> type
> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot resolve reference to bean
> 'expressionHandler' while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:452)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
> at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
> ... 40 more
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name
> 'org.springframework.security.methodSecurityMetadataSourceAdvisor': Cannot
> resolve reference to bean
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0'
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
> Cannot create inner bean '(inner bean)' of type
> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
> while setting constructor argument with key [0]; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
> type
> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot resolve reference to bean
> 'expressionHandler' while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
> at
> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
> at
> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
> at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
> at
> org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:86)
> at
> org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:100)
> at
> org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:84)
> at
> org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:107)
> at
> org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:278)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:880)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:852)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:446)
> ... 45 more
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name
> 'org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource#0':
> Cannot create inner bean '(inner bean)' of type
> [org.springframework.security.access.prepost.PrePostAnnotationSecurityMetadataSource]
> while setting constructor argument with key [0]; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot create inner bean '(inner bean)' of
> type
> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot resolve reference to bean
> 'expressionHandler' while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:353)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:153)
> at
> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
> at
> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
> at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
> ... 64 more
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name '(inner bean)': Cannot create inner bean '(inner
> bean)' of type
> [org.springframework.security.access.expression.method.ExpressionBasedAnnotationAttributeFactory]
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name '(inner bean)': Cannot resolve reference to bean
> 'expressionHandler' while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:125)
> at
> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
> at
> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
> ... 78 more
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name '(inner bean)': Cannot resolve reference to bean
> 'expressionHandler' while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
> at
> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
> at
> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
> ... 86 more
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name 'expressionHandler' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'permissionEvaluator'
> while setting bean property 'permissionEvaluator'; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'permissionEvaluator' defined in class path resource
> [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService' while
> setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1360)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1118)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
> at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
> ... 94 more
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name 'permissionEvaluator' defined in class path
> resource [kylinSecurity.xml]: Cannot resolve reference to bean 'aclService'
> while setting constructor argument; nested exception is
> org.springframework.beans.factory.BeanCreationException: Error creating
> bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
> at
> org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:630)
> at
> org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1035)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:939)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
> at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
> ... 104 more
> Caused by: org.springframework.beans.factory.BeanCreationException: Error
> creating bean with name 'aclService' defined in file
> [/root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin/WEB-INF/classes/org/apache/kylin/rest/service/AclService.class]:
> Instantiation of bean failed; nested exception is
> org.springframework.beans.BeanInstantiationException: Could not instantiate
> bean class [org.apache.kylin.rest.service.AclService]: Constructor threw
> exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:997)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:943)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
> at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
> at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
> at
> org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
> ... 116 more
> Caused by: org.springframework.beans.BeanInstantiationException: Could not
> instantiate bean class [org.apache.kylin.rest.service.AclService]:
> Constructor threw exception; nested exception is
> org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
> at
> org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
> at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
> ... 124 more
> Caused by: org.apache.hadoop.hbase.security.AccessDeniedException:
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
> at
> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
> at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:230)
> at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:244)
> at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
> at
> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3390)
> at
> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:408)
> at
> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:95)
> at
> org.apache.kylin.common.persistence.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:86)
> at org.apache.kylin.rest.service.AclService.<init>(AclService.java:127)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
> ... 126 more
> Caused by:
> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException):
> org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient
> permissions for user 'root (auth:SIMPLE)',action: ADMIN,
> tableName:kylin_metadata_acl, family:null,column: null
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.requirePermission(XaSecureAuthorizationCoprocessor.java:353)
> at
> com.xasecure.authorization.hbase.XaSecureAuthorizationCoprocessor.preGetTableDescriptors(XaSecureAuthorizationCoprocessor.java:930)
> at
> org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetTableDescriptors(MasterCoprocessorHost.java:1536)
> at
> org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2746)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40438)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
>
> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1538)
> at
> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1724)
> at
> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1777)
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.getTableDescriptors(MasterProtos.java:42525)
> at
> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$5.getTableDescriptors(ConnectionManager.java:2165)
> at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:414)
> at org.apache.hadoop.hbase.client.HBaseAdmin$1.call(HBaseAdmin.java:409)
> at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
> ... 136 more
> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
> startInternal
> SEVERE: Error listenerStart
> Dec 07, 2015 11:11:05 AM org.apache.catalina.core.StandardContext
> startInternal
> SEVERE: Context [/kylin] startup failed due to previous errors
> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
> clearReferencesThreads
> SEVERE: The web application [/kylin] appears to have started a thread
> named [localhost-startStop-1-SendThread(sandbox.hortonworks.com:2181)]
> but has failed to stop it. This is very likely to create a memory leak.
> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
> clearReferencesThreads
> SEVERE: The web application [/kylin] appears to have started a thread
> named [localhost-startStop-1-EventThread] but has failed to stop it. This
> is very likely to create a memory leak.
> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
> clearReferencesThreads
> SEVERE: The web application [/kylin] appears to have started a thread
> named [Thread-6] but has failed to stop it. This is very likely to create a
> memory leak.
> Dec 07, 2015 11:11:05 AM org.apache.catalina.loader.WebappClassLoader
> clearReferencesThreads
> SEVERE: The web application [/kylin] appears to have started a thread
> named [IPC Client (514096504) connection to
> sandbox.hortonworks.com/10.0.2.15:60000 from root] but has failed to stop
> it. This is very likely to create a memory leak.
> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.HostConfig deployWAR
> INFO: Deployment of web application archive
> /root/bin/apache-kylin-1.1.1-incubating-bin/tomcat/webapps/kylin.war has
> finished in 15,925 ms
> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
> INFO: Starting ProtocolHandler ["http-bio-7070"]
> Dec 07, 2015 11:11:05 AM org.apache.coyote.AbstractProtocol start
> INFO: Starting ProtocolHandler ["ajp-bio-9009"]
> Dec 07, 2015 11:11:05 AM org.apache.catalina.startup.Catalina start
> INFO: Server startup in 15987 ms
>
> ##############################################################################
>
> What am I missing?
>
> Kind regards
> Veli K. Celik
>
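For reference: the root cause in the trace above is the Ranger (XaSecure) HBase
coprocessor denying the ADMIN action on table kylin_metadata_acl to user
'root (auth:SIMPLE)'. Because of that, Kylin's AclService bean cannot be
constructed, the /kylin web context fails to start, and the blank page at
http://localhost:7070/kylin follows. Below is a minimal sketch (not taken from
this thread) of the kind of grant that is typically missing, assuming
HBase-level grants are honoured on this sandbox; with the Ranger HBase plugin
enabled, the equivalent permission may instead need to be added as a policy for
user 'root' in the Ranger admin UI. The table name kylin_metadata is the usual
Kylin metastore table and is included here only as an assumption.

########################## hbase shell (sketch, not from this thread) #########
# Open the HBase shell as a user that is allowed to administer grants:
hbase shell

# Grant root full rights cluster-wide
# (R=read, W=write, X=execute, C=create, A=admin):
grant 'root', 'RWXCA'

# Or scope the grant to the Kylin metadata tables only:
grant 'root', 'RWXCA', 'kylin_metadata'       # assumed default metastore table
grant 'root', 'RWXCA', 'kylin_metadata_acl'   # the table named in the trace
##############################################################################

After adjusting the permissions, restart Kylin (kylin.sh stop && kylin.sh
start) so the AclService is created again against the now-accessible tables.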



-- 
Thanks and Regards

Sudeep Dey
Zaloni, Inc. | www.zaloni.com
633 Davis Drive, Suite 450
Durham, NC 27713
e: s <jb...@zaloni.com>dey@zaloni.com


"This e-mail, including attachments, may include confidential and/or
proprietary information, and may be used only by the person or entity
to which it is addressed. If the reader of this e-mail is not the intended
recipient or his or her authorized agent, the reader is hereby notified
that any dissemination, distribution or copying of this e-mail is
prohibited. If you have received this e-mail in error, please notify the
sender by replying to this message and delete this e-mail immediately."