Posted to user@ranger.apache.org by Mahesh Sankaran <sa...@gmail.com> on 2015/01/13 13:13:06 UTC

Hdfs agent not created

Hi all,

I have successfully configured Ranger Admin and Usersync. Now I am trying to
configure the HDFS plugin. My steps are as follows:

1. Created repository testhdfs.
2. cd /usr/local
3. sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
4. sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
5. cd ranger-hdfs-plugin
6. vi install.properties
     POLICY_MGR_URL=http://IP:6080
     REPOSITORY_NAME=testhdfs
     XAAUDIT.DB.HOSTNAME=localhost
     XAAUDIT.DB.DATABASE_NAME=ranger
     XAAUDIT.DB.USER_NAME=rangerlogger
     XAAUDIT.DB.PASSWORD=rangerlogger
7. cd /usr/local/hadoop
8. ln -s /usr/local/hadoop/etc/hadoop conf
9. export HADOOP_HOME=/usr/local/hadoop
10. cd /usr/local/ranger-hdfs-plugin
11. ./enable-hdfs-plugin.sh
12. cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
13. vi xasecure-audit.xml
     <property>
       <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
       <value>jdbc:mysql://localhost/ranger</value>
     </property>
     <property>
       <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
       <value>rangerlogger</value>
     </property>
     <property>
       <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
       <value>rangerlogger</value>
     </property>
14. Restarted Hadoop.
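
For reference, steps 2-12 condensed into one shell sketch (paths match my layout above; the install.properties and xasecure-audit.xml edits in steps 6 and 13 are still manual):

  cd /usr/local
  sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
  sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
  cd /usr/local/hadoop
  ln -s /usr/local/hadoop/etc/hadoop conf
  export HADOOP_HOME=/usr/local/hadoop
  cd /usr/local/ranger-hdfs-plugin
  ./enable-hdfs-plugin.sh
  cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
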
When I check the Ranger Admin web interface -> Audit -> Agents,
no agent has been created. Have I missed any steps?

*NOTE: I am not using HDP.*

*Here is my xa_portal.log:*

2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_default.properties]
2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_system.properties]
2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_custom.properties]
2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_ldap.properties]
2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
 org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [db_message_bundle.properties]
2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
 org.apache.ranger.security.listener.SpringEventListener
(SpringEventListener.java:69) - Login Successful:admin | Ip
Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
user
2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
requestId=10.10.10.53
2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
 org.apache.ranger.service.filter.RangerRESTAPIFilter
(RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
 org.apache.ranger.service.filter.RangerRESTAPIFilter
(RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
 org.apache.ranger.rest.UserREST (UserREST.java:186) -
create:nfsnobody@bigdata
2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_default.properties]
2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_system.properties]
2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_custom.properties]
2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_ldap.properties]
2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
 org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [db_message_bundle.properties]
2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
 org.apache.ranger.service.filter.RangerRESTAPIFilter
(RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
 org.apache.ranger.service.filter.RangerRESTAPIFilter
(RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
 org.apache.ranger.security.listener.SpringEventListener
(SpringEventListener.java:69) - Login Successful:admin | Ip
Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
user
2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
requestId=10.10.10.53
2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
 org.apache.ranger.security.listener.SpringEventListener
(SpringEventListener.java:69) - Login Successful:admin | Ip
Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
user
2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
requestId=10.10.10.53
2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
 apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
Login: security not enabled, using username
2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
 org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
RangerDaoManager.getEntityManager(loggingPU)
2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
 apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
Login: security not enabled, using username
2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
 org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
 org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
 org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
 org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
 org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
 org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
 org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
 org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
permission on parent folder. Do you want to save this policy?
javax.ws.rs.WebApplicationException
at
org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
at
org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
at
org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
at
org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
at
org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at
org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
at
org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
at
com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
at
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at
com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
at
com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at
org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at
org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at
org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at
org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
at
org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
at
org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at
org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:744)
2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
 org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
Validation error:logMessage=null,
response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
msgDesc={Mahesh may not have read permission on parent folder. Do you want
to save this policy?}
messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
rbKey={xa.error.oper_no_permission} message={User doesn't have permission
to perform this operation} objectId={null} fieldName={parentPermission} }]}
}
javax.ws.rs.WebApplicationException
at
org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
at
org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
at
org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
at
org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
at
org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at
org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
at
org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
at
com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
at
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at
com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
at
com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at
org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at
org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at
org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at
org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
at
org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
at
org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at
org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:744)
2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
 org.apache.ranger.security.listener.SpringEventListener
(SpringEventListener.java:69) - Login Successful:admin | Ip
Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
user
2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
requestId=10.10.10.53
2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_default.properties]
2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_system.properties]
2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_custom.properties]
2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [xa_ldap.properties]
2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
 org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
 org.springframework.core.io.support.PropertiesLoaderSupport
(PropertiesLoaderSupport.java:177) - Loading properties file from class
path resource [db_message_bundle.properties]
2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
 org.apache.ranger.security.listener.SpringEventListener
(SpringEventListener.java:69) - Login Successful:admin | Ip
Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
user
2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
requestId=10.10.10.53
2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
 org.apache.ranger.service.filter.RangerRESTAPIFilter
(RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
 org.apache.ranger.service.filter.RangerRESTAPIFilter
(RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
RangerDaoManager.getEntityManager(loggingPU)
2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
 org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
permission on parent folder. Do you want to save this policy?
javax.ws.rs.WebApplicationException
at
org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
at
org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
at
org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
at
org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
at
org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at
org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
at
org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
at
com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
at
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at
com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
at
com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at
org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at
org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at
org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at
org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
at
org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
at
org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at
org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:744)
2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
 org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
Validation error:logMessage=null,
response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
msgDesc={Mahesh may not have read permission on parent folder. Do you want
to save this policy?}
messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
rbKey={xa.error.oper_no_permission} message={User doesn't have permission
to perform this operation} objectId={null} fieldName={parentPermission} }]}
}
javax.ws.rs.WebApplicationException
at
org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
at
org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
at
org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
at
org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
at
org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at
org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
at
org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
at
com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
at
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at
com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
at
com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at
org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at
org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at
org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at
org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at
org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
at
org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
at
org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at
org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:744)
2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
 org.apache.ranger.security.listener.SpringEventListener
(SpringEventListener.java:69) - Login Successful:admin | Ip
Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
user
2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
 org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
requestId=10.10.10.53

Thanks
Mahesh.S

Re: Hdfs agent not created

Posted by Ramesh Mani <rm...@hortonworks.com>.
Hi Mahesh,

Looking at the NameNode process, I see that the ranger-hdfs plugin agent is not injected. You would see something like "-javaagent:…/ranger-hdfs-plugin-0.4.0.jar" in the process command line if it had been injected successfully.
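
A quick way to check (a minimal sketch; the grep pattern is only illustrative):

  # look for the Ranger agent flag in the NameNode command line
  ps -ef | grep namenode | grep -o 'javaagent[^ ]*'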

Now I want you to check whether the script "set-hdfs-plugin-env.sh" is present in the hadoop/conf directory. If it is, then when you restart the NameNode (assuming you are using hadoop/sbin/hadoop-daemon.sh to restart), it should be sourced from the hadoop/libexec/hadoop-config.sh file.

You need to check that hadoop/libexec/hadoop-config.sh has a reference to set-hdfs-plugin-env.sh; it will be the last line of the script. Debugging these scripts will help you find out why the hdfs-agent is not coming up. The main thing is that the sourcing of this script should happen when you restart the NameNode.
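
For illustration, the hook at the end of hadoop-config.sh would look something like this sketch (the exact wording in your build may differ; treat it as an assumption to verify, not the literal line):

  # hypothetical tail of hadoop/libexec/hadoop-config.sh
  if [ -f "${HADOOP_CONF_DIR}/set-hdfs-plugin-env.sh" ]; then
    . "${HADOOP_CONF_DIR}/set-hdfs-plugin-env.sh"   # sets the -javaagent opts
  fi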

Try debugging with this info and send the output.

Regards,
Ramesh



On Jan 15, 2015, at 8:38 PM, Mahesh Sankaran <sa...@gmail.com> wrote:

> Hi Ramesh,
> 
>                 Here is the output of  ps -ef | grep namenode.
>  
> bigdata: nodemanager running as process 13470. Stop it first.
> [hadoop2@bigdata ~]$  ps -ef | grep namenode
> hadoop2    419 32264  0 09:43 pts/0    00:00:00 grep namenode
> hadoop2  12842     1  0 Jan14 ?        00:03:03 /usr/lib/java/bin/java -Dproc_namenode -Xmx1000m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop2 -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/local/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop-hadoop2-namenode-bigdata.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop2 -Dhadoop.root.logger=INFO,RFA -Djava.library.path=/usr/local/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS org.apache.hadoop.hdfs.server.namenode.NameNode
> hadoop2  13167     1  0 Jan14 ?        00:01:21 /usr/lib/java/bin/java -Dproc_secondarynamenode -Xmx1000m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop2 -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/local/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop-hadoop2-secondarynamenode-bigdata.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop2 -Dhadoop.root.logger=INFO,RFA -Djava.library.path=/usr/local/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode
> 
> Thanks 
> Mahesh.S
> 
> On Wed, Jan 14, 2015 at 10:34 PM, Ramesh Mani <rm...@hortonworks.com> wrote:
> Mahesh,
> 
> I believe /usr/local/hadoop/lib/ is the path where it is checking for the ranger*.jar files.
> 
> Just to confirm, can you post the ps -ef | grep namenode output?
> 
> If that is the case, can you change the classpath to point to where you have the ranger*.jar files and restart the NameNode?
> 
> Regards,
> Ramesh
> 
> On Jan 14, 2015, at 3:28 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
> 
>> Hi Gautam,
>> I debugged the set-hdfs-plugin-env.sh script; it returns the following:
>> -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent
>> -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent
>> 
>> Thanks
>> Mahesh.S
>> 
>> 
>> On Wed, Jan 14, 2015 at 4:54 PM, Gautam Borad <gb...@gmail.com> wrote:
>> It is not guaranteed that the values will be preserved in your current bash session. Please try to put an echo statement in the set-hdfs-plugin-env.sh script to debug.
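>> For example, something like this at the end of the script (an illustrative sketch; these are the variables the script is expected to set):
>> 
>>   echo "HADOOP_NAMENODE_OPTS=${HADOOP_NAMENODE_OPTS}"
>>   echo "HADOOP_SECONDARYNAMENODE_OPTS=${HADOOP_SECONDARYNAMENODE_OPTS}"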
>> 
>> 
>> On Wed, Jan 14, 2015 at 4:35 PM, Mahesh Sankaran <sa...@gmail.com> wrote:
>> Hi Gautam and Hanish,
>> 
>> Thank you for the quick reply. The echo statements for HADOOP_NAMENODE_OPTS and
>> HADOOP_SECONDARYNAMENODE_OPTS did not return any values.
>> 
>> [root@bigdata conf]# echo $HADOOP_SECONDARYNAMENODE_OPTS
>> 
>> [root@bigdata conf]# echo $HADOOP_NAMENODE_OPTS
>> 
>> [root@bigdata conf]# 
>> 
>> 
>> Thanks
>> Mahesh.S
>> 
>> On Wed, Jan 14, 2015 at 4:15 PM, Gautam Borad <gb...@gmail.com> wrote:
>> @Hanish/Ramesh, if we check the logs carefully, we see that the Ranger libs are getting loaded in the classpath:
>> 
>> /usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar
>> 
>> @Mahesh, I suspect some other problem. Can you put echo statements in the set-hdfs-plugin-env.sh script and debug it? Ideally, after the script is executed, HADOOP_NAMENODE_OPTS and HADOOP_SECONDARYNAMENODE_OPTS should contain the -javaagent line.
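>> 
>> For example, after the script runs you would expect something like this (an illustrative sketch; the jar path is assumed from the layout used earlier in this thread):
>> 
>>   $ . hadoop/conf/set-hdfs-plugin-env.sh
>>   $ echo $HADOOP_NAMENODE_OPTS
>>   -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent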
>> 
>> 
>> 
>> On Wed, Jan 14, 2015 at 3:46 PM, Hanish Bansal <ha...@impetus.co.in> wrote:
>> Hi Mahesh,
>> 
>> Could you try one thing: copy all the jar files from ${hadoop_home}/lib to the Hadoop share directory.
>> 
>> 
>> $ cp <hadoop-home>/lib/* <hadoop-home>/share/hadoop/hdfs/lib/
>> 
>> The issue may be that Hadoop is not able to pick up the Ranger jars from the lib directory.
>> 
>> After copying the jars, restart Hadoop and check whether the agent has started.
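>> 
>> For example (a sketch using the daemon script mentioned elsewhere in this thread):
>> 
>>   $ /usr/local/hadoop/sbin/hadoop-daemon.sh stop namenode
>>   $ /usr/local/hadoop/sbin/hadoop-daemon.sh start namenode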
>> 
>> 
>> -------
>> Thanks & Regards,
>> Hanish Bansal
>> Software Engineer, iLabs
>> Impetus Infotech Pvt. Ltd.
>> 
>> From: Mahesh Sankaran <sa...@gmail.com>
>> Sent: Wednesday, January 14, 2015 3:33 PM
>> To: user@ranger.incubator.apache.org
>> Subject: Re: Hdfs agent not created
>>  
>> Hi Ramesh,
>> ranger*.jar is added to the classpath; I can see it in the hadoop/lib directory. Can you tell me the meaning of the following error?
>> 
>> 2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
>> 
>> thanks
>> 
>> Mahesh.S
>> 
>> 
>> On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com> wrote:
>> Hi Mahesh,
>> 
>> This exception is related to the DataNode not coming up for some reason; the Ranger plugin, however, lives in the NameNode.
>> 
>> Do you see the NameNode and SecondaryNameNode running after installing Ranger and restarting them?
>> 
>> In the classpath of the NameNode I don't see any ranger*.jar. Do you have it in the hadoop/lib directory?
>> 
>> Also, can I get the details of xasecure-hdfs-security.xml from the conf directory?
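>> 
>> For example (a quick check, assuming the JDK's jps tool is on the PATH):
>> 
>>   $ jps | egrep 'NameNode|SecondaryNameNode'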
>> 
>> Regards,
>> Ramesh
>> 
>> On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <sa...@gmail.com> wrote:
>> 
>>> Hi Gautam,
>>> 
>>> Now I am seeing the following exception. Could this be causing the problem?
>>> 
>>> 2015-01-14 11:41:23,102 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
>>> java.io.EOFException: End of File Exception between local host is: "bigdata/10.10.10.63"; destination host is: "bigdata":9000; : java.io.EOFException; For more details see:  http://wiki.apache.org/hadoop/EOFException
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>> at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
>>> at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
>>> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
>>> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
>>> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
>>> at java.lang.Thread.run(Thread.java:744)
>>> Caused by: java.io.EOFException
>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>> 2015-01-14 11:41:25,981 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
>>> 2015-01-14 11:41:25,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
>>> ************************************************************/
>>> 2015-01-14 11:42:03,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
>>> /************************************************************
>>> 
>>> Thanks
>>> Mahesh.S
>>> 
>>> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
>>> Hi Gautam,
>>> 
>>> Here is my NameNode log. Kindly take a look:
>>> 
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>> ************************************************************/
>>> 2015-01-14 11:01:27,345 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG: 
>>> /************************************************************
>>> STARTUP_MSG: Starting NameNode
>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 2.6.0
>>> STARTUP_MSG:   classpath = /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop
/common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zo
okeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-
core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
>>> STARTUP_MSG:   java = 1.7.0_45
>>> ************************************************************/
>>> 2015-01-14 11:01:27,363 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
>>> 2015-01-14 11:01:27,368 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>> 2015-01-14 11:01:28,029 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
>>> 2015-01-14 11:01:28,205 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
>>> 2015-01-14 11:01:28,205 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
>>> 2015-01-14 11:01:28,209 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://bigdata:9000
>>> 2015-01-14 11:01:28,209 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use bigdata:9000 to access this namenode/service.
>>> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
>>> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>>> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
>>> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
>>> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
>>> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
>>> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
>>> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>> 2015-01-14 11:01:30,093 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
>>> 2015-01-14 11:01:30,093 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
>>> 2015-01-14 11:01:30,184 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>> 2015-01-14 11:01:30,196 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>> 2015-01-14 11:01:30,262 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
>>> 2015-01-14 11:01:30,262 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
>>> 2015-01-14 11:01:30,266 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>> 2015-01-14 11:01:30,268 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2015 Jan 14 11:01:30
>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity      = 2^21 = 2097152 entries
>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
>>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
>>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
>>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
>>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
>>> 2015-01-14 11:01:30,298 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
>>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
>>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
>>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>> 2015-01-14 11:01:30,302 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity      = 2^20 = 1048576 entries
>>> 2015-01-14 11:01:30,648 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity      = 2^18 = 262144 entries
>>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
>>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
>>> 2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
>>> 2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
>>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
>>> 2015-01-14 11:01:30,729 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename 11417@bigdata
>>> 2015-01-14 11:01:30,963 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>> 2015-01-14 11:01:31,065 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094 -> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>> 2015-01-14 11:01:31,210 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2 INodes.
>>> 2015-01-14 11:01:31,293 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
>>> 2015-01-14 11:01:31,293 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83 from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>> 2015-01-14 11:01:31,294 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5 expecting start txid #84
>>> 2015-01-14 11:01:31,294 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>> 2015-01-14 11:01:31,299 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085' to transaction ID 84
>>> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972 expecting start txid #86
>>> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087' to transaction ID 84
>>> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b expecting start txid #88
>>> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089' to transaction ID 84
>>> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe expecting start txid #90
>>> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091' to transaction ID 84
>>> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09 expecting start txid #92
>>> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093' to transaction ID 84
>>> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b expecting start txid #94
>>> 2015-01-14 11:01:31,308 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>> 2015-01-14 11:01:31,308 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094' to transaction ID 84
>>> 2015-01-14 11:01:31,313 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094 of size 1048576 edits # 1 loaded in 0 seconds
>>> 2015-01-14 11:01:31,317 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>> 2015-01-14 11:01:31,346 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
>>> 2015-01-14 11:01:31,904 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
>>> 2015-01-14 11:01:31,904 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 1216 msecs
>>> 2015-01-14 11:01:32,427 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to bigdata:9000
>>> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
>>> 2015-01-14 11:01:32,568 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState MBean
>>> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
>>> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
>>> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing replication queues
>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 2 secs
>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
>>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
>>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
>>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
>>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
>>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
>>> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 52 msec
>>> 2015-01-14 11:01:32,676 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: bigdata/10.10.10.63:9000
>>> 2015-01-14 11:01:32,676 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
>>> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
>>> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
>>> 2015-01-14 11:01:32,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
>>> 2015-01-14 11:01:32,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 4192060 milliseconds
>>> 2015-01-14 11:01:32,704 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
>>> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0) storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>> 2015-01-14 11:01:38,039 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
>>> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/10.10.10.63:50010
>>> 2015-01-14 11:01:38,557 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
>>> 2015-01-14 11:01:38,562 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN 10.10.10.63:50010
>>> 2015-01-14 11:01:38,692 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
>>> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 9 msecs
>>> 2015-01-14 11:02:02,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
>>> 2015-01-14 11:02:02,698 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>> 2015-01-14 11:02:21,288 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
>>> 2015-01-14 11:02:21,291 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG: 
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>> ************************************************************/
>>> 2015-01-14 11:03:02,845 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG: 
>>> /************************************************************
>>> STARTUP_MSG: Starting NameNode
>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 2.6.0
>>> STARTUP_MSG:   classpath = [identical to the classpath in the 11:01:27 startup above; it again includes ranger-hdfs-plugin-0.4.0.jar, ranger-plugins-common-0.4.0.jar, ranger-plugins-audit-0.4.0.jar, ranger-plugins-cred-0.4.0.jar and ranger-plugins-impl-0.4.0.jar under share/hadoop/hdfs/lib]
>>> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
>>> STARTUP_MSG:   java = 1.7.0_45
>>> ************************************************************/
>>> 2015-01-14 11:03:02,861 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
>>> 2015-01-14 11:03:02,866 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>> 2015-01-14 11:03:03,521 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
>>> 2015-01-14 11:03:03,697 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
>>> 2015-01-14 11:03:03,697 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
>>> 2015-01-14 11:03:03,700 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://bigdata:9000
>>> 2015-01-14 11:03:03,701 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use bigdata:9000 to access this namenode/service.
>>> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
>>> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>>> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
>>> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
>>> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
>>> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
>>> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
>>> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>> 2015-01-14 11:03:05,645 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
>>> 2015-01-14 11:03:05,645 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
>>> 2015-01-14 11:03:05,746 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>> 2015-01-14 11:03:05,761 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>> 2015-01-14 11:03:05,837 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
>>> 2015-01-14 11:03:05,837 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
>>> 2015-01-14 11:03:05,841 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>> 2015-01-14 11:03:05,843 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2015 Jan 14 11:03:05
>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>>> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
>>> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity      = 2^21 = 2097152 entries
>>> 2015-01-14 11:03:05,864 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
>>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
>>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
>>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
>>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
>>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
>>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
>>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
>>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
>>> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
>>> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
>>> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
>>> 2015-01-14 11:03:05,875 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>> 2015-01-14 11:03:05,878 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity      = 2^20 = 1048576 entries
>>> 2015-01-14 11:03:06,284 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity      = 2^18 = 262144 entries
>>> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
>>> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
>>> 2015-01-14 11:03:06,304 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
>>> 2015-01-14 11:03:06,304 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
>>> 2015-01-14 11:03:06,317 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>> 2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>> 2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
>>> 2015-01-14 11:03:06,368 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename 13312@bigdata
>>> 2015-01-14 11:03:06,532 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>> 2015-01-14 11:03:06,622 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095 -> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>> 2015-01-14 11:03:06,807 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2 INodes.
>>> 2015-01-14 11:03:06,888 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
>>> 2015-01-14 11:03:06,888 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83 from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>> 2015-01-14 11:03:06,889 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972 expecting start txid #84
>>> 2015-01-14 11:03:06,889 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>> 2015-01-14 11:03:06,893 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085' to transaction ID 84
>>> 2015-01-14 11:03:06,897 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,897 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b expecting start txid #86
>>> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087' to transaction ID 84
>>> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe expecting start txid #88
>>> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089' to transaction ID 84
>>> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09 expecting start txid #90
>>> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091' to transaction ID 84
>>> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b expecting start txid #92
>>> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093' to transaction ID 84
>>> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093 of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b expecting start txid #94
>>> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094' to transaction ID 84
>>> 2015-01-14 11:03:06,907 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094 of size 1048576 edits # 1 loaded in 0 seconds
>>> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2 expecting start txid #95
>>> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095' to transaction ID 84
>>> 2015-01-14 11:03:07,266 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095 of size 1048576 edits # 1 loaded in 0 seconds
>>> 2015-01-14 11:03:07,274 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>> 2015-01-14 11:03:07,313 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
>>> 2015-01-14 11:03:07,558 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
>>> 2015-01-14 11:03:07,559 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 1240 msecs
>>> 2015-01-14 11:03:08,011 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to bigdata:9000
>>> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
>>> 2015-01-14 11:03:08,151 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState MBean
>>> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
>>> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
>>> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing replication queues
>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 2 secs
>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 18 msec
>>> 2015-01-14 11:03:08,322 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: bigdata/10.10.10.63:9000
>>> 2015-01-14 11:03:08,322 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
>>> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
>>> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
>>> 2015-01-14 11:03:08,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
>>> 2015-01-14 11:03:08,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 4287712 milliseconds
>>> 2015-01-14 11:03:08,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0) storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>> 2015-01-14 11:03:13,244 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
>>> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/10.10.10.63:50010
>>> 2015-01-14 11:03:13,743 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
>>> 2015-01-14 11:03:13,750 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN 10.10.10.63:50010
>>> 2015-01-14 11:03:13,959 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
>>> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 11 msecs
>>> 2015-01-14 11:03:38,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
>>> 2015-01-14 11:03:38,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
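>>> 
>>> Note that the namenode comes up cleanly, but I don't see any xasecure/Ranger agent messages anywhere in the startup log. A couple of quick checks on my side (paths assumed from my install steps above):
>>> 
>>>     # was anything Ranger-related copied into the hdfs classpath?
>>>     ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -iE 'xasecure|ranger'
>>>     # did enable-hdfs-plugin.sh wire anything into the hadoop config?
>>>     grep -Ri 'xasecure' /usr/local/hadoop/etc/hadoop/ | head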
>>> 
>>> 
>>> Thanks
>>> Mahesh.S
>>> 
>>> 
>>> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com> wrote:
>>> Hi Mahesh,
>>>     We will need the namenode logs to debug this further. Can you restart the namenode and paste its logs somewhere for us to analyze? Thanks.
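>>> 
>>>     In case it helps, a minimal way to do that with a plain tarball install (assuming HADOOP_HOME=/usr/local/hadoop as in your steps) is:
>>> 
>>>         # restart just the namenode and capture its startup log
>>>         /usr/local/hadoop/sbin/hadoop-daemon.sh stop namenode
>>>         /usr/local/hadoop/sbin/hadoop-daemon.sh start namenode
>>>         tail -n 500 /usr/local/hadoop/logs/hadoop-*-namenode-*.log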
>>> 
>>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
>>> Hi Ramesh,
>>> 
>>>                   I didn't see any exceptions in the hdfs logs. My problem is that the agent for hdfs is not created.
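>>> 
>>> One thing I can also rule out is the audit database connection the plugin is configured with (credentials as in my install.properties); if this fails, the agent has no way to write audit records:
>>> 
>>>     mysql -u rangerlogger -prangerlogger -h localhost ranger -e "show tables;"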
>>> 
>>> Regards,
>>> Mahesh.S
>>> 
>>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com> wrote:
>>> Hi Mahesh,
>>> 
>>> The error you are seeing is just a notice that the parent folder of the resource doesn't grant read permission to the user for whom you are creating the policy.
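>>> 
>>> For example (paths and user here are purely illustrative), you could inspect and, if needed, open up the parent folder before saving the policy:
>>> 
>>>     # suppose the policy resource is /data/project1 and the policy user is mahesh
>>>     hdfs dfs -ls /data             # check the parent folder's permissions
>>>     hdfs dfs -chmod o+rx /data     # run as the HDFS superuser of your install
>>> 
>>> Or simply confirm the prompt and save the policy anyway; it is a warning, not an error.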
>>> 
>>> When you start the hdfs namenode and secondary namenode, do you see any exceptions in the hdfs logs?
>>> 
>>> Regards,
>>> Ramesh
>>> 
>>> On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
>>> 
>>>> Hi all,
>>>> 
>>>> I successfully configured ranger admin and usersync. Now I am trying to configure the hdfs plugin. My steps are the following:
>>>> 
>>>> 1.Created repository testhdfs.
>>>> 2.cd /usr/local
>>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>>> 5.cd ranger-hdfs-plugin
>>>> 6.vi install.properties
>>>>  POLICY_MGR_URL=http://IP:6080
>>>>           REPOSITORY_NAME=testhdfs
>>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>>> 7.cd /usr/local/hadoop
>>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>>> 10.cd /usr/local/ranger-hdfs-plugin
>>>> 11. ./enable-hdfs-plugin.sh
>>>> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>> 13.vi xasecure-audit.xml
>>>>   <property>
>>>>     <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>>     <value>jdbc:mysql://localhost/ranger</value>
>>>>   </property>
>>>>   <property>
>>>>     <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>>     <value>rangerlogger</value>
>>>>   </property>
>>>>   <property>
>>>>     <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>>     <value>rangerlogger</value>
>>>>   </property>
>>>> 14.Restarted hadoop
>>>> When I look at the Ranger Admin web interface -> Audit -> Agents,
>>>> the agent is not created. Did I miss any steps?
>>>> 
>>>> NOTE: I am not using HDP.
>>>> 
>>>> here is my xa_portal.log
>>>> 
>>>> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12, requestId=10.10.10.53
>>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO  org.apache.ranger.rest.UserREST (UserREST.java:186) - create:nfsnobody@bigdata
>>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0, requestId=10.10.10.53
>>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD, requestId=10.10.10.53
>>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
>>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
>>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
>>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
>>>> javax.ws.rs.WebApplicationException
>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>> at java.lang.Thread.run(Thread.java:744)
>>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
>>>> javax.ws.rs.WebApplicationException
>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>> at java.lang.Thread.run(Thread.java:744)
>>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655, requestId=10.10.10.53
>>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A, requestId=10.10.10.53
>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
>>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
>>>> javax.ws.rs.WebApplicationException
>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>> at java.lang.Thread.run(Thread.java:744)
>>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
>>>> javax.ws.rs.WebApplicationException
>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>> at java.lang.Thread.run(Thread.java:744)
>>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264, requestId=10.10.10.53
>>>> 
>>>> Thanks
>>>> Mahesh.S
>>> 
>>> 
>>> -- 
>>> Regards,
>>> Gautam.
>>> 
>>> 
>> 
>> 
>> -- 
>> Regards,
>> Gautam.
>> 
>> 
>> 
>> 
> 
> 
> 



Re: Hdfs agent not created

Posted by Mahesh Sankaran <sa...@gmail.com>.
Hi Ramesh,

                Here is the output of ps -ef | grep namenode.

bigdata: nodemanager running as process 13470. Stop it first.
[hadoop2@bigdata ~]$  ps -ef | grep namenode
hadoop2    419 32264  0 09:43 pts/0    00:00:00 grep namenode
hadoop2  12842     1  0 Jan14 ?        00:03:03 /usr/lib/java/bin/java
-Dproc_namenode -Xmx1000m -Djava.net.preferIPv4Stack=true
-Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log
-Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop2
-Dhadoop.root.logger=INFO,console
-Djava.library.path=/usr/local/hadoop/lib/native
-Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true
-Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true
-Dhadoop.log.dir=/usr/local/hadoop/logs
-Dhadoop.log.file=hadoop-hadoop2-namenode-bigdata.log
-Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop2
-Dhadoop.root.logger=INFO,RFA
-Djava.library.path=/usr/local/hadoop/lib/native
-Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true
-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender
-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender
-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender
-Dhadoop.security.logger=INFO,RFAS
org.apache.hadoop.hdfs.server.namenode.NameNode
hadoop2  13167     1  0 Jan14 ?        00:01:21 /usr/lib/java/bin/java
-Dproc_secondarynamenode -Xmx1000m -Djava.net.preferIPv4Stack=true
-Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log
-Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop2
-Dhadoop.root.logger=INFO,console
-Djava.library.path=/usr/local/hadoop/lib/native
-Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true
-Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true
-Dhadoop.log.dir=/usr/local/hadoop/logs
-Dhadoop.log.file=hadoop-hadoop2-secondarynamenode-bigdata.log
-Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop2
-Dhadoop.root.logger=INFO,RFA
-Djava.library.path=/usr/local/hadoop/lib/native
-Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true
-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender
-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender
-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender
-Dhadoop.security.logger=INFO,RFAS
org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode

Thanks
Mahesh.S
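
Note that neither java command line above carries a -javaagent entry, which
is what actually loads the Ranger agent into the namenode JVM. A quick way
to check for it directly (a sketch, using the same ps invocation as above):

  # print any -javaagent arguments on the running namenode JVMs;
  # no output means the Ranger agent was not picked up
  ps -ef | grep -v grep | grep namenode | grep -o -- '-javaagent[^ ]*'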

On Wed, Jan 14, 2015 at 10:34 PM, Ramesh Mani <rm...@hortonworks.com> wrote:

> Mahesh,
>
> /usr/local/hadoop/lib/ is the path where it is checking for ranger*.jar, I
> feel.
>
> Just to confirm, can you post the ps -ef | grep namenode output?
>
> If that is the case, can you change it to the classpath where you have the
> ranger*.jar files and restart the namenode?
>
> Regards,
> Ramesh
>
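
A simple way to see which of the two candidate locations actually holds the
ranger jars (paths as used elsewhere in this thread):

  # a "No such file or directory" error reveals which location is missing them
  ls -l /usr/local/hadoop/lib/ranger* /usr/local/hadoop/share/hadoop/hdfs/lib/ranger*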
> On Jan 14, 2015, at 3:28 AM, Mahesh Sankaran <sa...@gmail.com>
> wrote:
>
> Hi Gautam,
>                 I debugged the set-hdfs-plugin-env.sh script; it returns the
> following:
> -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent
> -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent
>
> Thanks
> Mahesh.S
>
>
> On Wed, Jan 14, 2015 at 4:54 PM, Gautam Borad <gb...@gmail.com> wrote:
>
>> It is not guaranteed that the values will be preserved in your current
>> bash session. Please try to put an echo statement in the
>> set-hdfs-plugin-env.sh script to debug.
>>
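
A minimal sketch of the debug statements being suggested here, appended to
set-hdfs-plugin-env.sh (the variable names are the ones this thread says the
script is supposed to populate):

  # print what the script actually assembled
  echo "HADOOP_NAMENODE_OPTS=${HADOOP_NAMENODE_OPTS}"
  echo "HADOOP_SECONDARYNAMENODE_OPTS=${HADOOP_SECONDARYNAMENODE_OPTS}"

Keep in mind that a script changes the calling shell's environment only when
it is sourced (. ./set-hdfs-plugin-env.sh), not when it is run as a child
process, which is why the values may not survive in the current session.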
>>
>> On Wed, Jan 14, 2015 at 4:35 PM, Mahesh Sankaran <
>> sankarmahesh37@gmail.com> wrote:
>>
>>> Hi Gautam and Hanish,
>>>
>>>                     Thank you for the quick reply. The echo statements for
>>> *HADOOP_NAMENODE_OPTS* and
>>> *HADOOP_SECONDARYNAMENODE_OPTS* did not return any values.
>>>
>>> [root@bigdata conf]# echo $HADOOP_SECONDARYNAMENODE_OPTS
>>>
>>> [root@bigdata conf]# echo $HADOOP_NAMENODE_OPTS
>>>
>>> [root@bigdata conf]#
>>>
>>>
>>> Thanks
>>> Mahesh.S
>>>
>>> On Wed, Jan 14, 2015 at 4:15 PM, Gautam Borad <gb...@gmail.com> wrote:
>>>
>>>> @Hanish/Ramesh, If we check the logs properly, we see that ranger libs
>>>> are getting loaded in the class path :
>>>>
>>>> /usr/local/hadoop/
>>>>>
>>>>> share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar
>>>>>
>>>>
>>>> @Mahesh, I suspect some other problem. Can you put echo statements and
>>>> debug the set-hdfs-plugin-env.sh script? Ideally, after the script is
>>>> executed, *HADOOP_NAMENODE_OPTS* and *HADOOP_SECONDARYNAMENODE_OPTS*
>>>> should contain the -javaagent line.
>>>>
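
To double-check the classpath observation above independently, one can ask
hadoop for its effective classpath and filter for the ranger jars (note that
some entries may be unexpanded wildcards, so listing the lib directory as
well helps):

  # show every ranger entry hadoop puts on its classpath
  hadoop classpath | tr ':' '\n' | grep -i ranger
  # and confirm the jars are physically present
  ls /usr/local/hadoop/share/hadoop/hdfs/lib/ranger*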
>>>>
>>>>
>>>> On Wed, Jan 14, 2015 at 3:46 PM, Hanish Bansal <
>>>> hanish.bansal@impetus.co.in> wrote:
>>>>
>>>>>    Hi Mahesh,
>>>>>
>>>>>
>>>>> Could you try one thing: copy all the jar files from
>>>>> ${hadoop_home}/lib to the hadoop share directory.
>>>>>
>>>>>
>>>>>  $ cp <hadoop-home>/lib/* <hadoop-home>/share/hadoop/hdfs/lib/
>>>>>
>>>>>
>>>>> The issue may be that hadoop is not able to pick up the ranger jars
>>>>> from the lib directory.
>>>>>
>>>>>
>>>>> After copying the jars, restart hadoop and check if the agent is started.
>>>>>
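
For the restart, something like the following should do with the stock
Apache sbin scripts (a sketch, assuming the /usr/local/hadoop layout used
earlier in the thread):

  # stop and start HDFS so the namenode picks up the newly copied jars
  /usr/local/hadoop/sbin/stop-dfs.sh
  /usr/local/hadoop/sbin/start-dfs.sh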
>>>>>
>>>>>
>>>>>      -------
>>>>>
>>>>> *Thanks & Regards, Hanish Bansal*
>>>>> Software Engineer, iLabs
>>>>> Impetus Infotech Pvt. Ltd.
>>>>>
>>>>>      ------------------------------
>>>>> *From:* Mahesh Sankaran <sa...@gmail.com>
>>>>> *Sent:* Wednesday, January 14, 2015 3:33 PM
>>>>> *To:* user@ranger.incubator.apache.org
>>>>> *Subject:* Re: Hdfs agent not created
>>>>>
>>>>>  Hi Ramesh,
>>>>>                ranger*.jar is added in the classpath; I can see it in the
>>>>> hadoop/lib directory. Can I know the meaning of the following error?
>>>>>
>>>>>  2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR
>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>
>>>>>  thanks
>>>>>
>>>>>  Mahesh.S
>>>>>
>>>>>
>>>>> On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com>
>>>>> wrote:
>>>>>
>>>>>> Hi Mahesh,
>>>>>>
>>>>>>   This exception is related to the datanode not coming up for some
>>>>>> reason, but the Ranger plugin will be in the namenode.
>>>>>>
>>>>>>  Do you see the namenode and secondarynamenode running after the ranger
>>>>>> installation and after restarting the namenode and secondarynamenode?
>>>>>>
>>>>>>  In the classpath of the namenode I don't see any ranger*.jar; do
>>>>>> you have it in the hadoop/lib directory?
>>>>>>
>>>>>>  Also, can I get the details of xasecure-hdfs-security.xml from the
>>>>>> conf directory?
>>>>>>
>>>>>>  Regards,
>>>>>> Ramesh
>>>>>>
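
The file Ramesh is asking about can be dumped with a plain cat (assuming the
/usr/local/hadoop/conf directory that appears at the head of the namenode
classpath below):

  cat /usr/local/hadoop/conf/xasecure-hdfs-security.xml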
>>>>>>  On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <
>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>
>>>>>>  Hi Gautam,
>>>>>>
>>>>>>                  Now I am seeing the following exception. Is this
>>>>>> causing the problem?
>>>>>>
>>>>>>  2015-01-14 11:41:23,102 WARN
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
>>>>>> java.io.EOFException: End of File Exception between local host is:
>>>>>> "bigdata/10.10.10.63"; destination host is: "bigdata":9000; :
>>>>>> java.io.EOFException; For more details see:
>>>>>> http://wiki.apache.org/hadoop/EOFException
>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>>> Method)
>>>>>> at
>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>>> at
>>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>>>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>>>> at
>>>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>>>>> at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
>>>>>> at
>>>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
>>>>>> at
>>>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
>>>>>> at
>>>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
>>>>>> at
>>>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>> Caused by: java.io.EOFException
>>>>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>>>>> at
>>>>>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>>>> 2015-01-14 11:41:25,981 ERROR
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
>>>>>> 2015-01-14 11:41:25,984 INFO
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
>>>>>> /************************************************************
>>>>>> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
>>>>>> ************************************************************/
>>>>>> 2015-01-14 11:42:03,054 INFO
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
>>>>>> /************************************************************
>>>>>>
>>>>>>  Thanks
>>>>>> Mahesh.S
>>>>>>
>>>>>> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <
>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>
>>>>>>> Hi Gautam,
>>>>>>>
>>>>>>>                Here is my namenode log. Kindly see it.
>>>>>>>
>>>>>>>  /************************************************************
>>>>>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>>>>>> ************************************************************/
>>>>>>> 2015-01-14 11:01:27,345 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>>>>>> /************************************************************
>>>>>>> STARTUP_MSG: Starting NameNode
>>>>>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>>>>>> STARTUP_MSG:   args = []
>>>>>>> STARTUP_MSG:   version = 2.6.0
>>>>>>> STARTUP_MSG:   classpath =
>>>>>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-
2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/
local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/us
r/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>>>>>> STARTUP_MSG:   build =
>>>>>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>>>>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>>>>>> 2014-11-13T21:10Z
>>>>>>> STARTUP_MSG:   java = 1.7.0_45
>>>>>>> ************************************************************/
>>>>>>> 2015-01-14 11:01:27,363 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>>>>>> handlers for [TERM, HUP, INT]
>>>>>>> 2015-01-14 11:01:27,368 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>>>>>> 2015-01-14 11:01:28,029 INFO
>>>>>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>>>>>> hadoop-metrics2.properties
>>>>>>> 2015-01-14 11:01:28,205 INFO
>>>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>>>>>> period at 10 second(s).
>>>>>>> 2015-01-14 11:01:28,205 INFO
>>>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>>>>>> started
>>>>>>> 2015-01-14 11:01:28,209 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>>>>>> hdfs://bigdata:9000
>>>>>>> 2015-01-14 11:01:28,209 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>>>>>> bigdata:9000 to access this namenode/service.
>>>>>>> 2015-01-14 11:01:28,433 WARN
>>>>>>> org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop
>>>>>>> library for your platform... using builtin-java classes where applicable
>>>>>>> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil:
>>>>>>> Starting Web-server for hdfs at: http://0.0.0.0:50070
>>>>>>> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to
>>>>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>>>>> org.mortbay.log.Slf4jLog
>>>>>>> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog:
>>>>>>> Http request log for http.requests.namenode is not defined
>>>>>>> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added global filter 'safety'
>>>>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added filter static_user_filter
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>>> context hdfs
>>>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added filter static_user_filter
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>>> context static
>>>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added filter static_user_filter
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>>> context logs
>>>>>>> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>>>>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>>>>>> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> addJerseyResourcePackage:
>>>>>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>>>>>> pathSpec=/webhdfs/v1/*
>>>>>>> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Jetty bound to port 50070
>>>>>>> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
>>>>>>> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$
>>>>>>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>>>>>> 2015-01-14 11:01:30,093 WARN
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>>>>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>>>>>> lack of redundant storage directories!
>>>>>>> 2015-01-14 11:01:30,093 WARN
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>>>>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>>>>>> loss due to lack of redundant storage directories!
>>>>>>> 2015-01-14 11:01:30,184 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>>>>>> 2015-01-14 11:01:30,196 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>>>>>> 2015-01-14 11:01:30,262 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>>>> dfs.block.invalidate.limit=1000
>>>>>>> 2015-01-14 11:01:30,262 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>>>>>> 2015-01-14 11:01:30,266 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>>>>>> 2015-01-14 11:01:30,268 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>>>>>> deletion will start around 2015 Jan 14 11:01:30
>>>>>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing
>>>>>>> capacity for map BlocksMap
>>>>>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>>     = 64-bit
>>>>>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max
>>>>>>> memory 889 MB = 17.8 MB
>>>>>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>>    = 2^21 = 2097152 entries
>>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> dfs.block.access.token.enable=false
>>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> defaultReplication         = 1
>>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>>>>>             = 512
>>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>>>>>             = 1
>>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> maxReplicationStreams      = 2
>>>>>>> 2015-01-14 11:01:30,290 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> shouldCheckForEnoughRacks  = false
>>>>>>> 2015-01-14 11:01:30,290 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> replicationRecheckInterval = 3000
>>>>>>> 2015-01-14 11:01:30,290 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> encryptDataTransfer        = false
>>>>>>> 2015-01-14 11:01:30,290 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> maxNumBlocksToLog          = 1000
>>>>>>> 2015-01-14 11:01:30,298 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>>>>>> hadoop2 (auth:SIMPLE)
>>>>>>> 2015-01-14 11:01:30,299 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>>>>>> supergroup
>>>>>>> 2015-01-14 11:01:30,299 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>>>>>> true
>>>>>>> 2015-01-14 11:01:30,299 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>>>>>> 2015-01-14 11:01:30,302 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>>>>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing
>>>>>>> capacity for map INodeMap
>>>>>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>>     = 64-bit
>>>>>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max
>>>>>>> memory 889 MB = 8.9 MB
>>>>>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>>    = 2^20 = 1048576 entries
>>>>>>> 2015-01-14 11:01:30,648 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>>>>>> occuring more than 10 times
>>>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing
>>>>>>> capacity for map cachedBlocks
>>>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>>     = 64-bit
>>>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max
>>>>>>> memory 889 MB = 2.2 MB
>>>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>>    = 2^18 = 262144 entries
>>>>>>> 2015-01-14 11:01:30,669 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>>>>>> 2015-01-14 11:01:30,669 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>>> dfs.namenode.safemode.min.datanodes = 0
>>>>>>> 2015-01-14 11:01:30,669 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>>> dfs.namenode.safemode.extension     = 30000
>>>>>>> 2015-01-14 11:01:30,674 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>>>>>> namenode is enabled
>>>>>>> 2015-01-14 11:01:30,674 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>>>>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>>>>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing
>>>>>>> capacity for map NameNodeRetryCache
>>>>>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>>     = 64-bit
>>>>>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet:
>>>>>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>>>>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>>    = 2^15 = 32768 entries
>>>>>>> 2015-01-14 11:01:30,687 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>>>>>> 2015-01-14 11:01:30,687 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>>>>>> 2015-01-14 11:01:30,687 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>>>>>> 16384
>>>>>>> 2015-01-14 11:01:30,729 INFO
>>>>>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>>>>>> 11417@bigdata
>>>>>>> 2015-01-14 11:01:30,963 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>>>>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>>>>>> 2015-01-14 11:01:31,065 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>>>>>> file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094
>>>>>>> ->
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>>> 2015-01-14 11:01:31,210 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>>>>>> INodes.
>>>>>>> 2015-01-14 11:01:31,293 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>>>>>> FSImage in 0 seconds.
>>>>>>> 2015-01-14 11:01:31,293 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>>>>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>>>>>> 2015-01-14 11:01:31,294 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5
>>>>>>> expecting start txid #84
>>>>>>> 2015-01-14 11:01:31,294 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>>>> 2015-01-14 11:01:31,299 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:01:31,303 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:01:31,303 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>>>>>> expecting start txid #86
>>>>>>> 2015-01-14 11:01:31,303 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>>>> 2015-01-14 11:01:31,303 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:01:31,304 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:01:31,304 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>>>>>> expecting start txid #88
>>>>>>> 2015-01-14 11:01:31,304 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>>>> 2015-01-14 11:01:31,304 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:01:31,305 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:01:31,305 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>>>>>> expecting start txid #90
>>>>>>> 2015-01-14 11:01:31,305 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>>>> 2015-01-14 11:01:31,306 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:01:31,306 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:01:31,306 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>>>>>> expecting start txid #92
>>>>>>> 2015-01-14 11:01:31,306 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>>>> 2015-01-14 11:01:31,307 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:01:31,307 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:01:31,307 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>>>>>> expecting start txid #94
>>>>>>> 2015-01-14 11:01:31,308 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>>> 2015-01-14 11:01:31,308 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:01:31,313 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>>>> 2015-01-14 11:01:31,317 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>>>>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>>>>>> 2015-01-14 11:01:31,346 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
>>>>>>> 2015-01-14 11:01:31,904 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>>>>>> entries 0 lookups
>>>>>>> 2015-01-14 11:01:31,904 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>>>>>> FSImage in 1216 msecs
>>>>>>> 2015-01-14 11:01:32,427 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>>>>>> bigdata:9000
>>>>>>> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager:
>>>>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>>>>> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting
>>>>>>> Socket Reader #1 for port 9000
>>>>>>> 2015-01-14 11:01:32,568 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>>>>>> FSNamesystemState MBean
>>>>>>> 2015-01-14 11:01:32,588 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>>>> construction: 0
>>>>>>> 2015-01-14 11:01:32,588 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>>>> construction: 0
>>>>>>> 2015-01-14 11:01:32,588 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>>>>>> replication queues
>>>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> STATE* Leaving safe mode after 2 secs
>>>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> STATE* Network topology has 0 racks and 0 datanodes
>>>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> STATE* UnderReplicatedBlocks has 0 blocks
>>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>>>>>> blocks            = 0
>>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>> invalid blocks          = 0
>>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>> under-replicated blocks = 0
>>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>>  over-replicated blocks = 0
>>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>> blocks being written    = 0
>>>>>>> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> STATE* Replication Queue initialization scan for invalid, over- and
>>>>>>> under-replicated blocks completed in 52 msec
>>>>>>> 2015-01-14 11:01:32,676 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>>>>>> bigdata/10.10.10.63:9000
>>>>>>> 2015-01-14 11:01:32,676 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>>>>>> required for active state
>>>>>>> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC
>>>>>>> Server Responder: starting
>>>>>>> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC
>>>>>>> Server listener on 9000: starting
>>>>>>> 2015-01-14 11:01:32,697 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>>>>>> 2015-01-14 11:01:32,697 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Rescanning after 4192060 milliseconds
>>>>>>> 2015-01-14 11:01:32,704 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
>>>>>>> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63,
>>>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>>>> ipcPort=50020,
>>>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>>>>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>>>>>> 2015-01-14 11:01:38,039 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>>>> failed storage changes from 0 to 0
>>>>>>> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology:
>>>>>>> Adding a new node: /default-rack/10.10.10.63:50010
>>>>>>> 2015-01-14 11:01:38,557 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>>>> failed storage changes from 0 to 0
>>>>>>> 2015-01-14 11:01:38,562 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>>>>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>>>>>> 10.10.10.63:50010
>>>>>>> 2015-01-14 11:01:38,692 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>>>>>> processReport: Received first block report from
>>>>>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>>>>>> starting up or becoming active. Its block contents are no longer considered
>>>>>>> stale
>>>>>>> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport:
>>>>>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>>>>>> DatanodeRegistration(10.10.10.63,
>>>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>>>> ipcPort=50020,
>>>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>>>>>> blocks: 0, hasStaleStorages: false, processing time: 9 msecs
>>>>>>> 2015-01-14 11:02:02,697 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Rescanning after 30000 milliseconds
>>>>>>> 2015-01-14 11:02:02,698 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>>>> 2015-01-14 11:02:21,288 ERROR
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
>>>>>>> 2015-01-14 11:02:21,291 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
>>>>>>> /************************************************************
>>>>>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>>>>>> ************************************************************/
>>>>>>> 2015-01-14 11:03:02,845 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>>>>>> /************************************************************
>>>>>>> STARTUP_MSG: Starting NameNode
>>>>>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>>>>>> STARTUP_MSG:   args = []
>>>>>>> STARTUP_MSG:   version = 2.6.0
>>>>>>> STARTUP_MSG:   classpath =
>>>>>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-
2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/
local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/us
r/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>>>>>> STARTUP_MSG:   build =
>>>>>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>>>>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>>>>>> 2014-11-13T21:10Z
>>>>>>> STARTUP_MSG:   java = 1.7.0_45
>>>>>>> ************************************************************/
>>>>>>> 2015-01-14 11:03:02,861 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>>>>>> handlers for [TERM, HUP, INT]
>>>>>>> 2015-01-14 11:03:02,866 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>>>>>> 2015-01-14 11:03:03,521 INFO
>>>>>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>>>>>> hadoop-metrics2.properties
>>>>>>> 2015-01-14 11:03:03,697 INFO
>>>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>>>>>> period at 10 second(s).
>>>>>>> 2015-01-14 11:03:03,697 INFO
>>>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>>>>>> started
>>>>>>> 2015-01-14 11:03:03,700 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>>>>>> hdfs://bigdata:9000
>>>>>>> 2015-01-14 11:03:03,701 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>>>>>> bigdata:9000 to access this namenode/service.
>>>>>>> 2015-01-14 11:03:03,925 WARN
>>>>>>> org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop
>>>>>>> library for your platform... using builtin-java classes where applicable
>>>>>>> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil:
>>>>>>> Starting Web-server for hdfs at: http://0.0.0.0:50070
>>>>>>> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to
>>>>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>>>>> org.mortbay.log.Slf4jLog
>>>>>>> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog:
>>>>>>> Http request log for http.requests.namenode is not defined
>>>>>>> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added global filter 'safety'
>>>>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added filter static_user_filter
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>>> context hdfs
>>>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added filter static_user_filter
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>>> context logs
>>>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added filter static_user_filter
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>>> context static
>>>>>>> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Added filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>>>>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>>>>>> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> addJerseyResourcePackage:
>>>>>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>>>>>> pathSpec=/webhdfs/v1/*
>>>>>>> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2:
>>>>>>> Jetty bound to port 50070
>>>>>>> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
>>>>>>> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$
>>>>>>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>>>>>> 2015-01-14 11:03:05,645 WARN
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>>>>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>>>>>> lack of redundant storage directories!
>>>>>>> 2015-01-14 11:03:05,645 WARN
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>>>>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>>>>>> loss due to lack of redundant storage directories!
>>>>>>> 2015-01-14 11:03:05,746 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>>>>>> 2015-01-14 11:03:05,761 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>>>>>> 2015-01-14 11:03:05,837 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>>>> dfs.block.invalidate.limit=1000
>>>>>>> 2015-01-14 11:03:05,837 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>>>>>> 2015-01-14 11:03:05,841 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>>>>>> 2015-01-14 11:03:05,843 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>>>>>> deletion will start around 2015 Jan 14 11:03:05
>>>>>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing
>>>>>>> capacity for map BlocksMap
>>>>>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>>     = 64-bit
>>>>>>> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max
>>>>>>> memory 889 MB = 17.8 MB
>>>>>>> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>>    = 2^21 = 2097152 entries
>>>>>>> 2015-01-14 11:03:05,864 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> dfs.block.access.token.enable=false
>>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> defaultReplication         = 1
>>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>>>>>             = 512
>>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>>>>>             = 1
>>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> maxReplicationStreams      = 2
>>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> shouldCheckForEnoughRacks  = false
>>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> replicationRecheckInterval = 3000
>>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> encryptDataTransfer        = false
>>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>>> maxNumBlocksToLog          = 1000
>>>>>>> 2015-01-14 11:03:05,874 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>>>>>> hadoop2 (auth:SIMPLE)
>>>>>>> 2015-01-14 11:03:05,874 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>>>>>> supergroup
>>>>>>> 2015-01-14 11:03:05,874 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>>>>>> true
>>>>>>> 2015-01-14 11:03:05,875 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>>>>>> 2015-01-14 11:03:05,878 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing
>>>>>>> capacity for map INodeMap
>>>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>>     = 64-bit
>>>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max
>>>>>>> memory 889 MB = 8.9 MB
>>>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>>    = 2^20 = 1048576 entries
>>>>>>> 2015-01-14 11:03:06,284 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>>>>>> occuring more than 10 times
>>>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing
>>>>>>> capacity for map cachedBlocks
>>>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>>     = 64-bit
>>>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max
>>>>>>> memory 889 MB = 2.2 MB
>>>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>>    = 2^18 = 262144 entries
>>>>>>> 2015-01-14 11:03:06,301 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>>>>>> 2015-01-14 11:03:06,301 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>>> dfs.namenode.safemode.min.datanodes = 0
>>>>>>> 2015-01-14 11:03:06,301 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>>> dfs.namenode.safemode.extension     = 30000
>>>>>>> 2015-01-14 11:03:06,304 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>>>>>> namenode is enabled
>>>>>>> 2015-01-14 11:03:06,304 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>>>>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing
>>>>>>> capacity for map NameNodeRetryCache
>>>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>>     = 64-bit
>>>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet:
>>>>>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>>    = 2^15 = 32768 entries
>>>>>>> 2015-01-14 11:03:06,317 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>>>>>> 2015-01-14 11:03:06,318 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>>>>>> 2015-01-14 11:03:06,318 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>>>>>> 16384
>>>>>>> 2015-01-14 11:03:06,368 INFO
>>>>>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>>>>>> 13312@bigdata
>>>>>>> 2015-01-14 11:03:06,532 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>>>>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>>>>>> 2015-01-14 11:03:06,622 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>>>>>> file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095
>>>>>>> ->
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>>>> 2015-01-14 11:03:06,807 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>>>>>> INodes.
>>>>>>> 2015-01-14 11:03:06,888 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>>>>>> FSImage in 0 seconds.
>>>>>>> 2015-01-14 11:03:06,888 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>>>>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>>>>>> 2015-01-14 11:03:06,889 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>>>>>> expecting start txid #84
>>>>>>> 2015-01-14 11:03:06,889 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>>>> 2015-01-14 11:03:06,893 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:03:06,897 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:03:06,897 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>>>>>> expecting start txid #86
>>>>>>> 2015-01-14 11:03:06,898 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>>>> 2015-01-14 11:03:06,898 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:03:06,898 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:03:06,899 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>>>>>> expecting start txid #88
>>>>>>> 2015-01-14 11:03:06,899 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>>>> 2015-01-14 11:03:06,899 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:03:06,899 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:03:06,900 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>>>>>> expecting start txid #90
>>>>>>> 2015-01-14 11:03:06,900 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>>>> 2015-01-14 11:03:06,900 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:03:06,901 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:03:06,901 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>>>>>> expecting start txid #92
>>>>>>> 2015-01-14 11:03:06,901 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>>>> 2015-01-14 11:03:06,901 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:03:06,902 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>>> 2015-01-14 11:03:06,902 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b
>>>>>>> expecting start txid #94
>>>>>>> 2015-01-14 11:03:06,902 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>>> 2015-01-14 11:03:06,902 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:03:06,907 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>>>> 2015-01-14 11:03:06,908 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2
>>>>>>> expecting start txid #95
>>>>>>> 2015-01-14 11:03:06,908 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>>>> 2015-01-14 11:03:06,908 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>>> stream
>>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095'
>>>>>>> to transaction ID 84
>>>>>>> 2015-01-14 11:03:07,266 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>>>> 2015-01-14 11:03:07,274 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>>>>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>>>>>> 2015-01-14 11:03:07,313 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
>>>>>>> 2015-01-14 11:03:07,558 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>>>>>> entries 0 lookups
>>>>>>> 2015-01-14 11:03:07,559 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>>>>>> FSImage in 1240 msecs
>>>>>>> 2015-01-14 11:03:08,011 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>>>>>> bigdata:9000
>>>>>>> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager:
>>>>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>>>>> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting
>>>>>>> Socket Reader #1 for port 9000
>>>>>>> 2015-01-14 11:03:08,151 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>>>>>> FSNamesystemState MBean
>>>>>>> 2015-01-14 11:03:08,173 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>>>> construction: 0
>>>>>>> 2015-01-14 11:03:08,173 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>>>> construction: 0
>>>>>>> 2015-01-14 11:03:08,173 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>>>>>> replication queues
>>>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> STATE* Leaving safe mode after 2 secs
>>>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> STATE* Network topology has 0 racks and 0 datanodes
>>>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> STATE* UnderReplicatedBlocks has 0 blocks
>>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>>>>>> blocks            = 0
>>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>> invalid blocks          = 0
>>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>> under-replicated blocks = 0
>>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>>  over-replicated blocks = 0
>>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>> blocks being written    = 0
>>>>>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> STATE* Replication Queue initialization scan for invalid, over- and
>>>>>>> under-replicated blocks completed in 18 msec
>>>>>>> 2015-01-14 11:03:08,322 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>>>>>> bigdata/10.10.10.63:9000
>>>>>>> 2015-01-14 11:03:08,322 INFO
>>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>>>>>> required for active state
>>>>>>> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC
>>>>>>> Server Responder: starting
>>>>>>> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC
>>>>>>> Server listener on 9000: starting
>>>>>>> 2015-01-14 11:03:08,349 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>>>>>> 2015-01-14 11:03:08,349 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Rescanning after 4287712 milliseconds
>>>>>>> 2015-01-14 11:03:08,350 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>>>> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>>> BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63,
>>>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>>>> ipcPort=50020,
>>>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>>>>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>>>>>> 2015-01-14 11:03:13,244 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>>>> failed storage changes from 0 to 0
>>>>>>> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology:
>>>>>>> Adding a new node: /default-rack/10.10.10.63:50010
>>>>>>> 2015-01-14 11:03:13,743 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>>>> failed storage changes from 0 to 0
>>>>>>> 2015-01-14 11:03:13,750 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>>>>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>>>>>> 10.10.10.63:50010
>>>>>>> 2015-01-14 11:03:13,959 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>>>>>> processReport: Received first block report from
>>>>>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>>>>>> starting up or becoming active. Its block contents are no longer considered
>>>>>>> stale
>>>>>>> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport:
>>>>>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>>>>>> DatanodeRegistration(10.10.10.63,
>>>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>>>> ipcPort=50020,
>>>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>>>>>> blocks: 0, hasStaleStorages: false, processing time: 11 msecs
>>>>>>> 2015-01-14 11:03:38,349 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Rescanning after 30000 milliseconds
>>>>>>> 2015-01-14 11:03:38,350 INFO
>>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>>>> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
>>>>>>>
>>>>>>>
>>>>>>>  Thanks
>>>>>>> Mahesh.S
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>>  Hi Mahesh,
>>>>>>>>      We will need the NameNode logs to debug this further. Can you
>>>>>>>> restart the NameNode and paste its logs somewhere for us to analyze?
>>>>>>>> Thanks.
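>>>>>>>>
>>>>>>>> A minimal sketch of that restart-and-collect step (non-authoritative;
>>>>>>>> the sbin path and log location are assumptions for a tarball install
>>>>>>>> under /usr/local/hadoop, so adjust to your layout):
>>>>>>>>
>>>>>>>>   # stop and start only the NameNode daemon
>>>>>>>>   $HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
>>>>>>>>   $HADOOP_HOME/sbin/hadoop-daemon.sh start namenode
>>>>>>>>   # the NameNode log usually lands under $HADOOP_HOME/logs
>>>>>>>>   tail -n 200 $HADOOP_HOME/logs/hadoop-*-namenode-*.log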
>>>>>>>>
>>>>>>>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <
>>>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hi Ramesh,
>>>>>>>>>
>>>>>>>>>                    I didn't see any exception in the HDFS logs. My
>>>>>>>>> problem is that the agent for HDFS is not created.
>>>>>>>>>
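>>>>>>>>> One rough way to confirm whether the plugin ever initialized (the
>>>>>>>>> paths and grep patterns below are assumptions for the tarball layout
>>>>>>>>> used in the steps above) is to check that the Ranger jars are on the
>>>>>>>>> HDFS classpath and look for Ranger/xasecure entries in the NameNode
>>>>>>>>> log:
>>>>>>>>>
>>>>>>>>>   # verify the plugin jars were copied into the HDFS lib directory
>>>>>>>>>   ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -i ranger
>>>>>>>>>   # look for any agent/plugin activity in the NameNode log
>>>>>>>>>   grep -iE 'ranger|xasecure' $HADOOP_HOME/logs/hadoop-*-namenode-*.log | head
>>>>>>>>>
>>>>>>>>> If the grep returns nothing, the agent never started inside the
>>>>>>>>> NameNode, which would explain why nothing shows up under Audit ->
>>>>>>>>> Agents.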
>>>>>>>>>  Regards,
>>>>>>>>> Mahesh.S
>>>>>>>>>
>>>>>>>>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <
>>>>>>>>> rmani@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hi Mahesh,
>>>>>>>>>>
>>>>>>>>>>  The error you are seeing is just a notice that the parent
>>>>>>>>>> folder of the resource you are creating doesn’t have read permission for
>>>>>>>>>> the user for whom you are creating the policy.
>>>>>>>>>>
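>>>>>>>>>>  For example (the path /data/testdir and user mahesh are
>>>>>>>>>> hypothetical), if the policy resource is /data/testdir, you could
>>>>>>>>>> inspect the parent folder and, if appropriate, grant read access
>>>>>>>>>> before saving the policy:
>>>>>>>>>>
>>>>>>>>>>   hdfs dfs -ls -d /data        # inspect current permissions on the parent
>>>>>>>>>>   hdfs dfs -chmod o+rx /data   # one option; adjust to your security model
>>>>>>>>>>
>>>>>>>>>>  Since it is only a notice, you can also confirm the prompt and save
>>>>>>>>>> the policy as-is.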
>>>>>>>>>>  When you start the HDFS NameNode and Secondary NameNode, do you see
>>>>>>>>>> any exceptions in the HDFS logs?
>>>>>>>>>>
>>>>>>>>>>  Regards,
>>>>>>>>>> Ramesh
>>>>>>>>>>
>>>>>>>>>>   On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <
>>>>>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>   Hi all,
>>>>>>>>>>
>>>>>>>>>>  I successfully configured ranger admin,user sync.now am trying
>>>>>>>>>> to configure hdfs plugin.My steps are following,
>>>>>>>>>>
>>>>>>>>>>  1.Created repository testhdfs.
>>>>>>>>>> 2.cd /usr/local
>>>>>>>>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>>>>>>>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>>>>>>>>> 5.cd ranger-hdfs-plugin
>>>>>>>>>> 6.vi install.properties
>>>>>>>>>>  POLICY_MGR_URL=http://IP:6080
>>>>>>>>>>           REPOSITORY_NAME=testhdfs
>>>>>>>>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>>>>>>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>>>>>>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>>>>>>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>>>>>>>>> 7.cd /usr/local/hadoop
>>>>>>>>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>>>>>>>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>>>>>>>>> 10.cd /usr/local/ranger-hdfs-plugin
>>>>>>>>>> 11../enable-hdfs-plugin.sh
>>>>>>>>>> 12.cp /usr/local/hadoop/lib/*
>>>>>>>>>> /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>>>>>>>> 13.vi xasecure-audit.xml
>>>>>>>>>>  <property>
>>>>>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>>>>>>>>                    <value>jdbc:mysql://localhost/ranger</value>
>>>>>>>>>>                    </property>
>>>>>>>>>>                    <property>
>>>>>>>>>>
>>>>>>>>>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>>>>>>>>                    <value>rangerlogger</value>
>>>>>>>>>>                    </property>
>>>>>>>>>>                    <property>
>>>>>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>>>>>>>>                    <value>rangerlogger</value>
>>>>>>>>>>                    </property>
>>>>>>>>>> 14.Restarted hadoop
>>>>>>>>>> when i see Ranger Admin Web interface -> Audit -> Agents
>>>>>>>>>> agent is not created.Am i missed any steps.
>>>>>>>>>>
>>>>>>>>>>  *NOTE:I am not using HDP.*
>>>>>>>>>>
>>>>>>>>>>  *here is my xa_portal.log*
>>>>>>>>>>
>>>>>>>>>>  2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_default.properties]
>>>>>>>>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_system.properties]
>>>>>>>>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_custom.properties]
>>>>>>>>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>>>> builtin-java classes where applicable
>>>>>>>>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>>>>>>>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>>> user
>>>>>>>>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>>>>>>>>>> requestId=10.10.10.53
>>>>>>>>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>>>>>>>>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>>>>>>>>>> create:nfsnobody@bigdata
>>>>>>>>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_default.properties]
>>>>>>>>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_system.properties]
>>>>>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_custom.properties]
>>>>>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>>>> builtin-java classes where applicable
>>>>>>>>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>>>>>>>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>>> user
>>>>>>>>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>>>>>>>>>> requestId=10.10.10.53
>>>>>>>>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>>>>>>>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>>> user
>>>>>>>>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>>>>>>>>>> requestId=10.10.10.53
>>>>>>>>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>>>>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>>>>>> Login: security not enabled, using username
>>>>>>>>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>>>>>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>>>>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>>>>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>>>>>> Login: security not enabled, using username
>>>>>>>>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>>>>>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>>>>>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>>>> at
>>>>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>>>> at
>>>>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>>>> at
>>>>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>>>> at
>>>>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>>>>>> Validation error:logMessage=null,
>>>>>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>>>>>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>>>>>> to save this policy?}
>>>>>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>>>>>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>>>>>> }
>>>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>>>> at
>>>>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>>>> at
>>>>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>>>> at
>>>>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>>>> at
>>>>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>>>>>>>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>>> user
>>>>>>>>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>>>>>>>>>> requestId=10.10.10.53
>>>>>>>>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_default.properties]
>>>>>>>>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_system.properties]
>>>>>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_custom.properties]
>>>>>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>>>> builtin-java classes where applicable
>>>>>>>>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>>>>>>>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>>> user
>>>>>>>>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>>>>>>>>>> requestId=10.10.10.53
>>>>>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>>>>>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>>>>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>>>>>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>>>>>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>>>> at
>>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>>>> at
>>>>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>>>> at
>>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>>>> at
>>>>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>>>> at
>>>>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>>>> at
>>>>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>>>> at
>>>>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>>>> at
>>>>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>>>>>> Validation error:logMessage=null,
>>>>>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
>>>>>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>>>>>> to save this policy?}
>>>>>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
>>>>>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>>>>>> }
>>>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>>>> [stack trace identical to the exception logged above]
>>>>>>>>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
>>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>>> Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>>>>>>>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>>> user
>>>>>>>>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
>>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>>> loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
>>>>>>>>>> requestId=10.10.10.53
>>>>>>>>>>
>>>>>>>>>>  Thanks
>>>>>>>>>> Mahesh.S
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>  --
>>>>>>>> Regards,
>>>>>>>> Gautam.
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Regards,
>>>> Gautam.
>>>>
>>>
>>>
>>
>>
>> --
>> Regards,
>> Gautam.
>>
>

Re: Hdfs agent not created

Posted by Ramesh Mani <rm...@hortonworks.com>.
Mahesh,

/usr/local/hadoop/lib/ is the path where it is checking for the ranger*.jar files, I believe.

Just to confirm, can you post the output of ps -ef | grep namenode?

If that is the case, can you change the classpath to point at the directory where you have the ranger*.jar files and restart the namenode?
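
For example, something along these lines (only a sketch, assuming the /usr/local/hadoop layout used earlier in this thread):

  # show the full namenode command line
  $ ps -ef | grep -i namenode | grep -v grep
  # pull out just the javaagent options, if any
  $ ps -ef | grep -i namenode | grep -o 'javaagent:[^ ]*'
  # confirm the plugin jar exists where the agent option points
  $ ls -l /usr/local/hadoop/lib/ranger-*.jar

If no -javaagent option shows up in the process, the agent was never passed to the JVM and the plugin will not register under Audit -> Agents.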

Regards,
Ramesh

On Jan 14, 2015, at 3:28 AM, Mahesh Sankaran <sa...@gmail.com> wrote:

> Hi Gautam,
>                 I debugged the set-hdfs-plugin-env.sh script and it returns the following:
> -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent
> -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent
> 
> Thanks
> Mahesh.S
> 
> 
> On Wed, Jan 14, 2015 at 4:54 PM, Gautam Borad <gb...@gmail.com> wrote:
> It is not guaranteed that the values will be preserved in your current bash session. Please try to put an echo statement in the set-hdfs-plugin-env.sh script to debug.
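> For example (only a sketch): source the script rather than executing it, so that the exported values survive in the current shell:
> 
> $ . ./set-hdfs-plugin-env.sh   # "." runs the script in the current shell
> $ echo "NN opts:  ${HADOOP_NAMENODE_OPTS}"
> $ echo "SNN opts: ${HADOOP_SECONDARYNAMENODE_OPTS}"
> 
> Running it as ./set-hdfs-plugin-env.sh starts a child shell, and anything it exports is lost when that child exits.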
> 
> 
> On Wed, Jan 14, 2015 at 4:35 PM, Mahesh Sankaran <sa...@gmail.com> wrote:
> Hi Gautam and Hanish,
> 
>                     Thank you for the quick reply. The echo statements for HADOOP_NAMENODE_OPTS and
>  HADOOP_SECONDARYNAMENODE_OPTS did not return any values.
> 
> [root@bigdata conf]# echo $HADOOP_SECONDARYNAMENODE_OPTS
> 
> [root@bigdata conf]# echo $HADOOP_NAMENODE_OPTS
> 
> [root@bigdata conf]# 
> 
> 
> Thanks
> Mahesh.S
> 
> On Wed, Jan 14, 2015 at 4:15 PM, Gautam Borad <gb...@gmail.com> wrote:
> @Hanish/Ramesh, if we check the logs carefully, we can see that the Ranger libs are getting loaded into the classpath:
> 
> /usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar
> 
> @Mahesh, I suspect some other problem. Can you put echo statements in the set-hdfs-plugin-env.sh script and debug it? Ideally, after the script is executed, HADOOP_NAMENODE_OPTS and HADOOP_SECONDARYNAMENODE_OPTS should contain the -javaagent line; see the example just below.
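> 
> For instance (illustrative only; I am assuming the variables are set near the end of the script, which may differ in your copy):
> 
> # temporary debug lines at the end of set-hdfs-plugin-env.sh
> echo "DEBUG NN opts:  ${HADOOP_NAMENODE_OPTS}"
> echo "DEBUG SNN opts: ${HADOOP_SECONDARYNAMENODE_OPTS}"
> 
> Both should print a -javaagent:.../ranger-hdfs-plugin-0.4.0.jar=authagent entry when the script runs during namenode startup.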
> 
> 
> 
> On Wed, Jan 14, 2015 at 3:46 PM, Hanish Bansal <ha...@impetus.co.in> wrote:
> Hi Mahesh,
> 
> Could you try one thing: copy all the jar files from ${hadoop_home}/lib to the Hadoop share directory.
> 
> 
> $ cp <hadoop-home>/lib/* <hadoop-home>/share/hadoop/hdfs/lib/
> 
> The issue may be that Hadoop is not able to pick up the Ranger jars from the lib directory.
> 
> After copying the jars, restart Hadoop and check whether the agent has started (see the sketch below).
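> 
> Something like this, using the /usr/local/hadoop paths from earlier in the thread (adjust if your layout differs):
> 
> # copy the jars, then bounce HDFS so the namenode reloads its classpath
> $ cp /usr/local/hadoop/lib/*.jar /usr/local/hadoop/share/hadoop/hdfs/lib/
> $ /usr/local/hadoop/sbin/stop-dfs.sh
> $ /usr/local/hadoop/sbin/start-dfs.sh
> # then check Ranger Admin -> Audit -> Agents for the new agent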
> 
> 
> -------
> Thanks & Regards,
> Hanish Bansal
> Software Engineer, iLabs
> Impetus Infotech Pvt. Ltd.
> 
> From: Mahesh Sankaran <sa...@gmail.com>
> Sent: Wednesday, January 14, 2015 3:33 PM
> To: user@ranger.incubator.apache.org
> Subject: Re: Hdfs agent not created
>  
> Hi Ramesh,
>                The ranger*.jar files are added to the classpath; I can see them in the hadoop/lib directory. Can you tell me the meaning of the following error?
> 
> 2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
> 
> thanks
> 
> Mahesh.S
> 
> 
> On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com> wrote:
> Hi Mahesh,
> 
> This exception is related to the datanode not coming up for some reason, but the Ranger plugin lives in the namenode.
> 
> Do you see the namenode and secondarynamenode running after installing Ranger and restarting the namenode and secondarynamenode?
> 
> In the classpath of the namenode I don’t see any ranger*.jar. Do you have it in the hadoop/lib directory?
> 
> Also, can I get the details of xasecure-hdfs-security.xml from the conf directory?
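> 
> For example (paths assumed from the steps earlier in this thread):
> 
> $ ls -l /usr/local/hadoop/lib/ranger-*.jar
> $ cat /usr/local/hadoop/conf/xasecure-hdfs-security.xml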
> 
> Regards,
> Ramesh
> 
> On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <sa...@gmail.com> wrote:
> 
>> Hi Gautam,
>> 
>>                 Now I am seeing the following exception. Could this be causing the problem?
>> 
>> 2015-01-14 11:41:23,102 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
>> java.io.EOFException: End of File Exception between local host is: "bigdata/10.10.10.63"; destination host is: "bigdata":9000; : java.io.EOFException; For more details see:  http://wiki.apache.org/hadoop/EOFException
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>> at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
>> at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
>> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
>> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
>> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
>> at java.lang.Thread.run(Thread.java:744)
>> Caused by: java.io.EOFException
>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2015-01-14 11:41:25,981 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
>> 2015-01-14 11:41:25,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
>> ************************************************************/
>> 2015-01-14 11:42:03,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
>> /************************************************************
>> 
>> Thanks
>> Mahesh.S
>> 
>> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
>> Hi Gautam,
>> 
>>               Here is my namenode log. Kindly see it.
>> 
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>> ************************************************************/
>> 2015-01-14 11:01:27,345 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG: 
>> /************************************************************
>> STARTUP_MSG: Starting NameNode
>> STARTUP_MSG:   host = bigdata/10.10.10.63
>> STARTUP_MSG:   args = []
>> STARTUP_MSG:   version = 2.6.0
>> STARTUP_MSG:   classpath = /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/
common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zoo
keeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-c
ore-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
>> STARTUP_MSG:   java = 1.7.0_45
>> ************************************************************/
>> 2015-01-14 11:01:27,363 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
>> 2015-01-14 11:01:27,368 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>> 2015-01-14 11:01:28,029 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
>> 2015-01-14 11:01:28,205 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
>> 2015-01-14 11:01:28,205 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
>> 2015-01-14 11:01:28,209 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://bigdata:9000
>> 2015-01-14 11:01:28,209 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use bigdata:9000 to access this namenode/service.
>> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
>> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
>> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
>> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
>> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
>> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
>> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
>> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>> 2015-01-14 11:01:30,093 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
>> 2015-01-14 11:01:30,093 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
>> 2015-01-14 11:01:30,184 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>> 2015-01-14 11:01:30,196 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>> 2015-01-14 11:01:30,262 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
>> 2015-01-14 11:01:30,262 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
>> 2015-01-14 11:01:30,266 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>> 2015-01-14 11:01:30,268 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2015 Jan 14 11:01:30
>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity      = 2^21 = 2097152 entries
>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
>> 2015-01-14 11:01:30,298 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>> 2015-01-14 11:01:30,302 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity      = 2^20 = 1048576 entries
>> 2015-01-14 11:01:30,648 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity      = 2^18 = 262144 entries
>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
>> 2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
>> 2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
>> 2015-01-14 11:01:30,729 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename 11417@bigdata
>> 2015-01-14 11:01:30,963 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>> 2015-01-14 11:01:31,065 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094 -> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>> 2015-01-14 11:01:31,210 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2 INodes.
>> 2015-01-14 11:01:31,293 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
>> 2015-01-14 11:01:31,293 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83 from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>> 2015-01-14 11:01:31,294 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5 expecting start txid #84
>> 2015-01-14 11:01:31,294 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>> 2015-01-14 11:01:31,299 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085' to transaction ID 84
>> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972 expecting start txid #86
>> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087' to transaction ID 84
>> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b expecting start txid #88
>> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089' to transaction ID 84
>> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe expecting start txid #90
>> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091' to transaction ID 84
>> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09 expecting start txid #92
>> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093' to transaction ID 84
>> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b expecting start txid #94
>> 2015-01-14 11:01:31,308 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>> 2015-01-14 11:01:31,308 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094' to transaction ID 84
>> 2015-01-14 11:01:31,313 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094 of size 1048576 edits # 1 loaded in 0 seconds
>> 2015-01-14 11:01:31,317 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>> 2015-01-14 11:01:31,346 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
>> 2015-01-14 11:01:31,904 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
>> 2015-01-14 11:01:31,904 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 1216 msecs
>> 2015-01-14 11:01:32,427 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to bigdata:9000
>> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
>> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
>> 2015-01-14 11:01:32,568 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState MBean
>> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
>> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
>> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing replication queues
>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 2 secs
>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
>> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
>> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 52 msec
>> 2015-01-14 11:01:32,676 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: bigdata/10.10.10.63:9000
>> 2015-01-14 11:01:32,676 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
>> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
>> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
>> 2015-01-14 11:01:32,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
>> 2015-01-14 11:01:32,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 4192060 milliseconds
>> 2015-01-14 11:01:32,704 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
>> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0) storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>> 2015-01-14 11:01:38,039 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
>> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/10.10.10.63:50010
>> 2015-01-14 11:01:38,557 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
>> 2015-01-14 11:01:38,562 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN 10.10.10.63:50010
>> 2015-01-14 11:01:38,692 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
>> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 9 msecs
>> 2015-01-14 11:02:02,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
>> 2015-01-14 11:02:02,698 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>> 2015-01-14 11:02:21,288 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
>> 2015-01-14 11:02:21,291 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG: 
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>> ************************************************************/
>> 2015-01-14 11:03:02,845 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG: 
>> /************************************************************
>> STARTUP_MSG: Starting NameNode
>> STARTUP_MSG:   host = bigdata/10.10.10.63
>> STARTUP_MSG:   args = []
>> STARTUP_MSG:   version = 2.6.0
>> STARTUP_MSG:   classpath = /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
>> STARTUP_MSG:   java = 1.7.0_45
>> ************************************************************/
>> 2015-01-14 11:03:02,861 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
>> 2015-01-14 11:03:02,866 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>> 2015-01-14 11:03:03,521 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
>> 2015-01-14 11:03:03,697 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
>> 2015-01-14 11:03:03,697 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
>> 2015-01-14 11:03:03,700 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://bigdata:9000
>> 2015-01-14 11:03:03,701 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use bigdata:9000 to access this namenode/service.
>> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
>> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
>> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
>> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
>> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
>> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
>> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
>> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>> 2015-01-14 11:03:05,645 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
>> 2015-01-14 11:03:05,645 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
>> 2015-01-14 11:03:05,746 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>> 2015-01-14 11:03:05,761 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>> 2015-01-14 11:03:05,837 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
>> 2015-01-14 11:03:05,837 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
>> 2015-01-14 11:03:05,841 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>> 2015-01-14 11:03:05,843 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2015 Jan 14 11:03:05
>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
>> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity      = 2^21 = 2097152 entries
>> 2015-01-14 11:03:05,864 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
>> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
>> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
>> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
>> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
>> 2015-01-14 11:03:05,875 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>> 2015-01-14 11:03:05,878 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity      = 2^20 = 1048576 entries
>> 2015-01-14 11:03:06,284 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity      = 2^18 = 262144 entries
>> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
>> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
>> 2015-01-14 11:03:06,304 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
>> 2015-01-14 11:03:06,304 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
>> 2015-01-14 11:03:06,317 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>> 2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>> 2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
>> 2015-01-14 11:03:06,368 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename 13312@bigdata
>> 2015-01-14 11:03:06,532 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>> 2015-01-14 11:03:06,622 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095 -> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>> 2015-01-14 11:03:06,807 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2 INodes.
>> 2015-01-14 11:03:06,888 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
>> 2015-01-14 11:03:06,888 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83 from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>> 2015-01-14 11:03:06,889 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972 expecting start txid #84
>> 2015-01-14 11:03:06,889 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>> 2015-01-14 11:03:06,893 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085' to transaction ID 84
>> 2015-01-14 11:03:06,897 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,897 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b expecting start txid #86
>> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087' to transaction ID 84
>> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe expecting start txid #88
>> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089' to transaction ID 84
>> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09 expecting start txid #90
>> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091' to transaction ID 84
>> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b expecting start txid #92
>> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093' to transaction ID 84
>> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093 of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b expecting start txid #94
>> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094' to transaction ID 84
>> 2015-01-14 11:03:06,907 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094 of size 1048576 edits # 1 loaded in 0 seconds
>> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2 expecting start txid #95
>> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095' to transaction ID 84
>> 2015-01-14 11:03:07,266 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095 of size 1048576 edits # 1 loaded in 0 seconds
>> 2015-01-14 11:03:07,274 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>> 2015-01-14 11:03:07,313 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
>> 2015-01-14 11:03:07,558 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
>> 2015-01-14 11:03:07,559 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 1240 msecs
>> 2015-01-14 11:03:08,011 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to bigdata:9000
>> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
>> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
>> 2015-01-14 11:03:08,151 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState MBean
>> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
>> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
>> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing replication queues
>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 2 secs
>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 18 msec
>> 2015-01-14 11:03:08,322 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: bigdata/10.10.10.63:9000
>> 2015-01-14 11:03:08,322 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
>> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
>> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
>> 2015-01-14 11:03:08,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
>> 2015-01-14 11:03:08,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 4287712 milliseconds
>> 2015-01-14 11:03:08,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0) storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>> 2015-01-14 11:03:13,244 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
>> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/10.10.10.63:50010
>> 2015-01-14 11:03:13,743 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
>> 2015-01-14 11:03:13,750 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN 10.10.10.63:50010
>> 2015-01-14 11:03:13,959 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
>> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 11 msecs
>> 2015-01-14 11:03:38,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
>> 2015-01-14 11:03:38,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
>> 
>> 
>> Thanks
>> Mahesh.S
>> 
>> 
>> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com> wrote:
>> Hi Mahesh,
>>     We will need the NameNode logs to debug this further. Can you restart the NameNode and paste its logs somewhere for us to analyze? Thanks.
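>> For example (a minimal sketch, assuming a non-HA setup and the default $HADOOP_HOME/logs location):
>> 
>>   $HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
>>   $HADOOP_HOME/sbin/hadoop-daemon.sh start namenode
>>   tail -n 200 $HADOOP_HOME/logs/hadoop-*-namenode-*.log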
>> 
>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
>> Hi Ramesh,
>> 
>> I didn't see any exception in the HDFS logs. My problem is that the agent for HDFS is not created.
>> 
>> Regards,
>> Mahesh.S
>> 
>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com> wrote:
>> Hi Mahesh,
>> 
>> The error you are seeing is just a notice that the parent folder of the resource you are creating doesn't have read permission for the user for whom you are creating the policy.
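>> For example, if the resource in the policy were /data/reports (a hypothetical path), you could check the user's read access on the parent with:
>> 
>>   hdfs dfs -ls /data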
>> 
>> When you start the HDFS NameNode and SecondaryNameNode, do you see any exception in the HDFS logs?
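>> For example (a minimal check, assuming the default log directory):
>> 
>>   grep -iE 'exception|error' $HADOOP_HOME/logs/hadoop-*-namenode-*.log | tail -n 50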
>> 
>> Regards,
>> Ramesh
>> 
>> On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
>> 
>>> Hi all,
>>> 
>>> I successfully configured Ranger Admin and user sync. Now I am trying to configure the HDFS plugin. My steps are the following:
>>> 
>>> 1.Created repository testhdfs.
>>> 2.cd /usr/local
>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>> 5.cd ranger-hdfs-plugin
>>> 6.vi install.properties
>>>   POLICY_MGR_URL=http://IP:6080
>>>   REPOSITORY_NAME=testhdfs
>>>   XAAUDIT.DB.HOSTNAME=localhost
>>>   XAAUDIT.DB.DATABASE_NAME=ranger
>>>   XAAUDIT.DB.USER_NAME=rangerlogger
>>>   XAAUDIT.DB.PASSWORD=rangerlogger
>>> 7.cd /usr/local/hadoop
>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>> 10.cd /usr/local/ranger-hdfs-plugin
>>> 11. ./enable-hdfs-plugin.sh
>>> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>>> 13.vi xasecure-audit.xml
>>>  <property>
>>>    <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>    <value>jdbc:mysql://localhost/ranger</value>
>>>  </property>
>>>  <property>
>>>    <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>    <value>rangerlogger</value>
>>>  </property>
>>>  <property>
>>>    <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>    <value>rangerlogger</value>
>>>  </property>
>>> 14.Restarted hadoop
>>> When I look at the Ranger Admin web interface -> Audit -> Agents,
>>> the agent is not created. Did I miss any steps?
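>>> As a quick sanity check (a sketch, assuming the paths from the steps above), the plugin files can be verified with:
>>> 
>>>   # xasecure-*.xml config files installed by enable-hdfs-plugin.sh
>>>   ls /usr/local/hadoop/conf/xasecure-*.xml
>>>   # Ranger plugin jars that must be on the NameNode classpath
>>>   ls /usr/local/hadoop/share/hadoop/hdfs/lib/ranger-*.jar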
>>> 
>>> NOTE: I am not using HDP.
>>> 
>>> Here is my xa_portal.log:
>>> 
>>> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12, requestId=10.10.10.53
>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO  org.apache.ranger.rest.UserREST (UserREST.java:186) - create:nfsnobody@bigdata
>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0, requestId=10.10.10.53
>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD, requestId=10.10.10.53
>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
>>> javax.ws.rs.WebApplicationException
>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>> at java.lang.Thread.run(Thread.java:744)
>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
>>> javax.ws.rs.WebApplicationException
>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>> at java.lang.Thread.run(Thread.java:744)
>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655, requestId=10.10.10.53
>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A, requestId=10.10.10.53
>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
>>> javax.ws.rs.WebApplicationException
>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>> at java.lang.Thread.run(Thread.java:744)
>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
>>> javax.ws.rs.WebApplicationException
>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>> at java.lang.Thread.run(Thread.java:744)
>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264, requestId=10.10.10.53
>>> 
>>> Thanks
>>> Mahesh.S

Re: Hdfs agent not created

Posted by Mahesh Sankaran <sa...@gmail.com>.
Hi Gautam,
                I debugged the set-hdfs-plugin-env.sh script; it returns the
following:
-javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent
-javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent
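
(For reference, a minimal sketch of how to check this in the current shell;
the script path below is a placeholder, and the variable names are the ones
used elsewhere in this thread:)

  . /path/to/set-hdfs-plugin-env.sh        # source it, don't execute it
  echo "${HADOOP_NAMENODE_OPTS}"           # should show the -javaagent entry
  echo "${HADOOP_SECONDARYNAMENODE_OPTS}"  # should show it as well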

Thanks
Mahesh.S


On Wed, Jan 14, 2015 at 4:54 PM, Gautam Borad <gb...@gmail.com> wrote:

> It is not guaranteed that the values will be preserved in your current
> bash session. Please try to put an echo statement in the
> set-hdfs-plugin-env.sh script to debug.
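>
> For example (just a sketch; the variable names are the ones used in this
> thread), at the end of set-hdfs-plugin-env.sh:
>
>   echo "HADOOP_NAMENODE_OPTS=${HADOOP_NAMENODE_OPTS}" >&2
>   echo "HADOOP_SECONDARYNAMENODE_OPTS=${HADOOP_SECONDARYNAMENODE_OPTS}" >&2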
>
>
> On Wed, Jan 14, 2015 at 4:35 PM, Mahesh Sankaran <sankarmahesh37@gmail.com
> > wrote:
>
>> Hi Gautam and Hanish,
>>
>>                     Thank you for the quick reply. The echo statements of
>> *HADOOP_NAMENODE_OPTS* and
>> *HADOOP_SECONDARYNAMENODE_OPTS* did not return any values.
>>
>> [root@bigdata conf]# echo $HADOOP_SECONDARYNAMENODE_OPTS
>>
>> [root@bigdata conf]# echo $HADOOP_NAMENODE_OPTS
>>
>> [root@bigdata conf]#
>>
>>
>> Thanks
>> Mahesh.S
>>
>> On Wed, Jan 14, 2015 at 4:15 PM, Gautam Borad <gb...@gmail.com> wrote:
>>
>>> @Hanish/Ramesh, If we check the logs carefully, we see that the ranger libs
>>> are getting loaded in the classpath:
>>>
>>> /usr/local/hadoop/
>>>>
>>>> share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar
>>>>
>>>
>>> @Mahesh, I suspect some other problem. Can you put echo statements in the
>>> set-hdfs-plugin-env.sh script and debug it? Ideally, after the script is
>>> executed, *HADOOP_NAMENODE_OPTS* and *HADOOP_SECONDARYNAMENODE_OPTS*
>>> should contain the -javaagent line.
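>>>
>>> Roughly like this (a sketch, not the exact output; the jar path is taken
>>> from the -javaagent lines quoted earlier in this thread):
>>>
>>>   export HADOOP_NAMENODE_OPTS="${HADOOP_NAMENODE_OPTS} -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent"
>>>   export HADOOP_SECONDARYNAMENODE_OPTS="${HADOOP_SECONDARYNAMENODE_OPTS} -javaagent:/usr/local/hadoop/lib/ranger-hdfs-plugin-0.4.0.jar=authagent"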
>>>
>>>
>>>
>>> On Wed, Jan 14, 2015 at 3:46 PM, Hanish Bansal <
>>> hanish.bansal@impetus.co.in> wrote:
>>>
>>>> Hi Mahesh,
>>>>
>>>>
>>>> Could you try one thing: copy all the jar files from
>>>> ${hadoop_home}/lib to the hadoop share directory.
>>>>
>>>>
>>>>  $ cp <hadoop-home>/lib/* <hadoop-home>/share/hadoop/hdfs/lib/
>>>>
>>>>
>>>> The issue may be that hadoop is not able to pick up the ranger jars from
>>>> the lib directory.
>>>>
>>>>
>>>> After copying the jars, restart hadoop and check whether the agent has
>>>> started.
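>>>>
>>>> Something along these lines (a sketch; the standard Hadoop sbin scripts
>>>> are assumed, with <hadoop-home> as above):
>>>>
>>>>  $ ls <hadoop-home>/share/hadoop/hdfs/lib/ranger-*.jar   # confirm the copy
>>>>  $ <hadoop-home>/sbin/stop-dfs.sh
>>>>  $ <hadoop-home>/sbin/start-dfs.sh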
>>>>
>>>>
>>>>
>>>>      -------
>>>>
>>>> *Thanks & Regards, Hanish Bansal*
>>>> Software Engineer, iLabs
>>>> Impetus Infotech Pvt. Ltd.
>>>>
>>>>      ------------------------------
>>>> *From:* Mahesh Sankaran <sa...@gmail.com>
>>>> *Sent:* Wednesday, January 14, 2015 3:33 PM
>>>> *To:* user@ranger.incubator.apache.org
>>>> *Subject:* Re: Hdfs agent not created
>>>>
>>>>  Hi Ramesh,
>>>>                ranger*.jar is added in the classpath; I can see it in the
>>>> hadoop/lib directory. Can I know the meaning of the following error?
>>>>
>>>>  2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR
>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>> RangerDaoManager.getEntityManager(loggingPU)
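>>>>
>>>>  (For context, as far as I can tell: loggingPU is the JPA persistence
>>>>  unit the Ranger admin uses for its audit database, so this error points
>>>>  at the audit DB connection settings on the admin side rather than at
>>>>  the plugin itself.)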
>>>>
>>>>  thanks
>>>>
>>>>  Mahesh.S
>>>>
>>>>
>>>> On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com>
>>>> wrote:
>>>>
>>>>> Hi Mahesh,
>>>>>
>>>>>   This exception is related to the datanode not coming up for some
>>>>> reason, but the Ranger plugins will be in the namenode.
>>>>>
>>>>>  Do you see the namenode and secondarynamenode running after the ranger
>>>>> installation and after restarting the namenode and secondarynamenode?
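>>>>>
>>>>>  (For instance, a quick check on the node, assuming a stock JDK:)
>>>>>
>>>>>  $ jps | egrep 'NameNode|SecondaryNameNode'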
>>>>>
>>>>>  In the classpath of the namenode I don't see any ranger*.jar. Do you
>>>>> have it in the hadoop/lib directory?
>>>>>
>>>>>  Also, can I get the details of xasecure-hdfs-security.xml from the
>>>>> conf directory?
>>>>>
>>>>>  Regards,
>>>>> Ramesh
>>>>>
>>>>>  On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <
>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>
>>>>>  Hi Gautam,
>>>>>
>>>>>                  Now I am seeing the following exception. Is this
>>>>> causing the problem?
>>>>>
>>>>>  2015-01-14 11:41:23,102 WARN
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
>>>>> java.io.EOFException: End of File Exception between local host is:
>>>>> "bigdata/10.10.10.63"; destination host is: "bigdata":9000; :
>>>>> java.io.EOFException; For more details see:
>>>>> http://wiki.apache.org/hadoop/EOFException
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>> Method)
>>>>> at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>> at
>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>>> at
>>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>>>> at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
>>>>> at
>>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
>>>>> at
>>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
>>>>> at
>>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
>>>>> at
>>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>> Caused by: java.io.EOFException
>>>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>>>> at
>>>>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>>> 2015-01-14 11:41:25,981 ERROR
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
>>>>> 2015-01-14 11:41:25,984 INFO
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
>>>>> /************************************************************
>>>>> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
>>>>> ************************************************************/
>>>>> 2015-01-14 11:42:03,054 INFO
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
>>>>> /************************************************************
>>>>>
>>>>>  Thanks
>>>>> Mahesh.S
>>>>>
>>>>> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <
>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>
>>>>>> Hi Gautam,
>>>>>>
>>>>>>                Here is my namenode log. Kindly see it.
>>>>>>
>>>>>>  /************************************************************
>>>>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>>>>> ************************************************************/
>>>>>> 2015-01-14 11:01:27,345 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>>>>> /************************************************************
>>>>>> STARTUP_MSG: Starting NameNode
>>>>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>>>>> STARTUP_MSG:   args = []
>>>>>> STARTUP_MSG:   version = 2.6.0
>>>>>> STARTUP_MSG:   classpath =
>>>>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2
.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/l
ocal/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr
/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>>>>> STARTUP_MSG:   build =
>>>>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>>>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>>>>> 2014-11-13T21:10Z
>>>>>> STARTUP_MSG:   java = 1.7.0_45
>>>>>> ************************************************************/
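[Editor's note: the classpath in the banner above already lists ranger-hdfs-plugin-0.4.0.jar, ranger-plugins-common-0.4.0.jar, ranger-plugins-impl-0.4.0.jar, ranger-plugins-audit-0.4.0.jar, ranger-plugins-cred-0.4.0.jar and mysql-connector-java.jar under /usr/local/hadoop/share/hadoop/hdfs/lib, so the plugin jars did land on the NameNode's classpath. Note, though, that no Ranger plugin messages appear anywhere in this startup log, which fits the agent never registering with the admin. A minimal sanity-check sketch (the path is taken from the classpath above; adjust if your layout differs):

    # Sketch: list the Ranger jars in the hdfs lib directory the
    # NameNode classpath above is built from.
    import os

    LIB_DIR = "/usr/local/hadoop/share/hadoop/hdfs/lib"
    jars = sorted(j for j in os.listdir(LIB_DIR) if j.startswith("ranger-"))
    print("\n".join(jars) if jars else "no ranger-*.jar found")
]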
>>>>>> 2015-01-14 11:01:27,363 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>>>>> handlers for [TERM, HUP, INT]
>>>>>> 2015-01-14 11:01:27,368 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>>>>> 2015-01-14 11:01:28,029 INFO
>>>>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>>>>> hadoop-metrics2.properties
>>>>>> 2015-01-14 11:01:28,205 INFO
>>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>>>>> period at 10 second(s).
>>>>>> 2015-01-14 11:01:28,205 INFO
>>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>>>>> started
>>>>>> 2015-01-14 11:01:28,209 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>>>>> hdfs://bigdata:9000
>>>>>> 2015-01-14 11:01:28,209 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>>>>> bigdata:9000 to access this namenode/service.
>>>>>> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader:
>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>> builtin-java classes where applicable
>>>>>> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>>>>>> Web-server for hdfs at: http://0.0.0.0:50070
>>>>>> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to
>>>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>>>> org.mortbay.log.Slf4jLog
>>>>>> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog:
>>>>>> Http request log for http.requests.namenode is not defined
>>>>>> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added global filter 'safety'
>>>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added filter static_user_filter
>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>> context hdfs
>>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added filter static_user_filter
>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>> context static
>>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added filter static_user_filter
>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>> context logs
>>>>>> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>>>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>>>>> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> addJerseyResourcePackage:
>>>>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>>>>> pathSpec=/webhdfs/v1/*
>>>>>> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Jetty bound to port 50070
>>>>>> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
>>>>>> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$
>>>>>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>>>>> 2015-01-14 11:01:30,093 WARN
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>>>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>>>>> lack of redundant storage directories!
>>>>>> 2015-01-14 11:01:30,093 WARN
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>>>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>>>>> loss due to lack of redundant storage directories!
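[Editor's note, unrelated to the Ranger question: these two warnings disappear if you list more than one directory, comma-separated, for each property in hdfs-site.xml. The first path below is the one from this log; the second is a placeholder for a directory on another disk:

    <property>
      <name>dfs.namenode.name.dir</name>
      <value>/home/hadoop2/mydata/hdfs/namenode,/mnt/disk2/hdfs/namenode</value>
    </property>
]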
>>>>>> 2015-01-14 11:01:30,184 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>>>>> 2015-01-14 11:01:30,196 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>>>>> 2015-01-14 11:01:30,262 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>>> dfs.block.invalidate.limit=1000
>>>>>> 2015-01-14 11:01:30,262 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>>>>> 2015-01-14 11:01:30,266 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>>>>> 2015-01-14 11:01:30,268 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>>>>> deletion will start around 2015 Jan 14 11:01:30
>>>>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing
>>>>>> capacity for map BlocksMap
>>>>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>   = 64-bit
>>>>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max
>>>>>> memory 889 MB = 17.8 MB
>>>>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>    = 2^21 = 2097152 entries
>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> dfs.block.access.token.enable=false
>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> defaultReplication         = 1
>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>>>>             = 512
>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>>>>             = 1
>>>>>> 2015-01-14 11:01:30,289 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> maxReplicationStreams      = 2
>>>>>> 2015-01-14 11:01:30,290 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> shouldCheckForEnoughRacks  = false
>>>>>> 2015-01-14 11:01:30,290 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> replicationRecheckInterval = 3000
>>>>>> 2015-01-14 11:01:30,290 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> encryptDataTransfer        = false
>>>>>> 2015-01-14 11:01:30,290 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> maxNumBlocksToLog          = 1000
>>>>>> 2015-01-14 11:01:30,298 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>>>>> hadoop2 (auth:SIMPLE)
>>>>>> 2015-01-14 11:01:30,299 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>>>>> supergroup
>>>>>> 2015-01-14 11:01:30,299 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>>>>> true
>>>>>> 2015-01-14 11:01:30,299 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>>>>> 2015-01-14 11:01:30,302 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>>>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing
>>>>>> capacity for map INodeMap
>>>>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>   = 64-bit
>>>>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max
>>>>>> memory 889 MB = 8.9 MB
>>>>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>    = 2^20 = 1048576 entries
>>>>>> 2015-01-14 11:01:30,648 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>>>>> occuring more than 10 times
>>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing
>>>>>> capacity for map cachedBlocks
>>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>   = 64-bit
>>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max
>>>>>> memory 889 MB = 2.2 MB
>>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>    = 2^18 = 262144 entries
>>>>>> 2015-01-14 11:01:30,669 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>>>>> 2015-01-14 11:01:30,669 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>> dfs.namenode.safemode.min.datanodes = 0
>>>>>> 2015-01-14 11:01:30,669 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>> dfs.namenode.safemode.extension     = 30000
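[Editor's note: the odd-looking threshold 0.9990000128746033 is just the default 0.999 passed through a single-precision float; it is not a misconfiguration.]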
>>>>>> 2015-01-14 11:01:30,674 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>>>>> namenode is enabled
>>>>>> 2015-01-14 11:01:30,674 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>>>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>>>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing
>>>>>> capacity for map NameNodeRetryCache
>>>>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>   = 64-bit
>>>>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet:
>>>>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>>>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>    = 2^15 = 32768 entries
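[Editor's note: for anyone puzzled by the GSet "capacity" lines, they follow a simple rule: take the stated percentage of max heap, divide by the size of one object reference, and round down to a power of two. A back-of-the-envelope sketch, assuming 8-byte references on the 64-bit VM reported above, reproduces all four figures:

    # Sketch: reproduce the GSet capacity figures from this log.
    # Assumption: 8 bytes per object reference on a 64-bit VM.
    def gset_capacity(max_heap_mb, percent, ref_bytes=8):
        entries = max_heap_mb * 1024 * 1024 * (percent / 100.0) / ref_bytes
        # largest power of two <= entries
        return 1 << (int(entries).bit_length() - 1)

    for name, pct in [("BlocksMap", 2.0), ("INodeMap", 1.0),
                      ("cachedBlocks", 0.25), ("NameNodeRetryCache", 0.03)]:
        print(name, gset_capacity(889, pct))
    # BlocksMap 2097152 (2^21), INodeMap 1048576 (2^20),
    # cachedBlocks 262144 (2^18), NameNodeRetryCache 32768 (2^15)
]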
>>>>>> 2015-01-14 11:01:30,687 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>>>>> 2015-01-14 11:01:30,687 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>>>>> 2015-01-14 11:01:30,687 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>>>>> 16384
>>>>>> 2015-01-14 11:01:30,729 INFO
>>>>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>>>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>>>>> 11417@bigdata
>>>>>> 2015-01-14 11:01:30,963 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>>>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>>>>> 2015-01-14 11:01:31,065 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>>>>> file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094
>>>>>> ->
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>> 2015-01-14 11:01:31,210 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>>>>> INodes.
>>>>>> 2015-01-14 11:01:31,293 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>>>>> FSImage in 0 seconds.
>>>>>> 2015-01-14 11:01:31,293 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>>>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>>>>> 2015-01-14 11:01:31,294 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5
>>>>>> expecting start txid #84
>>>>>> 2015-01-14 11:01:31,294 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>>> 2015-01-14 11:01:31,299 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:01:31,303 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:01:31,303 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>>>>> expecting start txid #86
>>>>>> 2015-01-14 11:01:31,303 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>>> 2015-01-14 11:01:31,303 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:01:31,304 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:01:31,304 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>>>>> expecting start txid #88
>>>>>> 2015-01-14 11:01:31,304 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>>> 2015-01-14 11:01:31,304 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:01:31,305 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:01:31,305 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>>>>> expecting start txid #90
>>>>>> 2015-01-14 11:01:31,305 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>>> 2015-01-14 11:01:31,306 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:01:31,306 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:01:31,306 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>>>>> expecting start txid #92
>>>>>> 2015-01-14 11:01:31,306 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>>> 2015-01-14 11:01:31,307 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:01:31,307 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:01:31,307 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>>>>> expecting start txid #94
>>>>>> 2015-01-14 11:01:31,308 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>> 2015-01-14 11:01:31,308 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:01:31,313 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>>> 2015-01-14 11:01:31,317 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>>>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>>>>> 2015-01-14 11:01:31,346 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
>>>>>> 2015-01-14 11:01:31,904 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>>>>> entries 0 lookups
>>>>>> 2015-01-14 11:01:31,904 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>>>>> FSImage in 1216 msecs
>>>>>> 2015-01-14 11:01:32,427 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>>>>> bigdata:9000
>>>>>> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager:
>>>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>>>> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting
>>>>>> Socket Reader #1 for port 9000
>>>>>> 2015-01-14 11:01:32,568 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>>>>> FSNamesystemState MBean
>>>>>> 2015-01-14 11:01:32,588 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>>> construction: 0
>>>>>> 2015-01-14 11:01:32,588 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>>> construction: 0
>>>>>> 2015-01-14 11:01:32,588 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>>>>> replication queues
>>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> STATE* Leaving safe mode after 2 secs
>>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> STATE* Network topology has 0 racks and 0 datanodes
>>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> STATE* UnderReplicatedBlocks has 0 blocks
>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>>>>> blocks            = 0
>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>> invalid blocks          = 0
>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>> under-replicated blocks = 0
>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>  over-replicated blocks = 0
>>>>>> 2015-01-14 11:01:32,645 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>> blocks being written    = 0
>>>>>> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> STATE* Replication Queue initialization scan for invalid, over- and
>>>>>> under-replicated blocks completed in 52 msec
>>>>>> 2015-01-14 11:01:32,676 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>>>>> bigdata/10.10.10.63:9000
>>>>>> 2015-01-14 11:01:32,676 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>>>>> required for active state
>>>>>> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> Responder: starting
>>>>>> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> listener on 9000: starting
>>>>>> 2015-01-14 11:01:32,697 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>>>>> 2015-01-14 11:01:32,697 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Rescanning after 4192060 milliseconds
>>>>>> 2015-01-14 11:01:32,704 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
>>>>>> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63,
>>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>>> ipcPort=50020,
>>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>>>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>>>>> 2015-01-14 11:01:38,039 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>>> failed storage changes from 0 to 0
>>>>>> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology:
>>>>>> Adding a new node: /default-rack/10.10.10.63:50010
>>>>>> 2015-01-14 11:01:38,557 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>>> failed storage changes from 0 to 0
>>>>>> 2015-01-14 11:01:38,562 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>>>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>>>>> 10.10.10.63:50010
>>>>>> 2015-01-14 11:01:38,692 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>>>>> processReport: Received first block report from
>>>>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>>>>> starting up or becoming active. Its block contents are no longer considered
>>>>>> stale
>>>>>> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport:
>>>>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>>>>> DatanodeRegistration(10.10.10.63,
>>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>>> ipcPort=50020,
>>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>>>>> blocks: 0, hasStaleStorages: false, processing time: 9 msecs
>>>>>> 2015-01-14 11:02:02,697 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Rescanning after 30000 milliseconds
>>>>>> 2015-01-14 11:02:02,698 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>>> 2015-01-14 11:02:21,288 ERROR
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
>>>>>> 2015-01-14 11:02:21,291 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
>>>>>> /************************************************************
>>>>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>>>>> ************************************************************/
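[Editor's note: the SIGTERM above is a normal stop; about forty seconds later the NameNode starts again and, as the repeated log below shows, walks through exactly the same sequence, again with no Ranger plugin messages anywhere in the startup.]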
>>>>>> 2015-01-14 11:03:02,845 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>>>>> /************************************************************
>>>>>> STARTUP_MSG: Starting NameNode
>>>>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>>>>> STARTUP_MSG:   args = []
>>>>>> STARTUP_MSG:   version = 2.6.0
>>>>>> STARTUP_MSG:   classpath =
>>>>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2
.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/l
ocal/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr
/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>>>>> STARTUP_MSG:   build =
>>>>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>>>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>>>>> 2014-11-13T21:10Z
>>>>>> STARTUP_MSG:   java = 1.7.0_45
>>>>>> ************************************************************/
>>>>>> 2015-01-14 11:03:02,861 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>>>>> handlers for [TERM, HUP, INT]
>>>>>> 2015-01-14 11:03:02,866 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>>>>> 2015-01-14 11:03:03,521 INFO
>>>>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>>>>> hadoop-metrics2.properties
>>>>>> 2015-01-14 11:03:03,697 INFO
>>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>>>>> period at 10 second(s).
>>>>>> 2015-01-14 11:03:03,697 INFO
>>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>>>>> started
>>>>>> 2015-01-14 11:03:03,700 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>>>>> hdfs://bigdata:9000
>>>>>> 2015-01-14 11:03:03,701 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>>>>> bigdata:9000 to access this namenode/service.
>>>>>> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader:
>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>> builtin-java classes where applicable
>>>>>> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>>>>>> Web-server for hdfs at: http://0.0.0.0:50070
>>>>>> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to
>>>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>>>> org.mortbay.log.Slf4jLog
>>>>>> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog:
>>>>>> Http request log for http.requests.namenode is not defined
>>>>>> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added global filter 'safety'
>>>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added filter static_user_filter
>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>> context hdfs
>>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added filter static_user_filter
>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>> context logs
>>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added filter static_user_filter
>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>>> context static
>>>>>> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Added filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>>>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>>>>> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> addJerseyResourcePackage:
>>>>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>>>>> pathSpec=/webhdfs/v1/*
>>>>>> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2:
>>>>>> Jetty bound to port 50070
>>>>>> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
>>>>>> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$
>>>>>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>>>>> 2015-01-14 11:03:05,645 WARN
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>>>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>>>>> lack of redundant storage directories!
>>>>>> 2015-01-14 11:03:05,645 WARN
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>>>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>>>>> loss due to lack of redundant storage directories!
>>>>>> 2015-01-14 11:03:05,746 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>>>>> 2015-01-14 11:03:05,761 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>>>>> 2015-01-14 11:03:05,837 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>>> dfs.block.invalidate.limit=1000
>>>>>> 2015-01-14 11:03:05,837 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>>>>> 2015-01-14 11:03:05,841 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>>>>> 2015-01-14 11:03:05,843 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>>>>> deletion will start around 2015 Jan 14 11:03:05
>>>>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing
>>>>>> capacity for map BlocksMap
>>>>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>   = 64-bit
>>>>>> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max
>>>>>> memory 889 MB = 17.8 MB
>>>>>> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>    = 2^21 = 2097152 entries
>>>>>> 2015-01-14 11:03:05,864 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> dfs.block.access.token.enable=false
>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> defaultReplication         = 1
>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>>>>             = 512
>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>>>>             = 1
>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> maxReplicationStreams      = 2
>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> shouldCheckForEnoughRacks  = false
>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> replicationRecheckInterval = 3000
>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> encryptDataTransfer        = false
>>>>>> 2015-01-14 11:03:05,865 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>>> maxNumBlocksToLog          = 1000
>>>>>> 2015-01-14 11:03:05,874 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>>>>> hadoop2 (auth:SIMPLE)
>>>>>> 2015-01-14 11:03:05,874 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>>>>> supergroup
>>>>>> 2015-01-14 11:03:05,874 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>>>>> true
>>>>>> 2015-01-14 11:03:05,875 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>>>>> 2015-01-14 11:03:05,878 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing
>>>>>> capacity for map INodeMap
>>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>   = 64-bit
>>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max
>>>>>> memory 889 MB = 8.9 MB
>>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>    = 2^20 = 1048576 entries
>>>>>> 2015-01-14 11:03:06,284 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>>>>> occuring more than 10 times
>>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing
>>>>>> capacity for map cachedBlocks
>>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>   = 64-bit
>>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max
>>>>>> memory 889 MB = 2.2 MB
>>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>    = 2^18 = 262144 entries
>>>>>> 2015-01-14 11:03:06,301 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>>>>> 2015-01-14 11:03:06,301 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>> dfs.namenode.safemode.min.datanodes = 0
>>>>>> 2015-01-14 11:03:06,301 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>>> dfs.namenode.safemode.extension     = 30000
>>>>>> 2015-01-14 11:03:06,304 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>>>>> namenode is enabled
>>>>>> 2015-01-14 11:03:06,304 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>>>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing
>>>>>> capacity for map NameNodeRetryCache
>>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type
>>>>>>   = 64-bit
>>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet:
>>>>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity
>>>>>>    = 2^15 = 32768 entries
>>>>>> 2015-01-14 11:03:06,317 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>>>>> 2015-01-14 11:03:06,318 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>>>>> 2015-01-14 11:03:06,318 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>>>>> 16384
>>>>>> 2015-01-14 11:03:06,368 INFO
>>>>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>>>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>>>>> 13312@bigdata
>>>>>> 2015-01-14 11:03:06,532 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>>>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>>>>> 2015-01-14 11:03:06,622 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>>>>> file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095
>>>>>> ->
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>>> 2015-01-14 11:03:06,807 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>>>>> INodes.
>>>>>> 2015-01-14 11:03:06,888 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>>>>> FSImage in 0 seconds.
>>>>>> 2015-01-14 11:03:06,888 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>>>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>>>>> 2015-01-14 11:03:06,889 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>>>>> expecting start txid #84
>>>>>> 2015-01-14 11:03:06,889 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>>> 2015-01-14 11:03:06,893 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:03:06,897 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:03:06,897 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>>>>> expecting start txid #86
>>>>>> 2015-01-14 11:03:06,898 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>>> 2015-01-14 11:03:06,898 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:03:06,898 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:03:06,899 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>>>>> expecting start txid #88
>>>>>> 2015-01-14 11:03:06,899 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>>> 2015-01-14 11:03:06,899 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:03:06,899 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:03:06,900 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>>>>> expecting start txid #90
>>>>>> 2015-01-14 11:03:06,900 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>>> 2015-01-14 11:03:06,900 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:03:06,901 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:03:06,901 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>>>>> expecting start txid #92
>>>>>> 2015-01-14 11:03:06,901 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>>> 2015-01-14 11:03:06,901 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:03:06,902 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>>> 2015-01-14 11:03:06,902 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b
>>>>>> expecting start txid #94
>>>>>> 2015-01-14 11:03:06,902 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>> 2015-01-14 11:03:06,902 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:03:06,907 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>>> 2015-01-14 11:03:06,908 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2
>>>>>> expecting start txid #95
>>>>>> 2015-01-14 11:03:06,908 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>>> 2015-01-14 11:03:06,908 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>>> stream
>>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095'
>>>>>> to transaction ID 84
>>>>>> 2015-01-14 11:03:07,266 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>>> 2015-01-14 11:03:07,274 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>>>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>>>>> 2015-01-14 11:03:07,313 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
>>>>>> 2015-01-14 11:03:07,558 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>>>>> entries 0 lookups
>>>>>> 2015-01-14 11:03:07,559 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>>>>> FSImage in 1240 msecs
>>>>>> 2015-01-14 11:03:08,011 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>>>>> bigdata:9000
>>>>>> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager:
>>>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>>>> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting
>>>>>> Socket Reader #1 for port 9000
>>>>>> 2015-01-14 11:03:08,151 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>>>>> FSNamesystemState MBean
>>>>>> 2015-01-14 11:03:08,173 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>>> construction: 0
>>>>>> 2015-01-14 11:03:08,173 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>>> construction: 0
>>>>>> 2015-01-14 11:03:08,173 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>>>>> replication queues
>>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> STATE* Leaving safe mode after 2 secs
>>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> STATE* Network topology has 0 racks and 0 datanodes
>>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> STATE* UnderReplicatedBlocks has 0 blocks
>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>>>>> blocks            = 0
>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>> invalid blocks          = 0
>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>> under-replicated blocks = 0
>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>>  over-replicated blocks = 0
>>>>>> 2015-01-14 11:03:08,194 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>> blocks being written    = 0
>>>>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> STATE* Replication Queue initialization scan for invalid, over- and
>>>>>> under-replicated blocks completed in 18 msec
>>>>>> 2015-01-14 11:03:08,322 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>>>>> bigdata/10.10.10.63:9000
>>>>>> 2015-01-14 11:03:08,322 INFO
>>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>>>>> required for active state
>>>>>> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> Responder: starting
>>>>>> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> listener on 9000: starting
>>>>>> 2015-01-14 11:03:08,349 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>>>>> 2015-01-14 11:03:08,349 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Rescanning after 4287712 milliseconds
>>>>>> 2015-01-14 11:03:08,350 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>>> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange:
>>>>>> BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63,
>>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>>> ipcPort=50020,
>>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>>>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>>>>> 2015-01-14 11:03:13,244 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>>> failed storage changes from 0 to 0
>>>>>> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology:
>>>>>> Adding a new node: /default-rack/10.10.10.63:50010
>>>>>> 2015-01-14 11:03:13,743 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>>> failed storage changes from 0 to 0
>>>>>> 2015-01-14 11:03:13,750 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>>>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>>>>> 10.10.10.63:50010
>>>>>> 2015-01-14 11:03:13,959 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>>>>> processReport: Received first block report from
>>>>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>>>>> starting up or becoming active. Its block contents are no longer considered
>>>>>> stale
>>>>>> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport:
>>>>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>>>>> DatanodeRegistration(10.10.10.63,
>>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>>> ipcPort=50020,
>>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>>>>> blocks: 0, hasStaleStorages: false, processing time: 11 msecs
>>>>>> 2015-01-14 11:03:38,349 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Rescanning after 30000 milliseconds
>>>>>> 2015-01-14 11:03:38,350 INFO
>>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>>> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
>>>>>>
>>>>>>
>>>>>>  Thanks
>>>>>> Mahesh.S
>>>>>>
>>>>>>
>>>>>> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>>  Hi Mahesh,
>>>>>>>      We will need the NameNode logs to debug this further. Can you
>>>>>>> restart the NameNode and paste its logs somewhere for us to analyze?
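>>>>>>>
>>>>>>> A minimal sketch, assuming a tarball install with HADOOP_HOME set and
>>>>>>> the default log directory (paths and file names here are assumptions;
>>>>>>> adjust them to your setup):
>>>>>>>
>>>>>>>   # restart only the NameNode daemon
>>>>>>>   $HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
>>>>>>>   $HADOOP_HOME/sbin/hadoop-daemon.sh start namenode
>>>>>>>   # the NameNode log is usually written to
>>>>>>>   # $HADOOP_HOME/logs/hadoop-<user>-namenode-<hostname>.log
>>>>>>>   tail -n 500 $HADOOP_HOME/logs/hadoop-*-namenode-*.log
>>>>>>>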
>>>>>>> Thanks.
>>>>>>>
>>>>>>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <
>>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi Ramesh,
>>>>>>>>
>>>>>>>>                    I didn't see any exception in the HDFS logs. My
>>>>>>>> problem is that the agent for HDFS is not created.
>>>>>>>>
>>>>>>>>  Regards,
>>>>>>>> Mahesh.S
>>>>>>>>
>>>>>>>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rmani@hortonworks.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Hi Mahesh,
>>>>>>>>>
>>>>>>>>>  The error you are seeing is just a notice that the parent folder
>>>>>>>>> of the resource you are creating doesn't have read permission for the
>>>>>>>>> user for whom you are creating the policy.
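>>>>>>>>>
>>>>>>>>>  For example, a quick way to confirm is to check the parent folder's
>>>>>>>>> permissions from the shell (a sketch; /data/mydir and the user mahesh
>>>>>>>>> are hypothetical, substitute your policy's resource path and user):
>>>>>>>>>
>>>>>>>>>   # show permissions on the parent folder itself, not its contents
>>>>>>>>>   hdfs dfs -ls -d /data
>>>>>>>>>   # if the user lacks read access, one option is to open it up:
>>>>>>>>>   hdfs dfs -chmod o+rx /data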
>>>>>>>>>
>>>>>>>>>  When you start the HDFS NameNode and SecondaryNameNode, do you see
>>>>>>>>> any exceptions in the HDFS logs?
>>>>>>>>>
>>>>>>>>>  Regards,
>>>>>>>>> Ramesh
>>>>>>>>>
>>>>>>>>>   On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <
>>>>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>   Hi all,
>>>>>>>>>
>>>>>>>>>  I successfully configured ranger admin,user sync.now am trying
>>>>>>>>> to configure hdfs plugin.My steps are following,
>>>>>>>>>
>>>>>>>>>  1.Created repository testhdfs.
>>>>>>>>> 2.cd /usr/local
>>>>>>>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>>>>>>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>>>>>>>> 5.cd ranger-hdfs-plugin
>>>>>>>>> 6.vi install.properties
>>>>>>>>>  POLICY_MGR_URL=http://IP:6080
>>>>>>>>>           REPOSITORY_NAME=testhdfs
>>>>>>>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>>>>>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>>>>>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>>>>>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>>>>>>>> 7.cd /usr/local/hadoop
>>>>>>>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>>>>>>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>>>>>>>> 10.cd /usr/local/ranger-hdfs-plugin
>>>>>>>>> 11../enable-hdfs-plugin.sh
>>>>>>>>> 12.cp /usr/local/hadoop/lib/*
>>>>>>>>> /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>>>>>>> 13.vi xasecure-audit.xml
>>>>>>>>>  <property>
>>>>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>>>>>>>                    <value>jdbc:mysql://localhost/ranger</value>
>>>>>>>>>                    </property>
>>>>>>>>>                    <property>
>>>>>>>>>
>>>>>>>>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>>>>>>>                    <value>rangerlogger</value>
>>>>>>>>>                    </property>
>>>>>>>>>                    <property>
>>>>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>>>>>>>                    <value>rangerlogger</value>
>>>>>>>>>                    </property>
>>>>>>>>> 14.Restarted hadoop
>>>>>>>>> when i see Ranger Admin Web interface -> Audit -> Agents
>>>>>>>>> agent is not created.Am i missed any steps.
>>>>>>>>>
>>>>>>>>>  *NOTE:I am not using HDP.*
>>>>>>>>>
>>>>>>>>>  *here is my xa_portal.log*
>>>>>>>>>
>>>>>>>>>  2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_default.properties]
>>>>>>>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_system.properties]
>>>>>>>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_custom.properties]
>>>>>>>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>>> builtin-java classes where applicable
>>>>>>>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>>>>>>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>> user
>>>>>>>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>>>>>>>>> requestId=10.10.10.53
>>>>>>>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>>>>>>>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>>>>>>>>> create:nfsnobody@bigdata
>>>>>>>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_default.properties]
>>>>>>>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_system.properties]
>>>>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_custom.properties]
>>>>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>>> builtin-java classes where applicable
>>>>>>>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>>>>>>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>> user
>>>>>>>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>>>>>>>>> requestId=10.10.10.53
>>>>>>>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>>>>>>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>> user
>>>>>>>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>>>>>>>>> requestId=10.10.10.53
>>>>>>>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>>>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>>>>> Login: security not enabled, using username
>>>>>>>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>>>>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>>>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>>>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>>>>> Login: security not enabled, using username
>>>>>>>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>>>>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>>>>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>>> at
>>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>> at
>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>> at
>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>>> at
>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>>> at
>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>>> at
>>>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>>> at
>>>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>>> at
>>>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>>> at
>>>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>>>>> Validation error:logMessage=null,
>>>>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>>>>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>>>>> to save this policy?}
>>>>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>>>>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>>>>> }
>>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>>> at
>>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>> at
>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>> at
>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>>> at
>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>>> at
>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>>> at
>>>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>>> at
>>>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>>> at
>>>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>>> at
>>>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>>>>>>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>> user
>>>>>>>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>>>>>>>>> requestId=10.10.10.53
>>>>>>>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_default.properties]
>>>>>>>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_system.properties]
>>>>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_custom.properties]
>>>>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>>> builtin-java classes where applicable
>>>>>>>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>>>>>>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>> user
>>>>>>>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>>>>>>>>> requestId=10.10.10.53
>>>>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>>>>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>>>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>>>>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>>>>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>>> at
>>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>> at
>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>> at
>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>>> at
>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>>> at
>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>>> at
>>>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>>> at
>>>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>>> at
>>>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>>> at
>>>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>>>>> Validation error:logMessage=null,
>>>>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
>>>>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>>>>> to save this policy?}
>>>>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
>>>>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>>>>> }
>>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>>> at
>>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>>> at
>>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>> at
>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>> at
>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>>> at
>>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>>> at
>>>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>>> at
>>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>>> at
>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>>> at
>>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>>> at
>>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>>> at
>>>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>>> at
>>>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>>> at
>>>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>>> at
>>>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>>> at
>>>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
>>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>>> Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>>>>>>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>>> user
>>>>>>>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
>>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>>> loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
>>>>>>>>> requestId=10.10.10.53
>>>>>>>>>
>>>>>>>>>  Thanks
>>>>>>>>> Mahesh.S
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>  --
>>>>>>> Regards,
>>>>>>> Gautam.
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>> --
>>> Regards,
>>> Gautam.
>>>
>>
>>
>
>
> --
> Regards,
> Gautam.
>

Re: Hdfs agent not created

Posted by Gautam Borad <gb...@gmail.com>.
The exported values are not guaranteed to be preserved in your current bash
session, so an empty echo there does not prove much. Please put an echo
statement inside the set-hdfs-plugin-env.sh script itself to debug it.
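
For example (purely illustrative; the exact contents of the script depend on
your build, but these are the two variables it is expected to populate), you
could add something like this near the end of set-hdfs-plugin-env.sh and then
restart the namenode:

  # debug: print the options the Ranger plugin script should have populated
  echo "DEBUG: HADOOP_NAMENODE_OPTS=${HADOOP_NAMENODE_OPTS}"
  echo "DEBUG: HADOOP_SECONDARYNAMENODE_OPTS=${HADOOP_SECONDARYNAMENODE_OPTS}"

If nothing is printed while the namenode starts up, the script is not being
sourced at all.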


On Wed, Jan 14, 2015 at 4:35 PM, Mahesh Sankaran <sa...@gmail.com>
wrote:

> Hi Gautam and Hanish,
>
>                     Thank you for the quick reply. The echo statements for
> *HADOOP_NAMENODE_OPTS* and
> *HADOOP_SECONDARYNAMENODE_OPTS* did not return any values.
>
> [root@bigdata conf]# echo $HADOOP_SECONDARYNAMENODE_OPTS
>
> [root@bigdata conf]# echo $HADOOP_NAMENODE_OPTS
>
> [root@bigdata conf]#
>
>
> Thanks
> Mahesh.S
>
> On Wed, Jan 14, 2015 at 4:15 PM, Gautam Borad <gb...@gmail.com> wrote:
>
>> @Hanish/Ramesh, If we check the logs carefully, we can see that the ranger
>> libs are getting loaded in the classpath:
>>
>> /usr/local/hadoop/
>>>
>>> share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar
>>>
>>
>> @Mahesh, I suspect some other problem. Can you put echo statements in the
>> set-hdfs-plugin-env.sh script and debug it? Ideally, after the script is
>> executed, *HADOOP_NAMENODE_OPTS* and *HADOOP_SECONDARYNAMENODE_OPTS*
>> should contain the -javaagent line.
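>>
>> As a rough illustration (the path and jar name below are only placeholders,
>> not the actual ones from your install), a correctly populated variable
>> would look something like:
>>
>>   $ echo $HADOOP_NAMENODE_OPTS
>>   -javaagent:/usr/local/ranger-hdfs-plugin/lib/<agent-jar>   # hypothetical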
>>
>>
>>
>> On Wed, Jan 14, 2015 at 3:46 PM, Hanish Bansal <
>> hanish.bansal@impetus.co.in> wrote:
>>
>>>     Hi Mahesh,
>>>
>>>
>>>  Could you try one thing: copy all the jar files from
>>> ${hadoop_home}/lib to the hadoop share directory.
>>>
>>>
>>>  $ cp <hadoop-home>/lib/* <hadoop-home>/share/hadoop/hdfs/lib/
>>>
>>>
>>>  The issue may be that hadoop is not able to pick up the ranger jars from
>>> the lib directory.
>>>
>>>
>>>  After copying the jars, restart hadoop and check whether the agent has started.
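>>>
>>>  For example, with the paths used earlier in this thread (adjust them if
>>>  your layout differs):
>>>
>>>  $ cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>  # sanity check: the ranger-* plugin jars should now be listed
>>>  $ ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep ranger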
>>>
>>>
>>>
>>>      -------
>>>
>>> *Thanks & Regards, Hanish Bansal*
>>> Software Engineer, iLabs
>>> Impetus Infotech Pvt. Ltd.
>>>
>>>      ------------------------------
>>> *From:* Mahesh Sankaran <sa...@gmail.com>
>>> *Sent:* Wednesday, January 14, 2015 3:33 PM
>>> *To:* user@ranger.incubator.apache.org
>>> *Subject:* Re: Hdfs agent not created
>>>
>>>  Hi Ramesh,
>>>                ranger*.jar is added to the classpath; I can see it in the
>>> hadoop/lib directory. Can you tell me the meaning of the following error?
>>>
>>>  2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR
>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>> RangerDaoManager.getEntityManager(loggingPU)
>>>
>>>  thanks
>>>
>>>  Mahesh.S
>>>
>>>
>>> On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com>
>>> wrote:
>>>
>>>> Hi Mahesh,
>>>>
>>>>   This exception is related to the datanode not coming up for some reason,
>>>> but the Ranger plugin will be in the name node.
>>>>
>>>>  Do you see the namenode and secondarynamenode running after the ranger
>>>> installation and after restarting the name node and secondarynamenode?
>>>>
>>>>  In the classpath of the namenode I don’t see any ranger*.jar; do you
>>>> have it in the hadoop/lib directory?
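>>>>
>>>>  A quick way to check is to print hadoop's effective classpath and grep
>>>>  for the plugin jars (standard hadoop CLI, nothing ranger-specific):
>>>>
>>>>  $ hadoop classpath | tr ':' '\n' | grep -i ranger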
>>>>
>>>>  Also, can I get the details of xasecure-hdfs-security.xml from the
>>>> conf directory?
>>>>
>>>>  Regards,
>>>> Ramesh
>>>>
>>>>  On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <
>>>> sankarmahesh37@gmail.com> wrote:
>>>>
>>>>  Hi Gautam,
>>>>
>>>>                  Now I am seeing the following exception. Is this causing
>>>> the problem?
>>>>
>>>>  2015-01-14 11:41:23,102 WARN
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
>>>> java.io.EOFException: End of File Exception between local host is:
>>>> "bigdata/10.10.10.63"; destination host is: "bigdata":9000; :
>>>> java.io.EOFException; For more details see:
>>>> http://wiki.apache.org/hadoop/EOFException
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>> at
>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>> at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>>> at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
>>>> at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
>>>> at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
>>>> at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
>>>> at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
>>>> at java.lang.Thread.run(Thread.java:744)
>>>> Caused by: java.io.EOFException
>>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>>> at
>>>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>> 2015-01-14 11:41:25,981 ERROR
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
>>>> 2015-01-14 11:41:25,984 INFO
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
>>>> /************************************************************
>>>> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
>>>> ************************************************************/
>>>> 2015-01-14 11:42:03,054 INFO
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
>>>> /************************************************************
>>>>
>>>>  Thanks
>>>> Mahesh.S
>>>>
>>>> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <
>>>> sankarmahesh37@gmail.com> wrote:
>>>>
>>>>> Hi Gautam,
>>>>>
>>>>>                Here is my namenode log. Kindly take a look at it.
>>>>>
>>>>>  /************************************************************
>>>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>>>> ************************************************************/
>>>>> 2015-01-14 11:01:27,345 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>>>> /************************************************************
>>>>> STARTUP_MSG: Starting NameNode
>>>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>>>> STARTUP_MSG:   args = []
>>>>> STARTUP_MSG:   version = 2.6.0
>>>>> STARTUP_MSG:   classpath =
>>>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.
4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/lo
cal/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/
local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>>>> STARTUP_MSG:   build =
>>>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>>>> 2014-11-13T21:10Z
>>>>> STARTUP_MSG:   java = 1.7.0_45
>>>>> ************************************************************/
>>>>> 2015-01-14 11:01:27,363 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>>>> handlers for [TERM, HUP, INT]
>>>>> 2015-01-14 11:01:27,368 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>>>> 2015-01-14 11:01:28,029 INFO
>>>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>>>> hadoop-metrics2.properties
>>>>> 2015-01-14 11:01:28,205 INFO
>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>>>> period at 10 second(s).
>>>>> 2015-01-14 11:01:28,205 INFO
>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>>>> started
>>>>> 2015-01-14 11:01:28,209 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>>>> hdfs://bigdata:9000
>>>>> 2015-01-14 11:01:28,209 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>>>> bigdata:9000 to access this namenode/service.
>>>>> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader:
>>>>> Unable to load native-hadoop library for your platform... using
>>>>> builtin-java classes where applicable
>>>>> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>>>>> Web-server for hdfs at: http://0.0.0.0:50070
>>>>> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to
>>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>>> org.mortbay.log.Slf4jLog
>>>>> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog:
>>>>> Http request log for http.requests.namenode is not defined
>>>>> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> global filter 'safety'
>>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> filter static_user_filter
>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>> context hdfs
>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> filter static_user_filter
>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>> context static
>>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> filter static_user_filter
>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>> context logs
>>>>> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>>>> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2:
>>>>> addJerseyResourcePackage:
>>>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>>>> pathSpec=/webhdfs/v1/*
>>>>> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty
>>>>> bound to port 50070
>>>>> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
>>>>> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$
>>>>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>>>> 2015-01-14 11:01:30,093 WARN
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>>>> lack of redundant storage directories!
>>>>> 2015-01-14 11:01:30,093 WARN
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>>>> loss due to lack of redundant storage directories!
>>>>> 2015-01-14 11:01:30,184 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>>>> 2015-01-14 11:01:30,196 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>>>> 2015-01-14 11:01:30,262 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>> dfs.block.invalidate.limit=1000
>>>>> 2015-01-14 11:01:30,262 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>>>> 2015-01-14 11:01:30,266 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>>>> 2015-01-14 11:01:30,268 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>>>> deletion will start around 2015 Jan 14 11:01:30
>>>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing
>>>>> capacity for map BlocksMap
>>>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type
>>>>>   = 64-bit
>>>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max
>>>>> memory 889 MB = 17.8 MB
>>>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity
>>>>>  = 2^21 = 2097152 entries
>>>>> 2015-01-14 11:01:30,289 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> dfs.block.access.token.enable=false
>>>>> 2015-01-14 11:01:30,289 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> defaultReplication         = 1
>>>>> 2015-01-14 11:01:30,289 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>>>             = 512
>>>>> 2015-01-14 11:01:30,289 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>>>             = 1
>>>>> 2015-01-14 11:01:30,289 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> maxReplicationStreams      = 2
>>>>> 2015-01-14 11:01:30,290 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> shouldCheckForEnoughRacks  = false
>>>>> 2015-01-14 11:01:30,290 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> replicationRecheckInterval = 3000
>>>>> 2015-01-14 11:01:30,290 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> encryptDataTransfer        = false
>>>>> 2015-01-14 11:01:30,290 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> maxNumBlocksToLog          = 1000
>>>>> 2015-01-14 11:01:30,298 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>>>> hadoop2 (auth:SIMPLE)
>>>>> 2015-01-14 11:01:30,299 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>>>> supergroup
>>>>> 2015-01-14 11:01:30,299 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>>>> true
>>>>> 2015-01-14 11:01:30,299 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>>>> 2015-01-14 11:01:30,302 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing
>>>>> capacity for map INodeMap
>>>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type
>>>>>   = 64-bit
>>>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max
>>>>> memory 889 MB = 8.9 MB
>>>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity
>>>>>  = 2^20 = 1048576 entries
>>>>> 2015-01-14 11:01:30,648 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>>>> occuring more than 10 times
>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing
>>>>> capacity for map cachedBlocks
>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type
>>>>>   = 64-bit
>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max
>>>>> memory 889 MB = 2.2 MB
>>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity
>>>>>  = 2^18 = 262144 entries
>>>>> 2015-01-14 11:01:30,669 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>>>> 2015-01-14 11:01:30,669 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>> dfs.namenode.safemode.min.datanodes = 0
>>>>> 2015-01-14 11:01:30,669 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>> dfs.namenode.safemode.extension     = 30000
>>>>> 2015-01-14 11:01:30,674 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>>>> namenode is enabled
>>>>> 2015-01-14 11:01:30,674 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing
>>>>> capacity for map NameNodeRetryCache
>>>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type
>>>>>   = 64-bit
>>>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet:
>>>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity
>>>>>  = 2^15 = 32768 entries
>>>>> 2015-01-14 11:01:30,687 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>>>> 2015-01-14 11:01:30,687 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>>>> 2015-01-14 11:01:30,687 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>>>> 16384
>>>>> 2015-01-14 11:01:30,729 INFO
>>>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>>>> 11417@bigdata
>>>>> 2015-01-14 11:01:30,963 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>>>> 2015-01-14 11:01:31,065 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>>>> file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094
>>>>> ->
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>> 2015-01-14 11:01:31,210 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>>>> INodes.
>>>>> 2015-01-14 11:01:31,293 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>>>> FSImage in 0 seconds.
>>>>> 2015-01-14 11:01:31,293 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>>>> 2015-01-14 11:01:31,294 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5
>>>>> expecting start txid #84
>>>>> 2015-01-14 11:01:31,294 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>> 2015-01-14 11:01:31,299 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:01:31,303 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:01:31,303 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>>>> expecting start txid #86
>>>>> 2015-01-14 11:01:31,303 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>> 2015-01-14 11:01:31,303 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:01:31,304 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:01:31,304 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>>>> expecting start txid #88
>>>>> 2015-01-14 11:01:31,304 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>> 2015-01-14 11:01:31,304 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:01:31,305 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:01:31,305 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>>>> expecting start txid #90
>>>>> 2015-01-14 11:01:31,305 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>> 2015-01-14 11:01:31,306 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:01:31,306 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:01:31,306 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>>>> expecting start txid #92
>>>>> 2015-01-14 11:01:31,306 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>> 2015-01-14 11:01:31,307 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:01:31,307 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:01:31,307 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>>>> expecting start txid #94
>>>>> 2015-01-14 11:01:31,308 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>> 2015-01-14 11:01:31,308 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:01:31,313 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>> 2015-01-14 11:01:31,317 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>>>> 2015-01-14 11:01:31,346 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
>>>>> 2015-01-14 11:01:31,904 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>>>> entries 0 lookups
>>>>> 2015-01-14 11:01:31,904 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>>>> FSImage in 1216 msecs
>>>>> 2015-01-14 11:01:32,427 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>>>> bigdata:9000
>>>>> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager:
>>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>>> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting
>>>>> Socket Reader #1 for port 9000
>>>>> 2015-01-14 11:01:32,568 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>>>> FSNamesystemState MBean
>>>>> 2015-01-14 11:01:32,588 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>> construction: 0
>>>>> 2015-01-14 11:01:32,588 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>> construction: 0
>>>>> 2015-01-14 11:01:32,588 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>>>> replication queues
>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> STATE* Leaving safe mode after 2 secs
>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> STATE* Network topology has 0 racks and 0 datanodes
>>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> STATE* UnderReplicatedBlocks has 0 blocks
>>>>> 2015-01-14 11:01:32,645 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>>>> blocks            = 0
>>>>> 2015-01-14 11:01:32,645 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>> invalid blocks          = 0
>>>>> 2015-01-14 11:01:32,645 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>> under-replicated blocks = 0
>>>>> 2015-01-14 11:01:32,645 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>  over-replicated blocks = 0
>>>>> 2015-01-14 11:01:32,645 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>> blocks being written    = 0
>>>>> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> STATE* Replication Queue initialization scan for invalid, over- and
>>>>> under-replicated blocks completed in 52 msec
>>>>> 2015-01-14 11:01:32,676 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>>>> bigdata/10.10.10.63:9000
>>>>> 2015-01-14 11:01:32,676 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>>>> required for active state
>>>>> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>> Responder: starting
>>>>> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>> listener on 9000: starting
>>>>> 2015-01-14 11:01:32,697 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>>>> 2015-01-14 11:01:32,697 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Rescanning after 4192060 milliseconds
>>>>> 2015-01-14 11:01:32,704 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
>>>>> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63,
>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>> ipcPort=50020,
>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>>>> 2015-01-14 11:01:38,039 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>> failed storage changes from 0 to 0
>>>>> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology:
>>>>> Adding a new node: /default-rack/10.10.10.63:50010
>>>>> 2015-01-14 11:01:38,557 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>> failed storage changes from 0 to 0
>>>>> 2015-01-14 11:01:38,562 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>>>> 10.10.10.63:50010
>>>>> 2015-01-14 11:01:38,692 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>>>> processReport: Received first block report from
>>>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>>>> starting up or becoming active. Its block contents are no longer considered
>>>>> stale
>>>>> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport:
>>>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>>>> DatanodeRegistration(10.10.10.63,
>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>> ipcPort=50020,
>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>>>> blocks: 0, hasStaleStorages: false, processing time: 9 msecs
>>>>> 2015-01-14 11:02:02,697 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Rescanning after 30000 milliseconds
>>>>> 2015-01-14 11:02:02,698 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>> 2015-01-14 11:02:21,288 ERROR
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
>>>>> 2015-01-14 11:02:21,291 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
>>>>> /************************************************************
>>>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>>>> ************************************************************/
>>>>> 2015-01-14 11:03:02,845 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>>>> /************************************************************
>>>>> STARTUP_MSG: Starting NameNode
>>>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>>>> STARTUP_MSG:   args = []
>>>>> STARTUP_MSG:   version = 2.6.0
>>>>> STARTUP_MSG:   classpath =
>>>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.
4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/lo
cal/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/
local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>>>> STARTUP_MSG:   build =
>>>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>>>> 2014-11-13T21:10Z
>>>>> STARTUP_MSG:   java = 1.7.0_45
>>>>> ************************************************************/
>>>>> 2015-01-14 11:03:02,861 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>>>> handlers for [TERM, HUP, INT]
>>>>> 2015-01-14 11:03:02,866 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>>>> 2015-01-14 11:03:03,521 INFO
>>>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>>>> hadoop-metrics2.properties
>>>>> 2015-01-14 11:03:03,697 INFO
>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>>>> period at 10 second(s).
>>>>> 2015-01-14 11:03:03,697 INFO
>>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>>>> started
>>>>> 2015-01-14 11:03:03,700 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>>>> hdfs://bigdata:9000
>>>>> 2015-01-14 11:03:03,701 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>>>> bigdata:9000 to access this namenode/service.
>>>>> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader:
>>>>> Unable to load native-hadoop library for your platform... using
>>>>> builtin-java classes where applicable
>>>>> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>>>>> Web-server for hdfs at: http://0.0.0.0:50070
>>>>> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to
>>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>>> org.mortbay.log.Slf4jLog
>>>>> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog:
>>>>> Http request log for http.requests.namenode is not defined
>>>>> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> global filter 'safety'
>>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> filter static_user_filter
>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>> context hdfs
>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> filter static_user_filter
>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>> context logs
>>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> filter static_user_filter
>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>>> context static
>>>>> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added
>>>>> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>>>> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2:
>>>>> addJerseyResourcePackage:
>>>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>>>> pathSpec=/webhdfs/v1/*
>>>>> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty
>>>>> bound to port 50070
>>>>> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
>>>>> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$
>>>>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>>>> 2015-01-14 11:03:05,645 WARN
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>>>> lack of redundant storage directories!
>>>>> 2015-01-14 11:03:05,645 WARN
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>>>> loss due to lack of redundant storage directories!
>>>>> 2015-01-14 11:03:05,746 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>>>> 2015-01-14 11:03:05,761 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>>>> 2015-01-14 11:03:05,837 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>> dfs.block.invalidate.limit=1000
>>>>> 2015-01-14 11:03:05,837 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>>>> 2015-01-14 11:03:05,841 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>>>> 2015-01-14 11:03:05,843 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>>>> deletion will start around 2015 Jan 14 11:03:05
>>>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing
>>>>> capacity for map BlocksMap
>>>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type
>>>>>   = 64-bit
>>>>> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max
>>>>> memory 889 MB = 17.8 MB
>>>>> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity
>>>>>  = 2^21 = 2097152 entries
>>>>> 2015-01-14 11:03:05,864 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> dfs.block.access.token.enable=false
>>>>> 2015-01-14 11:03:05,865 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> defaultReplication         = 1
>>>>> 2015-01-14 11:03:05,865 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>>>             = 512
>>>>> 2015-01-14 11:03:05,865 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>>>             = 1
>>>>> 2015-01-14 11:03:05,865 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> maxReplicationStreams      = 2
>>>>> 2015-01-14 11:03:05,865 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> shouldCheckForEnoughRacks  = false
>>>>> 2015-01-14 11:03:05,865 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> replicationRecheckInterval = 3000
>>>>> 2015-01-14 11:03:05,865 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> encryptDataTransfer        = false
>>>>> 2015-01-14 11:03:05,865 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>>> maxNumBlocksToLog          = 1000
>>>>> 2015-01-14 11:03:05,874 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>>>> hadoop2 (auth:SIMPLE)
>>>>> 2015-01-14 11:03:05,874 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>>>> supergroup
>>>>> 2015-01-14 11:03:05,874 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>>>> true
>>>>> 2015-01-14 11:03:05,875 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>>>> 2015-01-14 11:03:05,878 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing
>>>>> capacity for map INodeMap
>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type
>>>>>   = 64-bit
>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max
>>>>> memory 889 MB = 8.9 MB
>>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity
>>>>>  = 2^20 = 1048576 entries
>>>>> 2015-01-14 11:03:06,284 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>>>> occuring more than 10 times
>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing
>>>>> capacity for map cachedBlocks
>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type
>>>>>   = 64-bit
>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max
>>>>> memory 889 MB = 2.2 MB
>>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity
>>>>>  = 2^18 = 262144 entries
>>>>> 2015-01-14 11:03:06,301 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>>>> 2015-01-14 11:03:06,301 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>> dfs.namenode.safemode.min.datanodes = 0
>>>>> 2015-01-14 11:03:06,301 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>>> dfs.namenode.safemode.extension     = 30000
>>>>> 2015-01-14 11:03:06,304 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>>>> namenode is enabled
>>>>> 2015-01-14 11:03:06,304 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing
>>>>> capacity for map NameNodeRetryCache
>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type
>>>>>   = 64-bit
>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet:
>>>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity
>>>>>  = 2^15 = 32768 entries
>>>>> 2015-01-14 11:03:06,317 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>>>> 2015-01-14 11:03:06,318 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>>>> 2015-01-14 11:03:06,318 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>>>> 16384
>>>>> 2015-01-14 11:03:06,368 INFO
>>>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>>>> 13312@bigdata
>>>>> 2015-01-14 11:03:06,532 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>>>> 2015-01-14 11:03:06,622 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>>>> file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095
>>>>> ->
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>> 2015-01-14 11:03:06,807 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>>>> INodes.
>>>>> 2015-01-14 11:03:06,888 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>>>> FSImage in 0 seconds.
>>>>> 2015-01-14 11:03:06,888 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>>>> 2015-01-14 11:03:06,889 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>>>> expecting start txid #84
>>>>> 2015-01-14 11:03:06,889 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>> 2015-01-14 11:03:06,893 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:03:06,897 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:03:06,897 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>>>> expecting start txid #86
>>>>> 2015-01-14 11:03:06,898 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>> 2015-01-14 11:03:06,898 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:03:06,898 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:03:06,899 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>>>> expecting start txid #88
>>>>> 2015-01-14 11:03:06,899 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>> 2015-01-14 11:03:06,899 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:03:06,899 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:03:06,900 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>>>> expecting start txid #90
>>>>> 2015-01-14 11:03:06,900 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>> 2015-01-14 11:03:06,900 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:03:06,901 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:03:06,901 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>>>> expecting start txid #92
>>>>> 2015-01-14 11:03:06,901 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>> 2015-01-14 11:03:06,901 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:03:06,902 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>>> of size 42 edits # 2 loaded in 0 seconds
>>>>> 2015-01-14 11:03:06,902 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b
>>>>> expecting start txid #94
>>>>> 2015-01-14 11:03:06,902 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>> 2015-01-14 11:03:06,902 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:03:06,907 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>> 2015-01-14 11:03:06,908 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2
>>>>> expecting start txid #95
>>>>> 2015-01-14 11:03:06,908 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>> 2015-01-14 11:03:06,908 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>>> stream
>>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095'
>>>>> to transaction ID 84
>>>>> 2015-01-14 11:03:07,266 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>>> 2015-01-14 11:03:07,274 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>>>> 2015-01-14 11:03:07,313 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
>>>>> 2015-01-14 11:03:07,558 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>>>> entries 0 lookups
>>>>> 2015-01-14 11:03:07,559 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>>>> FSImage in 1240 msecs
>>>>> 2015-01-14 11:03:08,011 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>>>> bigdata:9000
>>>>> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager:
>>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>>> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting
>>>>> Socket Reader #1 for port 9000
>>>>> 2015-01-14 11:03:08,151 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>>>> FSNamesystemState MBean
>>>>> 2015-01-14 11:03:08,173 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>> construction: 0
>>>>> 2015-01-14 11:03:08,173 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>>> construction: 0
>>>>> 2015-01-14 11:03:08,173 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>>>> replication queues
>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> STATE* Leaving safe mode after 2 secs
>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> STATE* Network topology has 0 racks and 0 datanodes
>>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> STATE* UnderReplicatedBlocks has 0 blocks
>>>>> 2015-01-14 11:03:08,194 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>>>> blocks            = 0
>>>>> 2015-01-14 11:03:08,194 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>> invalid blocks          = 0
>>>>> 2015-01-14 11:03:08,194 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>> under-replicated blocks = 0
>>>>> 2015-01-14 11:03:08,194 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>>  over-replicated blocks = 0
>>>>> 2015-01-14 11:03:08,194 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>> blocks being written    = 0
>>>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> STATE* Replication Queue initialization scan for invalid, over- and
>>>>> under-replicated blocks completed in 18 msec
>>>>> 2015-01-14 11:03:08,322 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>>>> bigdata/10.10.10.63:9000
>>>>> 2015-01-14 11:03:08,322 INFO
>>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>>>> required for active state
>>>>> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>> Responder: starting
>>>>> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>> listener on 9000: starting
>>>>> 2015-01-14 11:03:08,349 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>>>> 2015-01-14 11:03:08,349 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Rescanning after 4287712 milliseconds
>>>>> 2015-01-14 11:03:08,350 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange:
>>>>> BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63,
>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>> ipcPort=50020,
>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>>>> 2015-01-14 11:03:13,244 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>> failed storage changes from 0 to 0
>>>>> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology:
>>>>> Adding a new node: /default-rack/10.10.10.63:50010
>>>>> 2015-01-14 11:03:13,743 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>>> failed storage changes from 0 to 0
>>>>> 2015-01-14 11:03:13,750 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>>>> 10.10.10.63:50010
>>>>> 2015-01-14 11:03:13,959 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>>>> processReport: Received first block report from
>>>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>>>> starting up or becoming active. Its block contents are no longer considered
>>>>> stale
>>>>> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport:
>>>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>>>> DatanodeRegistration(10.10.10.63,
>>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>>> ipcPort=50020,
>>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>>>> blocks: 0, hasStaleStorages: false, processing time: 11 msecs
>>>>> 2015-01-14 11:03:38,349 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Rescanning after 30000 milliseconds
>>>>> 2015-01-14 11:03:38,350 INFO
>>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>>> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
>>>>>
>>>>>
>>>>>  Thanks
>>>>> Mahesh.S
>>>>>
>>>>>
>>>>> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>>  Hi Mahesh,
>>>>>>      We will need the NameNode logs to debug this further. Can you
>>>>>> restart the NameNode and post its logs somewhere for us to analyze?
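>>>>>>
>>>>>> For example, something along these lines should capture what we need
>>>>>> (assuming the tarball layout under /usr/local/hadoop from your steps):
>>>>>>
>>>>>>   # stop and start only the NameNode daemon
>>>>>>   $HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
>>>>>>   $HADOOP_HOME/sbin/hadoop-daemon.sh start namenode
>>>>>>
>>>>>>   # then share the fresh startup log (the file name includes the
>>>>>>   # unix user and the hostname)
>>>>>>   tail -n 500 $HADOOP_HOME/logs/hadoop-*-namenode-*.log
>>>>>>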
>>>>>> Thanks.
>>>>>>
>>>>>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <
>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>
>>>>>>> Hi Ramesh,
>>>>>>>
>>>>>>>                    I didn't see any exceptions in the HDFS logs. My
>>>>>>> problem is that the agent for HDFS is not created.
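>>>>>>>
>>>>>>> One way I can double-check that the plugin actually landed on the
>>>>>>> NameNode classpath (paths follow the install steps quoted below):
>>>>>>>
>>>>>>>   # jars the enable script should have copied into the hdfs lib dir
>>>>>>>   ls /usr/local/hadoop/share/hadoop/hdfs/lib/ranger-*.jar
>>>>>>>
>>>>>>>   # repository name the agent is expected to register under
>>>>>>>   grep REPOSITORY_NAME /usr/local/ranger-hdfs-plugin/install.properties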
>>>>>>>
>>>>>>>  Regards,
>>>>>>> Mahesh.S
>>>>>>>
>>>>>>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi Mahesh,
>>>>>>>>
>>>>>>>>  The error you are seeing is just a notice that the parent folder
>>>>>>>> of the resource you are creating doesn’t have read permission for the
>>>>>>>> user for whom you are creating the policy.
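>>>>>>>>
>>>>>>>>  For example, you can confirm the mode on the parent directory itself
>>>>>>>>  (hypothetical path below; substitute the parent of your resource):
>>>>>>>>
>>>>>>>>    # -d lists the directory entry itself rather than its contents
>>>>>>>>    hdfs dfs -ls -d /path/to/parent
>>>>>>>>
>>>>>>>>    # e.g. grant read to "other" if that is what the user falls under
>>>>>>>>    hdfs dfs -chmod o+r /path/to/parent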
>>>>>>>>
>>>>>>>>  When you start the HDFS NameNode and SecondaryNameNode, do you see
>>>>>>>> any exceptions in the HDFS logs?
>>>>>>>>
>>>>>>>>  Regards,
>>>>>>>> Ramesh
>>>>>>>>
>>>>>>>>   On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <
>>>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>>>
>>>>>>>>   Hi all,
>>>>>>>>
>>>>>>>>  I successfully configured Ranger admin and user sync. Now I am
>>>>>>>> trying to configure the HDFS plugin. My steps are the following:
>>>>>>>>
>>>>>>>>  1.Created repository testhdfs.
>>>>>>>> 2.cd /usr/local
>>>>>>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>>>>>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>>>>>>> 5.cd ranger-hdfs-plugin
>>>>>>>> 6.vi install.properties
>>>>>>>>  POLICY_MGR_URL=http://IP:6080
>>>>>>>>           REPOSITORY_NAME=testhdfs
>>>>>>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>>>>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>>>>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>>>>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>>>>>>> 7.cd /usr/local/hadoop
>>>>>>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>>>>>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>>>>>>> 10.cd /usr/local/ranger-hdfs-plugin
>>>>>>>> 11. ./enable-hdfs-plugin.sh
>>>>>>>> 12.cp /usr/local/hadoop/lib/*
>>>>>>>> /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>>>>>> 13.vi xasecure-audit.xml
>>>>>>>>  <property>
>>>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>>>>>>                    <value>jdbc:mysql://localhost/ranger</value>
>>>>>>>>                    </property>
>>>>>>>>                    <property>
>>>>>>>>
>>>>>>>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>>>>>>                    <value>rangerlogger</value>
>>>>>>>>                    </property>
>>>>>>>>                    <property>
>>>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>>>>>>                    <value>rangerlogger</value>
>>>>>>>>                    </property>
>>>>>>>> 14.Restarted hadoop
>>>>>>>> When I look at the Ranger Admin web interface -> Audit -> Agents,
>>>>>>>> the agent is not created. Have I missed any steps?
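>>>>>>>>
>>>>>>>>  (One more check worth running: that the audit DB settings above work
>>>>>>>>  from this host, e.g. with the stock mysql client and the same
>>>>>>>>  rangerlogger credentials:)
>>>>>>>>
>>>>>>>>  mysql -h localhost -u rangerlogger -prangerlogger ranger -e "select 1;"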
>>>>>>>>
>>>>>>>>  *NOTE: I am not using HDP.*
>>>>>>>>
>>>>>>>>  *here is my xa_portal.log*
>>>>>>>>
>>>>>>>>  2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_default.properties]
>>>>>>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_system.properties]
>>>>>>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_custom.properties]
>>>>>>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>> builtin-java classes where applicable
>>>>>>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>>>>>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>> user
>>>>>>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>>>>>>>> requestId=10.10.10.53
>>>>>>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>>>>>>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>>>>>>>> create:nfsnobody@bigdata
>>>>>>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_default.properties]
>>>>>>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_system.properties]
>>>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_custom.properties]
>>>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>> builtin-java classes where applicable
>>>>>>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>>>>>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>> user
>>>>>>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>>>>>>>> requestId=10.10.10.53
>>>>>>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>>>>>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>> user
>>>>>>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>>>>>>>> requestId=10.10.10.53
>>>>>>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>>>> Login: security not enabled, using username
>>>>>>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>>>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>>>> Login: security not enabled, using username
>>>>>>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>>>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>>>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>> at
>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>> at
>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>> at
>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>>>> at
>>>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>>>> at
>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>> at
>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>> at
>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>> at
>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>> at
>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>> at
>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>> at
>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at
>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>> at
>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>> at
>>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>> at
>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>> at
>>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>>>>>>> at
>>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>> at
>>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>> at
>>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>> at
>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>> at
>>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>> at
>>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>> at
>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>> at
>>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>> at
>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>> at
>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>> at
>>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>> at
>>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>> at
>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>> at
>>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>> at
>>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>> at
>>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>>>> Validation error:logMessage=null,
>>>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>>>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>>>> to save this policy?}
>>>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>>>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>>>> }
>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>> at
>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>> at
>>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>> at
>>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>>>> at
>>>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>>>> at
>>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>> at
>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>> at
>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>> at
>>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>> at
>>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>> at
>>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>> at
>>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at
>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>> at
>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>> at
>>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>>>>>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>> user
>>>>>>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>>>>>>>> requestId=10.10.10.53
>>>>>>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_default.properties]
>>>>>>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_system.properties]
>>>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_custom.properties]
>>>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [xa_ldap.properties]
>>>>>>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>> builtin-java classes where applicable
>>>>>>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>>> path resource [db_message_bundle.properties]
>>>>>>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>>>>>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>> user
>>>>>>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>>>>>>>> requestId=10.10.10.53
>>>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>>>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>>>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>>>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>>>> Validation error:logMessage=null,
>>>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50b statusCode={1}
>>>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>>>> to save this policy?}
>>>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783 name={OPER_NO_PERMISSION}
>>>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>>>> }
>>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>>> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>>> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
>>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>>> Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>>>>>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>>> user
>>>>>>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
>>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>>> loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
>>>>>>>> requestId=10.10.10.53
>>>>>>>>
>>>>>>>>  Thanks
>>>>>>>> Mahesh.S
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>>  --
>>>>>> Regards,
>>>>>> Gautam.
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>> --
>> Regards,
>> Gautam.
>>
>
>


-- 
Regards,
Gautam.

Re: Hdfs agent not created

Posted by Mahesh Sankaran <sa...@gmail.com>.
Hi Gautam and Hanish,

                    Thank you for the quick reply. The echo statements for
*HADOOP_NAMENODE_OPTS* and
*HADOOP_SECONDARYNAMENODE_OPTS* did not return any values.

[root@bigdata conf]# echo $HADOOP_SECONDARYNAMENODE_OPTS

[root@bigdata conf]# echo $HADOOP_NAMENODE_OPTS

[root@bigdata conf]#
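
(One note on the check above: those variables are only set inside a shell
that has actually sourced the plugin env script, so a plain interactive echo
can come back empty even when the setup is fine. A minimal sketch of a more
direct check, assuming set-hdfs-plugin-env.sh was placed under the hadoop
conf directory by enable-hdfs-plugin.sh -- the path below is an assumption:)

# source the script the way hadoop-env.sh would, then inspect the variables
. /usr/local/hadoop/conf/set-hdfs-plugin-env.sh   # path assumed
echo "NN opts:  ${HADOOP_NAMENODE_OPTS}"
echo "SNN opts: ${HADOOP_SECONDARYNAMENODE_OPTS}"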


Thanks
Mahesh.S

On Wed, Jan 14, 2015 at 4:15 PM, Gautam Borad <gb...@gmail.com> wrote:

> @Hanish/Ramesh, if we check the logs carefully, we can see that the ranger
> libs are being loaded into the classpath:
>
> /usr/local/hadoop/
>>
>> share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar
>>
>
> @Mahesh, I suspect some other problem. Can you add echo statements and
> debug the set-hdfs-plugin-env.sh script? Ideally, after the script is
> executed, *HADOOP_NAMENODE_OPTS* and *HADOOP_SECONDARYNAMENODE_OPTS*
> should contain the -javaagent line (see the sketch below).
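>
> For example, a minimal debug sketch (purely illustrative; the exact agent
> jar name and path come from the plugin install, not from this sketch):
>
> # temporary debug lines at the end of set-hdfs-plugin-env.sh
> echo "HADOOP_NAMENODE_OPTS=${HADOOP_NAMENODE_OPTS}" >&2
> echo "HADOOP_SECONDARYNAMENODE_OPTS=${HADOOP_SECONDARYNAMENODE_OPTS}" >&2
> # each should end up carrying a -javaagent:...jar entry for the ranger agent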
>
>
>
> On Wed, Jan 14, 2015 at 3:46 PM, Hanish Bansal <
> hanish.bansal@impetus.co.in> wrote:
>
>> Hi Mahesh,
>>
>>
>>  Could you try one thing: copy all the jar files from
>> ${hadoop_home}/lib to the hadoop share directory.
>>
>>
>>  $ cp <hadoop-home>/lib/* <hadoop-home>/share/hadoop/hdfs/lib/
>>
>>
>>  The issue may be that hadoop is not able to pick up the ranger jars from
>> the lib directory.
>>
>>
>>  After copying the jars, restart hadoop and check if the agent has
>> started; a quick verification sketch follows.
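>>
>>  For instance (paths taken from the earlier steps in this thread;
>>  stop-dfs.sh/start-dfs.sh are the stock Hadoop sbin scripts):
>>
>>  # confirm the ranger jars actually landed in the share directory
>>  ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -i ranger
>>
>>  # restart HDFS so the namenode picks them up
>>  /usr/local/hadoop/sbin/stop-dfs.sh
>>  /usr/local/hadoop/sbin/start-dfs.sh
>>
>>  # then re-check Ranger Admin -> Audit -> Agents for the new agent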
>>
>>
>>
>>      -------
>>
>> *Thanks & Regards, Hanish Bansal*
>> Software Engineer, iLabs
>> Impetus Infotech Pvt. Ltd.
>>
>>      ------------------------------
>> *From:* Mahesh Sankaran <sa...@gmail.com>
>> *Sent:* Wednesday, January 14, 2015 3:33 PM
>> *To:* user@ranger.incubator.apache.org
>> *Subject:* Re: Hdfs agent not created
>>
>>  Hi Ramesh,
>>                the ranger*.jar files are added to the classpath; I can see
>> them in the hadoop/lib directory. Can I know the meaning of the following
>> error?
>>
>>  2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR
>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>> RangerDaoManager.getEntityManager(loggingPU)
>>
>>  thanks
>>
>>  Mahesh.S
>>
>>
>> On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com>
>> wrote:
>>
>>> Hi Mahesh,
>>>
>>>   This exception is related to the datanode not coming up for some
>>> reason, but the Ranger plugins will be in the name node.
>>>
>>>  Do you see the namenode and secondarynamenode running after the ranger
>>> installation and after restarting them? (A quick check is sketched below.)
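>>>
>>>  For instance (jps ships with the JDK; the names below are the standard
>>>  HDFS daemon process names):
>>>
>>>  jps | egrep 'NameNode|SecondaryNameNode|DataNode'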
>>>
>>>  In the classpath of the namenode I don’t see any ranger*.jar. Do you
>>> have it in the hadoop/lib directory?
>>>
>>>  Also, can I get the details of xasecure-hdfs-security.xml from the
>>> conf directory?
>>>
>>>  Regards,
>>> Ramesh
>>>
>>>  On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <sa...@gmail.com>
>>> wrote:
>>>
>>>  Hi Gautam,
>>>
>>>                  Now I am seeing the following exception. Is this causing
>>> the problem?
>>>
>>>  2015-01-14 11:41:23,102 WARN
>>> org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
>>> java.io.EOFException: End of File Exception between local host is:
>>> "bigdata/10.10.10.63"; destination host is: "bigdata":9000; :
>>> java.io.EOFException; For more details see:
>>> http://wiki.apache.org/hadoop/EOFException
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>> at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>> at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
>>> at
>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
>>> at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
>>> at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
>>> at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
>>> at java.lang.Thread.run(Thread.java:744)
>>> Caused by: java.io.EOFException
>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>> at
>>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>> 2015-01-14 11:41:25,981 ERROR
>>> org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
>>> 2015-01-14 11:41:25,984 INFO
>>> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
>>> ************************************************************/
>>> 2015-01-14 11:42:03,054 INFO
>>> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
>>> /************************************************************
>>>
>>>  Thanks
>>> Mahesh.S
>>>
>>> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <
>>> sankarmahesh37@gmail.com> wrote:
>>>
>>>> Hi Gautam,
>>>>
>>>>                Here is my namenode log. Kindly see it.
>>>>
>>>>  /************************************************************
>>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>>> ************************************************************/
>>>> 2015-01-14 11:01:27,345 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>>> /************************************************************
>>>> STARTUP_MSG: Starting NameNode
>>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>>> STARTUP_MSG:   args = []
>>>> STARTUP_MSG:   version = 2.6.0
>>>> STARTUP_MSG:   classpath =
>>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4
.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/loc
al/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/l
ocal/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>>> STARTUP_MSG:   build =
>>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>>> 2014-11-13T21:10Z
>>>> STARTUP_MSG:   java = 1.7.0_45
>>>> ************************************************************/
>>>> 2015-01-14 11:01:27,363 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>>> handlers for [TERM, HUP, INT]
>>>> 2015-01-14 11:01:27,368 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>>> 2015-01-14 11:01:28,029 INFO
>>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>>> hadoop-metrics2.properties
>>>> 2015-01-14 11:01:28,205 INFO
>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>>> period at 10 second(s).
>>>> 2015-01-14 11:01:28,205 INFO
>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>>> started
>>>> 2015-01-14 11:01:28,209 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>>> hdfs://bigdata:9000
>>>> 2015-01-14 11:01:28,209 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>>> bigdata:9000 to access this namenode/service.
>>>> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader:
>>>> Unable to load native-hadoop library for your platform... using
>>>> builtin-java classes where applicable
>>>> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>>>> Web-server for hdfs at: http://0.0.0.0:50070
>>>> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to
>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>> org.mortbay.log.Slf4jLog
>>>> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog:
>>>> Http request log for http.requests.namenode is not defined
>>>> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> global filter 'safety'
>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context hdfs
>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context static
>>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context logs
>>>> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>>> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2:
>>>> addJerseyResourcePackage:
>>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>>> pathSpec=/webhdfs/v1/*
>>>> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty
>>>> bound to port 50070
>>>> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
>>>> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>>> 2015-01-14 11:01:30,093 WARN
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>>> lack of redundant storage directories!
>>>> 2015-01-14 11:01:30,093 WARN
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>>> loss due to lack of redundant storage directories!
>>>> 2015-01-14 11:01:30,184 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>>> 2015-01-14 11:01:30,196 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>>> 2015-01-14 11:01:30,262 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>> dfs.block.invalidate.limit=1000
>>>> 2015-01-14 11:01:30,262 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>>> 2015-01-14 11:01:30,266 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>>> 2015-01-14 11:01:30,268 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>>> deletion will start around 2015 Jan 14 11:01:30
>>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
>>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
>>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
>>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity = 2^21 = 2097152 entries
>>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
>>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
>>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
>>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
>>>> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
>>>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
>>>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
>>>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
>>>> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
>>>> 2015-01-14 11:01:30,298 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
>>>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
>>>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
>>>> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>>> 2015-01-14 11:01:30,302 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
>>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
>>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
>>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity = 2^20 = 1048576 entries
>>>> 2015-01-14 11:01:30,648 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
>>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity = 2^18 = 262144 entries
>>>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
>>>> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
>>>> 2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
>>>> 2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
>>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
>>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
>>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity = 2^15 = 32768 entries
>>>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>>> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
>>>> 2015-01-14 11:01:30,729 INFO
>>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>>> 11417@bigdata
>>>> 2015-01-14 11:01:30,963 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>>> 2015-01-14 11:01:31,065 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>>> file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094
>>>> ->
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>> 2015-01-14 11:01:31,210 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>>> INodes.
>>>> 2015-01-14 11:01:31,293 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>>> FSImage in 0 seconds.
>>>> 2015-01-14 11:01:31,293 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>>> 2015-01-14 11:01:31,294 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5
>>>> expecting start txid #84
>>>> 2015-01-14 11:01:31,294 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>> 2015-01-14 11:01:31,299 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>>> to transaction ID 84
>>>> 2015-01-14 11:01:31,303 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:01:31,303 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>>> expecting start txid #86
>>>> 2015-01-14 11:01:31,303 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>> 2015-01-14 11:01:31,303 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>>> to transaction ID 84
>>>> 2015-01-14 11:01:31,304 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:01:31,304 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>>> expecting start txid #88
>>>> 2015-01-14 11:01:31,304 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>> 2015-01-14 11:01:31,304 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>>> to transaction ID 84
>>>> 2015-01-14 11:01:31,305 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:01:31,305 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>>> expecting start txid #90
>>>> 2015-01-14 11:01:31,305 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>> 2015-01-14 11:01:31,306 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>>> to transaction ID 84
>>>> 2015-01-14 11:01:31,306 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:01:31,306 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>>> expecting start txid #92
>>>> 2015-01-14 11:01:31,306 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>> 2015-01-14 11:01:31,307 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>>> to transaction ID 84
>>>> 2015-01-14 11:01:31,307 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:01:31,307 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>>> expecting start txid #94
>>>> 2015-01-14 11:01:31,308 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>> 2015-01-14 11:01:31,308 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>>> to transaction ID 84
>>>> 2015-01-14 11:01:31,313 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>> 2015-01-14 11:01:31,317 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>>> 2015-01-14 11:01:31,346 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
>>>> 2015-01-14 11:01:31,904 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>>> entries 0 lookups
>>>> 2015-01-14 11:01:31,904 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>>> FSImage in 1216 msecs
>>>> 2015-01-14 11:01:32,427 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>>> bigdata:9000
>>>> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager:
>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting
>>>> Socket Reader #1 for port 9000
>>>> 2015-01-14 11:01:32,568 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>>> FSNamesystemState MBean
>>>> 2015-01-14 11:01:32,588 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>> construction: 0
>>>> 2015-01-14 11:01:32,588 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>> construction: 0
>>>> 2015-01-14 11:01:32,588 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>>> replication queues
>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>>> Leaving safe mode after 2 secs
>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>>> Network topology has 0 racks and 0 datanodes
>>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>>> UnderReplicatedBlocks has 0 blocks
>>>> 2015-01-14 11:01:32,645 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>>> blocks            = 0
>>>> 2015-01-14 11:01:32,645 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>> invalid blocks          = 0
>>>> 2015-01-14 11:01:32,645 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>> under-replicated blocks = 0
>>>> 2015-01-14 11:01:32,645 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>  over-replicated blocks = 0
>>>> 2015-01-14 11:01:32,645 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>> blocks being written    = 0
>>>> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>>> Replication Queue initialization scan for invalid, over- and
>>>> under-replicated blocks completed in 52 msec
>>>> 2015-01-14 11:01:32,676 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>>> bigdata/10.10.10.63:9000
>>>> 2015-01-14 11:01:32,676 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>>> required for active state
>>>> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>> Responder: starting
>>>> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>> listener on 9000: starting
>>>> 2015-01-14 11:01:32,697 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>>> 2015-01-14 11:01:32,697 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Rescanning after 4192060 milliseconds
>>>> 2015-01-14 11:01:32,704 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
>>>> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
>>>> registerDatanode: from DatanodeRegistration(10.10.10.63,
>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>>> 2015-01-14 11:01:38,039 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>> failed storage changes from 0 to 0
>>>> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology:
>>>> Adding a new node: /default-rack/10.10.10.63:50010
>>>> 2015-01-14 11:01:38,557 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>> failed storage changes from 0 to 0
>>>> 2015-01-14 11:01:38,562 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>>> 10.10.10.63:50010
>>>> 2015-01-14 11:01:38,692 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>>> processReport: Received first block report from
>>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>>> starting up or becoming active. Its block contents are no longer considered
>>>> stale
>>>> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport:
>>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>>> DatanodeRegistration(10.10.10.63,
>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>>> blocks: 0, hasStaleStorages: false, processing time: 9 msecs
>>>> 2015-01-14 11:02:02,697 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Rescanning after 30000 milliseconds
>>>> 2015-01-14 11:02:02,698 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>> 2015-01-14 11:02:21,288 ERROR
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
>>>> 2015-01-14 11:02:21,291 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
>>>> /************************************************************
>>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>>> ************************************************************/
>>>> 2015-01-14 11:03:02,845 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>>> /************************************************************
>>>> STARTUP_MSG: Starting NameNode
>>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>>> STARTUP_MSG:   args = []
>>>> STARTUP_MSG:   version = 2.6.0
>>>> STARTUP_MSG:   classpath =
>>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4
.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/loc
al/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/l
ocal/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>>> STARTUP_MSG:   build =
>>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>>> 2014-11-13T21:10Z
>>>> STARTUP_MSG:   java = 1.7.0_45
>>>> ************************************************************/
>>>> 2015-01-14 11:03:02,861 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>>> handlers for [TERM, HUP, INT]
>>>> 2015-01-14 11:03:02,866 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>>> 2015-01-14 11:03:03,521 INFO
>>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>>> hadoop-metrics2.properties
>>>> 2015-01-14 11:03:03,697 INFO
>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>>> period at 10 second(s).
>>>> 2015-01-14 11:03:03,697 INFO
>>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>>> started
>>>> 2015-01-14 11:03:03,700 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>>> hdfs://bigdata:9000
>>>> 2015-01-14 11:03:03,701 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>>> bigdata:9000 to access this namenode/service.
>>>> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader:
>>>> Unable to load native-hadoop library for your platform... using
>>>> builtin-java classes where applicable
>>>> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>>>> Web-server for hdfs at: http://0.0.0.0:50070
>>>> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to
>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>> org.mortbay.log.Slf4jLog
>>>> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog:
>>>> Http request log for http.requests.namenode is not defined
>>>> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> global filter 'safety'
>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context hdfs
>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context logs
>>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context static
>>>> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added
>>>> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>>> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2:
>>>> addJerseyResourcePackage:
>>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>>> pathSpec=/webhdfs/v1/*
>>>> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty
>>>> bound to port 50070
>>>> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
>>>> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$
>>>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>>> 2015-01-14 11:03:05,645 WARN
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>>> lack of redundant storage directories!
>>>> 2015-01-14 11:03:05,645 WARN
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>>> loss due to lack of redundant storage directories!
>>>> 2015-01-14 11:03:05,746 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>>> 2015-01-14 11:03:05,761 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>>> 2015-01-14 11:03:05,837 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>> dfs.block.invalidate.limit=1000
>>>> 2015-01-14 11:03:05,837 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>>> 2015-01-14 11:03:05,841 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>>> 2015-01-14 11:03:05,843 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>>> deletion will start around 2015 Jan 14 11:03:05
>>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing
>>>> capacity for map BlocksMap
>>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type
>>>> = 64-bit
>>>> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max
>>>> memory 889 MB = 17.8 MB
>>>> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity
>>>>  = 2^21 = 2097152 entries
>>>> 2015-01-14 11:03:05,864 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> dfs.block.access.token.enable=false
>>>> 2015-01-14 11:03:05,865 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> defaultReplication         = 1
>>>> 2015-01-14 11:03:05,865 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>>             = 512
>>>> 2015-01-14 11:03:05,865 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>>             = 1
>>>> 2015-01-14 11:03:05,865 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> maxReplicationStreams      = 2
>>>> 2015-01-14 11:03:05,865 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> shouldCheckForEnoughRacks  = false
>>>> 2015-01-14 11:03:05,865 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> replicationRecheckInterval = 3000
>>>> 2015-01-14 11:03:05,865 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> encryptDataTransfer        = false
>>>> 2015-01-14 11:03:05,865 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>>> maxNumBlocksToLog          = 1000
>>>> 2015-01-14 11:03:05,874 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>>> hadoop2 (auth:SIMPLE)
>>>> 2015-01-14 11:03:05,874 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>>> supergroup
>>>> 2015-01-14 11:03:05,874 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>>> true
>>>> 2015-01-14 11:03:05,875 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>>> 2015-01-14 11:03:05,878 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing
>>>> capacity for map INodeMap
>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type
>>>> = 64-bit
>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max
>>>> memory 889 MB = 8.9 MB
>>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity
>>>>  = 2^20 = 1048576 entries
>>>> 2015-01-14 11:03:06,284 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>>> occuring more than 10 times
>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing
>>>> capacity for map cachedBlocks
>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type
>>>> = 64-bit
>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max
>>>> memory 889 MB = 2.2 MB
>>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity
>>>>  = 2^18 = 262144 entries
>>>> 2015-01-14 11:03:06,301 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>>> 2015-01-14 11:03:06,301 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>> dfs.namenode.safemode.min.datanodes = 0
>>>> 2015-01-14 11:03:06,301 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>>> dfs.namenode.safemode.extension     = 30000
>>>> 2015-01-14 11:03:06,304 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>>> namenode is enabled
>>>> 2015-01-14 11:03:06,304 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing
>>>> capacity for map NameNodeRetryCache
>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type
>>>> = 64-bit
>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet:
>>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity
>>>>  = 2^15 = 32768 entries
>>>> 2015-01-14 11:03:06,317 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>>> 2015-01-14 11:03:06,318 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>>> 2015-01-14 11:03:06,318 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>>> 16384
>>>> 2015-01-14 11:03:06,368 INFO
>>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>>> 13312@bigdata
>>>> 2015-01-14 11:03:06,532 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>>> 2015-01-14 11:03:06,622 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>>> file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095
>>>> ->
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>> 2015-01-14 11:03:06,807 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>>> INodes.
>>>> 2015-01-14 11:03:06,888 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>>> FSImage in 0 seconds.
>>>> 2015-01-14 11:03:06,888 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>>> 2015-01-14 11:03:06,889 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>>> expecting start txid #84
>>>> 2015-01-14 11:03:06,889 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>> 2015-01-14 11:03:06,893 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>>> to transaction ID 84
>>>> 2015-01-14 11:03:06,897 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:03:06,897 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>>> expecting start txid #86
>>>> 2015-01-14 11:03:06,898 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>> 2015-01-14 11:03:06,898 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>>> to transaction ID 84
>>>> 2015-01-14 11:03:06,898 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:03:06,899 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>>> expecting start txid #88
>>>> 2015-01-14 11:03:06,899 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>> 2015-01-14 11:03:06,899 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>>> to transaction ID 84
>>>> 2015-01-14 11:03:06,899 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:03:06,900 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>>> expecting start txid #90
>>>> 2015-01-14 11:03:06,900 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>> 2015-01-14 11:03:06,900 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>>> to transaction ID 84
>>>> 2015-01-14 11:03:06,901 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:03:06,901 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>>> expecting start txid #92
>>>> 2015-01-14 11:03:06,901 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>> 2015-01-14 11:03:06,901 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>>> to transaction ID 84
>>>> 2015-01-14 11:03:06,902 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>>> of size 42 edits # 2 loaded in 0 seconds
>>>> 2015-01-14 11:03:06,902 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b
>>>> expecting start txid #94
>>>> 2015-01-14 11:03:06,902 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>> 2015-01-14 11:03:06,902 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>>> to transaction ID 84
>>>> 2015-01-14 11:03:06,907 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>> 2015-01-14 11:03:06,908 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2
>>>> expecting start txid #95
>>>> 2015-01-14 11:03:06,908 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>> 2015-01-14 11:03:06,908 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>>> stream
>>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095'
>>>> to transaction ID 84
>>>> 2015-01-14 11:03:07,266 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>>> of size 1048576 edits # 1 loaded in 0 seconds
>>>> 2015-01-14 11:03:07,274 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>>> 2015-01-14 11:03:07,313 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
>>>> 2015-01-14 11:03:07,558 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>>> entries 0 lookups
>>>> 2015-01-14 11:03:07,559 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>>> FSImage in 1240 msecs
>>>> 2015-01-14 11:03:08,011 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>>> bigdata:9000
>>>> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager:
>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting
>>>> Socket Reader #1 for port 9000
>>>> 2015-01-14 11:03:08,151 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>>> FSNamesystemState MBean
>>>> 2015-01-14 11:03:08,173 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>> construction: 0
>>>> 2015-01-14 11:03:08,173 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>>> construction: 0
>>>> 2015-01-14 11:03:08,173 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>>> replication queues
>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>>> Leaving safe mode after 2 secs
>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>>> Network topology has 0 racks and 0 datanodes
>>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>>> UnderReplicatedBlocks has 0 blocks
>>>> 2015-01-14 11:03:08,194 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>>> blocks            = 0
>>>> 2015-01-14 11:03:08,194 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>> invalid blocks          = 0
>>>> 2015-01-14 11:03:08,194 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>> under-replicated blocks = 0
>>>> 2015-01-14 11:03:08,194 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>>  over-replicated blocks = 0
>>>> 2015-01-14 11:03:08,194 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>> blocks being written    = 0
>>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>>> Replication Queue initialization scan for invalid, over- and
>>>> under-replicated blocks completed in 18 msec
>>>> 2015-01-14 11:03:08,322 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>>> bigdata/10.10.10.63:9000
>>>> 2015-01-14 11:03:08,322 INFO
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>>> required for active state
>>>> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>> Responder: starting
>>>> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>> listener on 9000: starting
>>>> 2015-01-14 11:03:08,349 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>>> 2015-01-14 11:03:08,349 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Rescanning after 4287712 milliseconds
>>>> 2015-01-14 11:03:08,350 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
>>>> registerDatanode: from DatanodeRegistration(10.10.10.63,
>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>>> 2015-01-14 11:03:13,244 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>> failed storage changes from 0 to 0
>>>> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology:
>>>> Adding a new node: /default-rack/10.10.10.63:50010
>>>> 2015-01-14 11:03:13,743 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>>> failed storage changes from 0 to 0
>>>> 2015-01-14 11:03:13,750 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>>> 10.10.10.63:50010
>>>> 2015-01-14 11:03:13,959 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>>> processReport: Received first block report from
>>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>>> starting up or becoming active. Its block contents are no longer considered
>>>> stale
>>>> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport:
>>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>>> DatanodeRegistration(10.10.10.63,
>>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>>> blocks: 0, hasStaleStorages: false, processing time: 11 msecs
>>>> 2015-01-14 11:03:38,349 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Rescanning after 30000 milliseconds
>>>> 2015-01-14 11:03:38,350 INFO
>>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>>> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
>>>>
>>>>
>>>>  Thanks
>>>> Mahesh.S
>>>>
>>>>
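Worth noting from the startup log above: the STARTUP_MSG classpath does include the plugin jars (ranger-hdfs-plugin-0.4.0.jar, ranger-plugins-audit-0.4.0.jar, and the rest under share/hadoop/hdfs/lib), yet neither startup prints a single ranger/xasecure initialization or audit line, which would be consistent with the agent never initializing. A quick check is to grep the NameNode log directly; a minimal sketch, assuming the default log naming under /usr/local/hadoop/logs for the hadoop2 user on host bigdata shown in the log:

   # count lines mentioning the plugin; 0 means nothing Ranger-related logged at startup
   grep -icE 'ranger|xasecure' /usr/local/hadoop/logs/hadoop-hadoop2-namenode-bigdata.log
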
>>>> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com>
>>>> wrote:
>>>>
>>>>>  Hi Mahesh,
>>>>>      We will need the NameNode logs to debug this further. Can you
>>>>> restart the NameNode and paste its logs somewhere for us to analyze?
>>>>> Thanks.
>>>>>
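For a single-node setup like this one, restarting only the NameNode is enough to regenerate the startup log; a minimal sketch, assuming the stock sbin scripts and the default log directory under $HADOOP_HOME/logs:

   $HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
   $HADOOP_HOME/sbin/hadoop-daemon.sh start namenode
   # startup messages land in hadoop-<user>-namenode-<host>.log by default
   tail -n 200 $HADOOP_HOME/logs/hadoop-*-namenode-*.log
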
>>>>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <
>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>
>>>>>> Hi Ramesh,
>>>>>>
>>>>>>                    I didn't see any exceptions in the HDFS logs. My
>>>>>> problem is that the agent for HDFS is not created.
>>>>>>
>>>>>>  Regards,
>>>>>> Mahesh.S
>>>>>>
>>>>>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi Mahesh,
>>>>>>>
>>>>>>>  The error you are seeing is just a notice that the parent folder
>>>>>>> of the resource you are creating doesn't have read permission for the user
>>>>>>> for whom you are creating the policy.
>>>>>>>
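That notice can be cross-checked from the shell before saving the policy; a minimal sketch, where /data/sales stands in for whatever resource path the policy actually uses (a hypothetical example):

   hdfs dfs -ls /data          # read bits and owner of the parent folder
   hdfs dfs -ls /data/sales    # the resource named in the policy
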
>>>>>>>  When you start the HDFS NameNode and SecondaryNameNode, do you see any
>>>>>>> exceptions in the HDFS logs?
>>>>>>>
>>>>>>>  Regards,
>>>>>>> Ramesh
>>>>>>>
>>>>>>>   On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <
>>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>>
>>>>>>>   Hi all,
>>>>>>>
>>>>>>>  I successfully configured ranger admin,user sync.now am trying to
>>>>>>> configure hdfs plugin.My steps are following,
>>>>>>>
>>>>>>>  1.Created repository testhdfs.
>>>>>>> 2.cd /usr/local
>>>>>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>>>>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>>>>>> 5.cd ranger-hdfs-plugin
>>>>>>> 6.vi install.properties
>>>>>>>  POLICY_MGR_URL=http://IP:6080 <http://ip:6080/>
>>>>>>>           REPOSITORY_NAME=testhdfs
>>>>>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>>>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>>>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>>>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>>>>>> 7.cd /usr/local/hadoop
>>>>>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>>>>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>>>>>> 10.cd /usr/local/ranger-hdfs-plugin
>>>>>>> 11../enable-hdfs-plugin.sh
>>>>>>> 12.cp /usr/local/hadoop/lib/*
>>>>>>> /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>>>>> 13.vi xasecure-audit.xml
>>>>>>>  <property>
>>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>>>>>                    <value>jdbc:mysql://localhost/ranger</value>
>>>>>>>                    </property>
>>>>>>>                    <property>
>>>>>>>
>>>>>>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>>>>>                    <value>rangerlogger</value>
>>>>>>>                    </property>
>>>>>>>                    <property>
>>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>>>>>                    <value>rangerlogger</value>
>>>>>>>                    </property>
>>>>>>> 14.Restarted hadoop
>>>>>>> when i see Ranger Admin Web interface -> Audit -> Agents
>>>>>>> agent is not created.Am i missed any steps.
>>>>>>>
>>>>>>>  *NOTE:I am not using HDP.*
>>>>>>>
>>>>>>>  *here is my xa_portal.log*
>>>>>>>
>>>>>>>  2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_default.properties]
>>>>>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_system.properties]
>>>>>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_custom.properties]
>>>>>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_ldap.properties]
>>>>>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>> builtin-java classes where applicable
>>>>>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [db_message_bundle.properties]
>>>>>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>>>>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>> user
>>>>>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>>>>>>> requestId=10.10.10.53
>>>>>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>>>>>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>>>>>>> create:nfsnobody@bigdata
>>>>>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_default.properties]
>>>>>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_system.properties]
>>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_custom.properties]
>>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_ldap.properties]
>>>>>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>> builtin-java classes where applicable
>>>>>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [db_message_bundle.properties]
>>>>>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>>>>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>> user
>>>>>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>>>>>>> requestId=10.10.10.53
>>>>>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>>>>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>> user
>>>>>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>>>>>>> requestId=10.10.10.53
>>>>>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>>> Login: security not enabled, using username
>>>>>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>>> Login: security not enabled, using username
>>>>>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>> at
>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>> at
>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>> at
>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>> at
>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>>>>>> at
>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>> at
>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>> at
>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>> at
>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>> at
>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>> at
>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>> at
>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>> at
>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>> at
>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>> at
>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>>> Validation error:logMessage=null,
>>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>>> to save this policy?}
>>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>>> }
>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>> at
>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>> at
>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>> at
>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>> at
>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>> at
>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>> at
>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>> at
>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>> at
>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>> at
>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>> at
>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>> at
>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>> at
>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>> at
>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>>>>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>> user
>>>>>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>>>>>>> requestId=10.10.10.53
>>>>>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_default.properties]
>>>>>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_system.properties]
>>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_custom.properties]
>>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [xa_ldap.properties]
>>>>>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>> builtin-java classes where applicable
>>>>>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>>> path resource [db_message_bundle.properties]
>>>>>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>>>>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>> user
>>>>>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>>>>>>> requestId=10.10.10.53
>>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>> at
>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>> at
>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>> at
>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>> at
>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>> at
>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>> at
>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>> at
>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>> at
>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>> at
>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>> at
>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>> at
>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>> at
>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>> at
>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>>> Validation error:logMessage=null,
>>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
>>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>>> to save this policy?}
>>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
>>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>>> }
>>>>>>> javax.ws.rs.WebApplicationException
>>>>>>> at
>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>>> at
>>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>>> at
>>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>>> at
>>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>>> at
>>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>>> at
>>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>> at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>> at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>>> at
>>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>>> at
>>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>>> at
>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>>> at
>>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>>> at
>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>>> at
>>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>>> at
>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>>> at
>>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>>> at
>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>>> at
>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>>> at
>>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>>> at
>>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at
>>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
>>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>>> Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>>>>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>>> user
>>>>>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
>>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>>> loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
>>>>>>> requestId=10.10.10.53
>>>>>>>
>>>>>>>  Thanks
>>>>>>> Mahesh.S
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>>  --
>>>>> Regards,
>>>>> Gautam.
>>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>>
>>
>
>
>
> --
> Regards,
> Gautam.
>

Re: Hdfs agent not created

Posted by Gautam Borad <gb...@gmail.com>.
@Hanish/Ramesh, if we check the logs carefully, we can see that the Ranger
libs are getting loaded on the classpath:

> /usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar

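(Mahesh, if you want to double-check this on your side, one quick way — a
sketch, assuming the default hadoop-<user>-namenode-<host>.log naming under
the Hadoop logs directory — is to grep the NameNode startup log:

  $ grep -o 'ranger-plugins-[a-z]*-0.4.0.jar' /usr/local/hadoop/logs/hadoop-*-namenode-*.log | sort -u
)
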
@Mahesh, I suspect some other problem. Can you add echo statements to the
set-hdfs-plugin-env.sh script and debug it? After the script has executed,
*HADOOP_NAMENODE_OPTS* and *HADOOP_SECONDARYNAMENODE_OPTS* should ideally
contain the -javaagent line.
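For example, a minimal sketch (the conf path below is an assumption based on
the layout used in this setup; adjust to wherever the enable script placed
set-hdfs-plugin-env.sh):

  # Source the env script the way the NameNode startup would, then inspect
  # the variables it is supposed to populate.
  $ . /usr/local/hadoop/conf/set-hdfs-plugin-env.sh
  $ echo "HADOOP_NAMENODE_OPTS=${HADOOP_NAMENODE_OPTS}"
  $ echo "HADOOP_SECONDARYNAMENODE_OPTS=${HADOOP_SECONDARYNAMENODE_OPTS}"

  # On a NameNode that is already running, check the live process args:
  $ ps -ef | grep NameNode | grep -o -- '-javaagent[^ ]*'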



On Wed, Jan 14, 2015 at 3:46 PM, Hanish Bansal <ha...@impetus.co.in>
wrote:

>     Hi Mahesh,
>
>
>  Could you try one thing: copy all the jar files from
> ${hadoop_home}/lib to the Hadoop share directory.
>
>
>  $ cp <hadoop-home>/lib/* <hadoop-home>/share/hadoop/hdfs/lib/
>
>
>  The issue may be that Hadoop is not able to pick up the Ranger jars from
> the lib directory.
>
>
>  After copying the jars, restart Hadoop and check whether the agent is
> started.
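>
>  A minimal sketch of those steps (assuming the standard stop/start scripts
> under the Hadoop sbin directory and the paths used in this setup):
>
>  $ cp /usr/local/hadoop/lib/*.jar /usr/local/hadoop/share/hadoop/hdfs/lib/
>  $ /usr/local/hadoop/sbin/stop-dfs.sh
>  $ /usr/local/hadoop/sbin/start-dfs.sh
>  $ ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -i ranger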
>
>
>
>      -------
>
> *Thanks & Regards, Hanish Bansal*
> Software Engineer, iLabs
> Impetus Infotech Pvt. Ltd.
>
>      ------------------------------
> *From:* Mahesh Sankaran <sa...@gmail.com>
> *Sent:* Wednesday, January 14, 2015 3:33 PM
> *To:* user@ranger.incubator.apache.org
> *Subject:* Re: Hdfs agent not created
>
>  Hi Ramesh,
>                ranger*.jar is added to the classpath; I can see it in the
> hadoop/lib directory. Can I know the meaning of the following error?
>
>  2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR
> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
> RangerDaoManager.getEntityManager(loggingPU)
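>
>  (One thing I can check in the meantime — a hedged sketch, with
> placeholders for whatever user/host/database the audit store is actually
> configured with — is whether this host can reach the audit database at all:
>
>  $ mysql -u <audit_db_user> -p -h <audit_db_host> <audit_db_name> -e 'SHOW TABLES;'
>  )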
>
>  thanks
>
>  Mahesh.S
>
>
> On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com>
> wrote:
>
>> Hi Mahesh,
>>
>>   This exception is related to the datanode not coming up for some reason,
>> but the Ranger plugins will be in the namenode.
>>
>>  Do you see the namenode and secondarynamenode running after the Ranger
>> installation and after restarting the namenode and secondarynamenode?
>>
>>  In the classpath of the namenode I don't see any ranger*.jar. Do you
>> have it in the hadoop/lib directory?
>>
>>  Also, can I get the details of xasecure-hdfs-security.xml from the conf
>> directory?
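>>
>>  For example (a quick sketch; paths assumed from the setup in this thread):
>>
>>  $ jps | grep -E 'NameNode|SecondaryNameNode'
>>  $ ls /usr/local/hadoop/lib | grep -i ranger
>>  $ cat /usr/local/hadoop/conf/xasecure-hdfs-security.xml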
>>
>>  Regards,
>> Ramesh
>>
>>  On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <sa...@gmail.com>
>> wrote:
>>
>>  Hi Gautam,
>>
>>                  Now I am seeing the following exception. Is this causing
>> the problem?
>>
>>  2015-01-14 11:41:23,102 WARN
>> org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
>> java.io.EOFException: End of File Exception between local host is:
>> "bigdata/10.10.10.63"; destination host is: "bigdata":9000; :
>> java.io.EOFException; For more details see:
>> http://wiki.apache.org/hadoop/EOFException
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>> at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>> at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
>> at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
>> at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
>> at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
>> at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
>> at java.lang.Thread.run(Thread.java:744)
>> Caused by: java.io.EOFException
>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>> at
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2015-01-14 11:41:25,981 ERROR
>> org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
>> 2015-01-14 11:41:25,984 INFO
>> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
>> ************************************************************/
>> 2015-01-14 11:42:03,054 INFO
>> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
>> /************************************************************
>>
>>  Thanks
>> Mahesh.S
>>
>> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <
>> sankarmahesh37@gmail.com> wrote:
>>
>>> Hi Gautam,
>>>
>>>                Here is my namenode log. Kindly see it.
>>>
>>>  /************************************************************
>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>> ************************************************************/
>>> 2015-01-14 11:01:27,345 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting NameNode
>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 2.6.0
>>> STARTUP_MSG:   classpath =
>>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.
jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/loca
l/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lo
cal/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>>> STARTUP_MSG:   build =
>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>> 2014-11-13T21:10Z
>>> STARTUP_MSG:   java = 1.7.0_45
>>> ************************************************************/
>>> 2015-01-14 11:01:27,363 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>> handlers for [TERM, HUP, INT]
>>> 2015-01-14 11:01:27,368 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>> 2015-01-14 11:01:28,029 INFO
>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>> hadoop-metrics2.properties
>>> 2015-01-14 11:01:28,205 INFO
>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>> period at 10 second(s).
>>> 2015-01-14 11:01:28,205 INFO
>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>> started
>>> 2015-01-14 11:01:28,209 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>> hdfs://bigdata:9000
>>> 2015-01-14 11:01:28,209 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>> bigdata:9000 to access this namenode/service.
>>> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader:
>>> Unable to load native-hadoop library for your platform... using
>>> builtin-java classes where applicable
>>> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>>> Web-server for hdfs at: http://0.0.0.0:50070
>>> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to
>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>> org.mortbay.log.Slf4jLog
>>> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog: Http
>>> request log for http.requests.namenode is not defined
>>> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added
>>> global filter 'safety'
>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>> filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context hdfs
>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>> filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context static
>>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>>> filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context logs
>>> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added
>>> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2:
>>> addJerseyResourcePackage:
>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>> pathSpec=/webhdfs/v1/*
>>> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty
>>> bound to port 50070
>>> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
>>> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>> 2015-01-14 11:01:30,093 WARN
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>> lack of redundant storage directories!
>>> 2015-01-14 11:01:30,093 WARN
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>> loss due to lack of redundant storage directories!
>>> 2015-01-14 11:01:30,184 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>> 2015-01-14 11:01:30,196 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>> 2015-01-14 11:01:30,262 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>> dfs.block.invalidate.limit=1000
>>> 2015-01-14 11:01:30,262 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>> 2015-01-14 11:01:30,266 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>> 2015-01-14 11:01:30,268 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>> deletion will start around 2015 Jan 14 11:01:30
>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing
>>> capacity for map BlocksMap
>>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type
>>> = 64-bit
>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max
>>> memory 889 MB = 17.8 MB
>>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity
>>>  = 2^21 = 2097152 entries
>>> 2015-01-14 11:01:30,289 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> dfs.block.access.token.enable=false
>>> 2015-01-14 11:01:30,289 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> defaultReplication         = 1
>>> 2015-01-14 11:01:30,289 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>             = 512
>>> 2015-01-14 11:01:30,289 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>             = 1
>>> 2015-01-14 11:01:30,289 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> maxReplicationStreams      = 2
>>> 2015-01-14 11:01:30,290 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> shouldCheckForEnoughRacks  = false
>>> 2015-01-14 11:01:30,290 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> replicationRecheckInterval = 3000
>>> 2015-01-14 11:01:30,290 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> encryptDataTransfer        = false
>>> 2015-01-14 11:01:30,290 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> maxNumBlocksToLog          = 1000
>>> 2015-01-14 11:01:30,298 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>> hadoop2 (auth:SIMPLE)
>>> 2015-01-14 11:01:30,299 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>> supergroup
>>> 2015-01-14 11:01:30,299 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>> true
>>> 2015-01-14 11:01:30,299 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>> 2015-01-14 11:01:30,302 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing
>>> capacity for map INodeMap
>>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type
>>> = 64-bit
>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max
>>> memory 889 MB = 8.9 MB
>>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity
>>>  = 2^20 = 1048576 entries
>>> 2015-01-14 11:01:30,648 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>> occuring more than 10 times
>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing
>>> capacity for map cachedBlocks
>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type
>>> = 64-bit
>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max
>>> memory 889 MB = 2.2 MB
>>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity
>>>  = 2^18 = 262144 entries
>>> 2015-01-14 11:01:30,669 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>> 2015-01-14 11:01:30,669 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>> dfs.namenode.safemode.min.datanodes = 0
>>> 2015-01-14 11:01:30,669 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>> dfs.namenode.safemode.extension     = 30000
>>> 2015-01-14 11:01:30,674 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>> namenode is enabled
>>> 2015-01-14 11:01:30,674 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing
>>> capacity for map NameNodeRetryCache
>>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type
>>> = 64-bit
>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet:
>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity
>>>  = 2^15 = 32768 entries
>>> 2015-01-14 11:01:30,687 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>> 2015-01-14 11:01:30,687 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>> 2015-01-14 11:01:30,687 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>> 16384
>>> 2015-01-14 11:01:30,729 INFO
>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>> 11417@bigdata
>>> 2015-01-14 11:01:30,963 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>> 2015-01-14 11:01:31,065 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>> file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094
>>> ->
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>> 2015-01-14 11:01:31,210 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>> INodes.
>>> 2015-01-14 11:01:31,293 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>> FSImage in 0 seconds.
>>> 2015-01-14 11:01:31,293 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>> 2015-01-14 11:01:31,294 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5
>>> expecting start txid #84
>>> 2015-01-14 11:01:31,294 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>> 2015-01-14 11:01:31,299 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>> to transaction ID 84
>>> 2015-01-14 11:01:31,303 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,303 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>> expecting start txid #86
>>> 2015-01-14 11:01:31,303 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>> 2015-01-14 11:01:31,303 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>> to transaction ID 84
>>> 2015-01-14 11:01:31,304 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,304 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>> expecting start txid #88
>>> 2015-01-14 11:01:31,304 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>> 2015-01-14 11:01:31,304 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>> to transaction ID 84
>>> 2015-01-14 11:01:31,305 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,305 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>> expecting start txid #90
>>> 2015-01-14 11:01:31,305 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>> 2015-01-14 11:01:31,306 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>> to transaction ID 84
>>> 2015-01-14 11:01:31,306 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,306 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>> expecting start txid #92
>>> 2015-01-14 11:01:31,306 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>> 2015-01-14 11:01:31,307 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>> to transaction ID 84
>>> 2015-01-14 11:01:31,307 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:01:31,307 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>> expecting start txid #94
>>> 2015-01-14 11:01:31,308 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>> 2015-01-14 11:01:31,308 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>> to transaction ID 84
>>> 2015-01-14 11:01:31,313 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>> of size 1048576 edits # 1 loaded in 0 seconds
>>> 2015-01-14 11:01:31,317 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>> 2015-01-14 11:01:31,346 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
>>> 2015-01-14 11:01:31,904 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>> entries 0 lookups
>>> 2015-01-14 11:01:31,904 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>> FSImage in 1216 msecs
>>> 2015-01-14 11:01:32,427 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>> bigdata:9000
>>> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager:
>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting
>>> Socket Reader #1 for port 9000
>>> 2015-01-14 11:01:32,568 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>> FSNamesystemState MBean
>>> 2015-01-14 11:01:32,588 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>> construction: 0
>>> 2015-01-14 11:01:32,588 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>> construction: 0
>>> 2015-01-14 11:01:32,588 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>> replication queues
>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>> Leaving safe mode after 2 secs
>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>> Network topology has 0 racks and 0 datanodes
>>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>> UnderReplicatedBlocks has 0 blocks
>>> 2015-01-14 11:01:32,645 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>> blocks            = 0
>>> 2015-01-14 11:01:32,645 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>> invalid blocks          = 0
>>> 2015-01-14 11:01:32,645 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>> under-replicated blocks = 0
>>> 2015-01-14 11:01:32,645 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>  over-replicated blocks = 0
>>> 2015-01-14 11:01:32,645 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>> blocks being written    = 0
>>> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>> Replication Queue initialization scan for invalid, over- and
>>> under-replicated blocks completed in 52 msec
>>> 2015-01-14 11:01:32,676 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>> bigdata/10.10.10.63:9000
>>> 2015-01-14 11:01:32,676 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>> required for active state
>>> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server
>>> Responder: starting
>>> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server
>>> listener on 9000: starting
>>> 2015-01-14 11:01:32,697 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>> 2015-01-14 11:01:32,697 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Rescanning after 4192060 milliseconds
>>> 2015-01-14 11:01:32,704 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
>>> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
>>> registerDatanode: from DatanodeRegistration(10.10.10.63,
>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>> 2015-01-14 11:01:38,039 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>> failed storage changes from 0 to 0
>>> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology:
>>> Adding a new node: /default-rack/10.10.10.63:50010
>>> 2015-01-14 11:01:38,557 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>> failed storage changes from 0 to 0
>>> 2015-01-14 11:01:38,562 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>> 10.10.10.63:50010
>>> 2015-01-14 11:01:38,692 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>> processReport: Received first block report from
>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>> starting up or becoming active. Its block contents are no longer considered
>>> stale
>>> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport:
>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>> DatanodeRegistration(10.10.10.63,
>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>> blocks: 0, hasStaleStorages: false, processing time: 9 msecs
>>> 2015-01-14 11:02:02,697 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Rescanning after 30000 milliseconds
>>> 2015-01-14 11:02:02,698 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>> 2015-01-14 11:02:21,288 ERROR
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
>>> 2015-01-14 11:02:21,291 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>>> ************************************************************/
>>> 2015-01-14 11:03:02,845 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting NameNode
>>> STARTUP_MSG:   host = bigdata/10.10.10.63
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 2.6.0
>>> STARTUP_MSG:   build =
>>> https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>>> 2014-11-13T21:10Z
>>> STARTUP_MSG:   java = 1.7.0_45
>>> ************************************************************/
>>> 2015-01-14 11:03:02,861 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>>> handlers for [TERM, HUP, INT]
>>> 2015-01-14 11:03:02,866 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>>> 2015-01-14 11:03:03,521 INFO
>>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>>> hadoop-metrics2.properties
>>> 2015-01-14 11:03:03,697 INFO
>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>>> period at 10 second(s).
>>> 2015-01-14 11:03:03,697 INFO
>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>>> started
>>> 2015-01-14 11:03:03,700 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>>> hdfs://bigdata:9000
>>> 2015-01-14 11:03:03,701 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>>> bigdata:9000 to access this namenode/service.
>>> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader:
>>> Unable to load native-hadoop library for your platform... using
>>> builtin-java classes where applicable
>>> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>>> Web-server for hdfs at: http://0.0.0.0:50070
>>> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to
>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>> org.mortbay.log.Slf4jLog
>>> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog: Http
>>> request log for http.requests.namenode is not defined
>>> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added
>>> global filter 'safety'
>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>> filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context hdfs
>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>> filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context logs
>>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>>> filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context static
>>> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added
>>> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>>> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2:
>>> addJerseyResourcePackage:
>>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>>> pathSpec=/webhdfs/v1/*
>>> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty
>>> bound to port 50070
>>> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
>>> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>>> 2015-01-14 11:03:05,645 WARN
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>>> lack of redundant storage directories!
>>> 2015-01-14 11:03:05,645 WARN
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>>> loss due to lack of redundant storage directories!
>>> 2015-01-14 11:03:05,746 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>>> 2015-01-14 11:03:05,761 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>>> 2015-01-14 11:03:05,837 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>> dfs.block.invalidate.limit=1000
>>> 2015-01-14 11:03:05,837 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>>> dfs.namenode.datanode.registration.ip-hostname-check=true
>>> 2015-01-14 11:03:05,841 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>>> 2015-01-14 11:03:05,843 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>>> deletion will start around 2015 Jan 14 11:03:05
>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing
>>> capacity for map BlocksMap
>>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type
>>> = 64-bit
>>> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max
>>> memory 889 MB = 17.8 MB
>>> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity
>>>  = 2^21 = 2097152 entries
>>> 2015-01-14 11:03:05,864 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> dfs.block.access.token.enable=false
>>> 2015-01-14 11:03:05,865 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> defaultReplication         = 1
>>> 2015-01-14 11:03:05,865 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>>             = 512
>>> 2015-01-14 11:03:05,865 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>>             = 1
>>> 2015-01-14 11:03:05,865 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> maxReplicationStreams      = 2
>>> 2015-01-14 11:03:05,865 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> shouldCheckForEnoughRacks  = false
>>> 2015-01-14 11:03:05,865 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> replicationRecheckInterval = 3000
>>> 2015-01-14 11:03:05,865 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> encryptDataTransfer        = false
>>> 2015-01-14 11:03:05,865 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>>> maxNumBlocksToLog          = 1000
>>> 2015-01-14 11:03:05,874 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>>> hadoop2 (auth:SIMPLE)
>>> 2015-01-14 11:03:05,874 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>>> supergroup
>>> 2015-01-14 11:03:05,874 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>>> true
>>> 2015-01-14 11:03:05,875 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>>> 2015-01-14 11:03:05,878 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing
>>> capacity for map INodeMap
>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type
>>> = 64-bit
>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max
>>> memory 889 MB = 8.9 MB
>>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity
>>>  = 2^20 = 1048576 entries
>>> 2015-01-14 11:03:06,284 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>>> occuring more than 10 times
>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing
>>> capacity for map cachedBlocks
>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type
>>> = 64-bit
>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max
>>> memory 889 MB = 2.2 MB
>>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity
>>>  = 2^18 = 262144 entries
>>> 2015-01-14 11:03:06,301 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>>> 2015-01-14 11:03:06,301 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>> dfs.namenode.safemode.min.datanodes = 0
>>> 2015-01-14 11:03:06,301 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>>> dfs.namenode.safemode.extension     = 30000
>>> 2015-01-14 11:03:06,304 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>>> namenode is enabled
>>> 2015-01-14 11:03:06,304 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing
>>> capacity for map NameNodeRetryCache
>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type
>>> = 64-bit
>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet:
>>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity
>>>  = 2^15 = 32768 entries
>>> 2015-01-14 11:03:06,317 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>>> 2015-01-14 11:03:06,318 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>>> 2015-01-14 11:03:06,318 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>>> 16384
>>> 2015-01-14 11:03:06,368 INFO
>>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>>> 13312@bigdata
>>> 2015-01-14 11:03:06,532 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>>> 2015-01-14 11:03:06,622 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>>> file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095
>>> ->
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>> 2015-01-14 11:03:06,807 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>>> INodes.
>>> 2015-01-14 11:03:06,888 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>>> FSImage in 0 seconds.
>>> 2015-01-14 11:03:06,888 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>>> 2015-01-14 11:03:06,889 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>>> expecting start txid #84
>>> 2015-01-14 11:03:06,889 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>> 2015-01-14 11:03:06,893 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>>> to transaction ID 84
>>> 2015-01-14 11:03:06,897 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,897 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>>> expecting start txid #86
>>> 2015-01-14 11:03:06,898 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>> 2015-01-14 11:03:06,898 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>>> to transaction ID 84
>>> 2015-01-14 11:03:06,898 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,899 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>>> expecting start txid #88
>>> 2015-01-14 11:03:06,899 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>> 2015-01-14 11:03:06,899 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>>> to transaction ID 84
>>> 2015-01-14 11:03:06,899 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,900 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>>> expecting start txid #90
>>> 2015-01-14 11:03:06,900 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>> 2015-01-14 11:03:06,900 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>>> to transaction ID 84
>>> 2015-01-14 11:03:06,901 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,901 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>>> expecting start txid #92
>>> 2015-01-14 11:03:06,901 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>> 2015-01-14 11:03:06,901 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>>> to transaction ID 84
>>> 2015-01-14 11:03:06,902 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>>> of size 42 edits # 2 loaded in 0 seconds
>>> 2015-01-14 11:03:06,902 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b
>>> expecting start txid #94
>>> 2015-01-14 11:03:06,902 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>> 2015-01-14 11:03:06,902 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>>> to transaction ID 84
>>> 2015-01-14 11:03:06,907 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>>> of size 1048576 edits # 1 loaded in 0 seconds
>>> 2015-01-14 11:03:06,908 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2
>>> expecting start txid #95
>>> 2015-01-14 11:03:06,908 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>> 2015-01-14 11:03:06,908 INFO
>>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>>> stream
>>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095'
>>> to transaction ID 84
>>> 2015-01-14 11:03:07,266 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>>> of size 1048576 edits # 1 loaded in 0 seconds
>>> 2015-01-14 11:03:07,274 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>>> 2015-01-14 11:03:07,313 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
>>> 2015-01-14 11:03:07,558 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>>> entries 0 lookups
>>> 2015-01-14 11:03:07,559 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>>> FSImage in 1240 msecs
>>> 2015-01-14 11:03:08,011 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>>> bigdata:9000
>>> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager:
>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting
>>> Socket Reader #1 for port 9000
>>> 2015-01-14 11:03:08,151 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>>> FSNamesystemState MBean
>>> 2015-01-14 11:03:08,173 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>> construction: 0
>>> 2015-01-14 11:03:08,173 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>>> construction: 0
>>> 2015-01-14 11:03:08,173 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>>> replication queues
>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>> Leaving safe mode after 2 secs
>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>> Network topology has 0 racks and 0 datanodes
>>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>> UnderReplicatedBlocks has 0 blocks
>>> 2015-01-14 11:03:08,194 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>>> blocks            = 0
>>> 2015-01-14 11:03:08,194 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>> invalid blocks          = 0
>>> 2015-01-14 11:03:08,194 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>> under-replicated blocks = 0
>>> 2015-01-14 11:03:08,194 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>>  over-replicated blocks = 0
>>> 2015-01-14 11:03:08,194 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>> blocks being written    = 0
>>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>>> Replication Queue initialization scan for invalid, over- and
>>> under-replicated blocks completed in 18 msec
>>> 2015-01-14 11:03:08,322 INFO
>>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>>> bigdata/10.10.10.63:9000
>>> 2015-01-14 11:03:08,322 INFO
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>>> required for active state
>>> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server
>>> Responder: starting
>>> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server
>>> listener on 9000: starting
>>> 2015-01-14 11:03:08,349 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>>> 2015-01-14 11:03:08,349 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Rescanning after 4287712 milliseconds
>>> 2015-01-14 11:03:08,350 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
>>> registerDatanode: from DatanodeRegistration(10.10.10.63,
>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>>> 2015-01-14 11:03:13,244 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>> failed storage changes from 0 to 0
>>> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology:
>>> Adding a new node: /default-rack/10.10.10.63:50010
>>> 2015-01-14 11:03:13,743 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>>> failed storage changes from 0 to 0
>>> 2015-01-14 11:03:13,750 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>>> 10.10.10.63:50010
>>> 2015-01-14 11:03:13,959 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>>> processReport: Received first block report from
>>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>>> starting up or becoming active. Its block contents are no longer considered
>>> stale
>>> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport:
>>> from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>>> DatanodeRegistration(10.10.10.63,
>>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>>> blocks: 0, hasStaleStorages: false, processing time: 11 msecs
>>> 2015-01-14 11:03:38,349 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Rescanning after 30000 milliseconds
>>> 2015-01-14 11:03:38,350 INFO
>>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>>> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
>>>
>>>
>>>  Thanks
>>> Mahesh.S
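>>>
>>> (Nothing in this namenode log mentions Ranger or XASecure, which suggests
>>> the plugin never initialized inside the namenode JVM. A quick check, as a
>>> sketch only -- the log path and jar directory below are assumptions based
>>> on the tarball layout used earlier in this thread:
>>>
>>>   # any plugin activity in the namenode log? (path is illustrative)
>>>   grep -i -e ranger -e xasecure \
>>>     /usr/local/hadoop/logs/hadoop-*-namenode-*.log
>>>
>>>   # are the plugin jars visible to the namenode at all?
>>>   ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -i -e ranger -e xasecure
>>>
>>> If neither shows anything, the plugin never loaded, which would explain
>>> the empty Audit -> Agents page.)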
>>>
>>>
>>> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com> wrote:
>>>
>>>>  Hi Mahesh,
>>>>      We will need the namenode logs to debug this further. Can you
>>>> restart the namenode and paste its logs somewhere for us to analyze?
>>>> Thanks.
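>>>>
>>>> (In a stock Apache Hadoop tarball install the namenode log usually lives
>>>> under $HADOOP_HOME/logs; the exact file name below is an assumption --
>>>> adjust the user and host parts to your machine:
>>>>
>>>>   tail -n 500 $HADOOP_HOME/logs/hadoop-$USER-namenode-$(hostname).log
>>>> )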
>>>>
>>>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <
>>>> sankarmahesh37@gmail.com> wrote:
>>>>
>>>>> Hi Ramesh,
>>>>>
>>>>>                    I didn't see any exception in the HDFS logs. My
>>>>> problem is that the agent for HDFS is not created.
>>>>>
>>>>>  Regards,
>>>>> Mahesh.S
>>>>>
>>>>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com>
>>>>> wrote:
>>>>>
>>>>>> Hi Mahesh,
>>>>>>
>>>>>>  The error you are seeing is just a notice that the parent folder of
>>>>>> the resource you are creating doesn't have read permission for the user
>>>>>> for whom you are creating the policy.
>>>>>>
>>>>>>  When you start the HDFS namenode and secondary namenode, do you see any
>>>>>> exceptions in the HDFS logs?
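>>>>>>
>>>>>>  (If that notice is accurate, one minimal way to check and fix it from
>>>>>> the shell -- /data below is only a placeholder for the parent folder of
>>>>>> your policy resource -- might be:
>>>>>>
>>>>>>    # list / to see the permission bits on /data
>>>>>>    hdfs dfs -ls /
>>>>>>
>>>>>>    # grant read+execute so the policy user can list and traverse it
>>>>>>    hdfs dfs -chmod o+rx /data
>>>>>>
>>>>>> after which the policy should save without that warning.)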
>>>>>>
>>>>>>  Regards,
>>>>>> Ramesh
>>>>>>
>>>>>>   On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <
>>>>>> sankarmahesh37@gmail.com> wrote:
>>>>>>
>>>>>>   Hi all,
>>>>>>
>>>>>>  I successfully configured Ranger admin and user sync. Now I am trying
>>>>>> to configure the HDFS plugin. My steps are as follows:
>>>>>>
>>>>>>  1.Created repository testhdfs.
>>>>>> 2.cd /usr/local
>>>>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>>>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>>>>> 5.cd ranger-hdfs-plugin
>>>>>> 6.vi install.properties
>>>>>>  POLICY_MGR_URL=http://IP:6080
>>>>>>           REPOSITORY_NAME=testhdfs
>>>>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>>>>> 7.cd /usr/local/hadoop
>>>>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>>>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>>>>> 10.cd /usr/local/ranger-hdfs-plugin
>>>>>> 11. ./enable-hdfs-plugin.sh
>>>>>> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>>>> 13.vi xasecure-audit.xml
>>>>>>  <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>>>>                    <value>jdbc:mysql://localhost/ranger</value>
>>>>>>                    </property>
>>>>>>                    <property>
>>>>>>
>>>>>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>>>>                    <value>rangerlogger</value>
>>>>>>                    </property>
>>>>>>                    <property>
>>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>>>>                    <value>rangerlogger</value>
>>>>>>                    </property>
>>>>>> 14.Restarted hadoop
>>>>>> When I look at the Ranger Admin web interface -> Audit -> Agents,
>>>>>> the agent is not created. Have I missed any steps?
>>>>>>
>>>>>>  *NOTE: I am not using HDP.*
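>>>>>>
>>>>>>  (Before the restart in step 14, a sanity-check sketch -- the file names
>>>>>> below are assumptions based on the 0.4.0 plugin layout used above:
>>>>>>
>>>>>>    # the enable script should have written the xasecure-*.xml files next
>>>>>>    # to the Hadoop config (the conf symlink created in step 8)
>>>>>>    ls /usr/local/hadoop/conf/ | grep -i xasecure
>>>>>>
>>>>>>    # and the conf symlink itself should resolve
>>>>>>    readlink -f /usr/local/hadoop/conf
>>>>>> )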
>>>>>>
>>>>>>  *here is my xa_portal.log*
>>>>>>
>>>>>>  2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_default.properties]
>>>>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_system.properties]
>>>>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_custom.properties]
>>>>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_ldap.properties]
>>>>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>> builtin-java classes where applicable
>>>>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [db_message_bundle.properties]
>>>>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>>>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>> user
>>>>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>>>>>> requestId=10.10.10.53
>>>>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>>>>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>>>>>> create:nfsnobody@bigdata
>>>>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_default.properties]
>>>>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_system.properties]
>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_custom.properties]
>>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_ldap.properties]
>>>>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>> builtin-java classes where applicable
>>>>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [db_message_bundle.properties]
>>>>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>>>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>> user
>>>>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>>>>>> requestId=10.10.10.53
>>>>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>>>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>> user
>>>>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>>>>>> requestId=10.10.10.53
>>>>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>> Login: security not enabled, using username
>>>>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>>> Login: security not enabled, using username
>>>>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>> javax.ws.rs.WebApplicationException
>>>>>> at
>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>> at
>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>> at
>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>> at
>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>> at
>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>> at
>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>> at
>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>> at
>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> at
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>> at
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>>>>> at
>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>> at
>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>> at
>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>> at
>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>> at
>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>> at
>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>> at
>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>> at
>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>> at
>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>> at
>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>> at
>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>> at
>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at
>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>> Validation error:logMessage=null,
>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>> to save this policy?}
>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>> }
>>>>>> javax.ws.rs.WebApplicationException
>>>>>> at
>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>> at
>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>> at
>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>> at
>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>> at
>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>> at
>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>> at
>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>> at
>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> at
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>> at
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at
>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>> at
>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>> at
>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>> at
>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>> at
>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>> at
>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>> at
>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>> at
>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>> at
>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>> at
>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>> at
>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>> at
>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at
>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>>>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>> user
>>>>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>>>>>> requestId=10.10.10.53
>>>>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_default.properties]
>>>>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_system.properties]
>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_custom.properties]
>>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [xa_ldap.properties]
>>>>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>> builtin-java classes where applicable
>>>>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>>> path resource [db_message_bundle.properties]
>>>>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>>>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>>> user
>>>>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>>>>>> requestId=10.10.10.53
>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>>>>>> permission on parent folder. Do you want to save this policy?
>>>>>> javax.ws.rs.WebApplicationException
>>>>>> at
>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>> at
>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>> at
>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>> at
>>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>> at
>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>> at
>>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>> at
>>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>> at
>>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>> at
>>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> at
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>> at
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>> at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>> at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at
>>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>> at
>>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>> at
>>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>> at
>>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>> at
>>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>> at
>>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>> at
>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>> at
>>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>> at
>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>> at
>>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>> at
>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>> at
>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>> at
>>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>> at
>>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>> at
>>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at
>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at
>>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>>> Validation error:logMessage=null,
>>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
>>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>>> to save this policy?}
>>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
>>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>>> }
>>>>>> javax.ws.rs.WebApplicationException
>>>>>> at
>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>>> at
>>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>>> at
>>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>>> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>>> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>>> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>>> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>>> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>>> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>>> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>>> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>>> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>>> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>>> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>>> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>>> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>>> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>>> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>>> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>>> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>>> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>>> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>>> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>>> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>>> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>>> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>>> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>>> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>>>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>>>>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264, requestId=10.10.10.53
>>>>>>
>>>>>>  Thanks
>>>>>> Mahesh.S
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>  --
>>>> Regards,
>>>> Gautam.
>>>>
>>>
>>>
>>
>>



-- 
Regards,
Gautam.

RE: Hdfs agent not created

Posted by Hanish Bansal <ha...@impetus.co.in>.
Hi Mahesh,


Could you try one thing: copy all the jar files from ${hadoop_home}/lib to the hadoop share directory.


$ cp <hadoop-home>/lib/* <hadoop-home>/share/hadoop/hdfs/lib/


The issue may be that Hadoop is not able to pick up the Ranger jars from the lib directory.


After copying the jars, restart Hadoop and check whether the agent is started.
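
For example (a minimal sketch, assuming the /usr/local/hadoop layout used earlier in this thread; adjust the paths to your install):

$ cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
$ /usr/local/hadoop/sbin/stop-dfs.sh
$ /usr/local/hadoop/sbin/start-dfs.sh
$ grep ranger-hdfs-plugin /usr/local/hadoop/logs/hadoop-*-namenode-*.log

The grep is just a sanity check: if the jars were picked up, ranger-hdfs-plugin should appear in the classpath printed in the namenode's STARTUP_MSG.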



-------
Thanks & Regards,
Hanish Bansal
Software Engineer, iLabs
Impetus Infotech Pvt. Ltd.

________________________________
From: Mahesh Sankaran <sa...@gmail.com>
Sent: Wednesday, January 14, 2015 3:33 PM
To: user@ranger.incubator.apache.org
Subject: Re: Hdfs agent not created

Hi Ramesh,
               ranger*.jar is added to the classpath; I can see it in the hadoop/lib directory. Can you tell me the meaning of the following error?

2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)

thanks

Mahesh.S


On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com> wrote:
Hi Mahesh,

This exception is related to the datanode not coming up for some reason, but the Ranger plugin runs in the namenode.

After installing Ranger and restarting the namenode and secondarynamenode, do you see both of them running?

In the classpath of the namenode I don't see any ranger*.jar. Do you have it in the hadoop/lib directory?

Also, can I get the details of xasecure-hdfs-security.xml from the conf directory? A couple of quick checks are sketched below.
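
For example (a minimal sketch, assuming the /usr/local/hadoop layout used earlier in this thread):

$ jps                                                     # NameNode and SecondaryNameNode should both be listed
$ ls /usr/local/hadoop/lib/ranger-*.jar                   # the plugin jars placed under hadoop/lib by the installer
$ cat /usr/local/hadoop/conf/xasecure-hdfs-security.xml   # paste this output into your reply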

Regards,
Ramesh

On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <sa...@gmail.com> wrote:

Hi Gautam,

                Now I am seeing the following exception. Is this causing the problem?

2015-01-14 11:41:23,102 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.io.EOFException: End of File Exception between local host is: "bigdata/10.10.10.63"; destination host is: "bigdata":9000; : java.io.EOFException; For more details see:  http://wiki.apache.org/hadoop/EOFException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2015-01-14 11:41:25,981 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
2015-01-14 11:41:25,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
************************************************************/
2015-01-14 11:42:03,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************

Thanks
Mahesh.S

On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
Hi Gautam,

              Here is my namenode log. Kindly see it.

/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
************************************************************/
2015-01-14 11:01:27,345 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = bigdata/10.10.10.63
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.6.0
STARTUP_MSG:   classpath = /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
STARTUP_MSG:   java = 1.7.0_45
************************************************************/
2015-01-14 11:01:27,363 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2015-01-14 11:01:27,368 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
2015-01-14 11:01:28,029 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2015-01-14 11:01:28,205 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2015-01-14 11:01:28,205 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
2015-01-14 11:01:28,209 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://bigdata:9000
2015-01-14 11:01:28,209 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use bigdata:9000 to access this namenode/service.
2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
2015-01-14 11:01:30,093 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
2015-01-14 11:01:30,093 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
2015-01-14 11:01:30,184 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
2015-01-14 11:01:30,196 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
2015-01-14 11:01:30,262 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
2015-01-14 11:01:30,262 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
2015-01-14 11:01:30,266 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
2015-01-14 11:01:30,268 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2015 Jan 14 11:01:30
2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity      = 2^21 = 2097152 entries
2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
2015-01-14 11:01:30,298 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
2015-01-14 11:01:30,302 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity      = 2^20 = 1048576 entries
2015-01-14 11:01:30,648 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity      = 2^18 = 262144 entries
2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
2015-01-14 11:01:30,729 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename 11417@bigdata
2015-01-14 11:01:30,963 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
2015-01-14 11:01:31,065 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094 -> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
2015-01-14 11:01:31,210 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2 INodes.
2015-01-14 11:01:31,293 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
2015-01-14 11:01:31,293 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83 from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
2015-01-14 11:01:31,294 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5 expecting start txid #84
2015-01-14 11:01:31,294 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
2015-01-14 11:01:31,299 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085' to transaction ID 84
2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972 expecting start txid #86
2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087' to transaction ID 84
2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b expecting start txid #88
2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089' to transaction ID 84
2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe expecting start txid #90
2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091' to transaction ID 84
2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09 expecting start txid #92
2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093' to transaction ID 84
2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b expecting start txid #94
2015-01-14 11:01:31,308 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
2015-01-14 11:01:31,308 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094' to transaction ID 84
2015-01-14 11:01:31,313 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094 of size 1048576 edits # 1 loaded in 0 seconds
2015-01-14 11:01:31,317 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
2015-01-14 11:01:31,346 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
2015-01-14 11:01:31,904 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
2015-01-14 11:01:31,904 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 1216 msecs
2015-01-14 11:01:32,427 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to bigdata:9000
2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
2015-01-14 11:01:32,568 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState MBean
2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing replication queues
2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 2 secs
2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 52 msec
2015-01-14 11:01:32,676 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: bigdata/10.10.10.63:9000
2015-01-14 11:01:32,676 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
2015-01-14 11:01:32,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
2015-01-14 11:01:32,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 4192060 milliseconds
2015-01-14 11:01:32,704 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0) storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
2015-01-14 11:01:38,039 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/10.10.10.63:50010
2015-01-14 11:01:38,557 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
2015-01-14 11:01:38,562 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN 10.10.10.63:50010
2015-01-14 11:01:38,692 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 9 msecs
2015-01-14 11:02:02,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
2015-01-14 11:02:02,698 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
2015-01-14 11:02:21,288 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
2015-01-14 11:02:21,291 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
************************************************************/
2015-01-14 11:03:02,845 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = bigdata/10.10.10.63
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.6.0
STARTUP_MSG:   classpath = /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
STARTUP_MSG:   java = 1.7.0_45
************************************************************/
2015-01-14 11:03:02,861 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2015-01-14 11:03:02,866 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
2015-01-14 11:03:03,521 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2015-01-14 11:03:03,697 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2015-01-14 11:03:03,697 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
2015-01-14 11:03:03,700 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://bigdata:9000
2015-01-14 11:03:03,701 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use bigdata:9000 to access this namenode/service.
2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
2015-01-14 11:03:05,645 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
2015-01-14 11:03:05,645 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
2015-01-14 11:03:05,746 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
2015-01-14 11:03:05,761 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
2015-01-14 11:03:05,837 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
2015-01-14 11:03:05,837 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
2015-01-14 11:03:05,841 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
2015-01-14 11:03:05,843 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2015 Jan 14 11:03:05
2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity      = 2^21 = 2097152 entries
2015-01-14 11:03:05,864 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
2015-01-14 11:03:05,875 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
2015-01-14 11:03:05,878 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity      = 2^20 = 1048576 entries
2015-01-14 11:03:06,284 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity      = 2^18 = 262144 entries
2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
2015-01-14 11:03:06,304 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
2015-01-14 11:03:06,304 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
2015-01-14 11:03:06,317 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
2015-01-14 11:03:06,368 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename 13312@bigdata
2015-01-14 11:03:06,532 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
2015-01-14 11:03:06,622 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095 -> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
2015-01-14 11:03:06,807 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2 INodes.
2015-01-14 11:03:06,888 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
2015-01-14 11:03:06,888 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83 from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
2015-01-14 11:03:06,889 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972 expecting start txid #84
2015-01-14 11:03:06,889 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
2015-01-14 11:03:06,893 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085' to transaction ID 84
2015-01-14 11:03:06,897 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,897 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b expecting start txid #86
2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087' to transaction ID 84
2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe expecting start txid #88
2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089' to transaction ID 84
2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09 expecting start txid #90
2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091' to transaction ID 84
2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b expecting start txid #92
2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093' to transaction ID 84
2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093 of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b expecting start txid #94
2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094' to transaction ID 84
2015-01-14 11:03:06,907 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094 of size 1048576 edits # 1 loaded in 0 seconds
2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2 expecting start txid #95
2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095' to transaction ID 84
2015-01-14 11:03:07,266 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095 of size 1048576 edits # 1 loaded in 0 seconds
2015-01-14 11:03:07,274 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
2015-01-14 11:03:07,313 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
2015-01-14 11:03:07,558 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
2015-01-14 11:03:07,559 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 1240 msecs
2015-01-14 11:03:08,011 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to bigdata:9000
2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
2015-01-14 11:03:08,151 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState MBean
2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing replication queues
2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 2 secs
2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 18 msec
2015-01-14 11:03:08,322 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: bigdata/10.10.10.63:9000
2015-01-14 11:03:08,322 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
2015-01-14 11:03:08,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
2015-01-14 11:03:08,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 4287712 milliseconds
2015-01-14 11:03:08,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0) storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
2015-01-14 11:03:13,244 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/10.10.10.63:50010
2015-01-14 11:03:13,743 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
2015-01-14 11:03:13,750 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN 10.10.10.63:50010
2015-01-14 11:03:13,959 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 11 msecs
2015-01-14 11:03:38,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
2015-01-14 11:03:38,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
2015-01-14 11:03:57,100 INFO logs: Aliases are enabled


Thanks
Mahesh.S


On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com> wrote:
Hi Mahesh,
    We will need the namenode logs to debug this further. Can you restart the namenode and paste its logs somewhere for us to analyze? Thanks.
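
One way to capture that, assuming the tarball layout /usr/local/hadoop used in the steps below and Hadoop's default log directory under $HADOOP_HOME/logs (both are assumptions, not confirmed in this thread):

  /usr/local/hadoop/sbin/hadoop-daemon.sh stop namenode
  /usr/local/hadoop/sbin/hadoop-daemon.sh start namenode
  # log file names follow the pattern hadoop-<user>-namenode-<host>.log
  tail -n 500 /usr/local/hadoop/logs/hadoop-*-namenode-*.log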

On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
Hi Ramesh,

                  I didn't see any exception in the HDFS logs. My problem is that the agent for HDFS is not created.

Regards,
Mahesh.S

On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com> wrote:
Hi Mahesh,

The error you are seeing is just a notice that the parent folder of the resource you are creating doesn't have read permission for the user for whom you are creating the policy.
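
One quick way to check this from the command line, with an illustrative path standing in for the resource's parent folder:

  hdfs dfs -ls -d /some/parent   # -d lists the directory itself; confirm the policy's user has read (r) in the mode bits

If the user named in the policy lacks read there, this notice is expected; per the dialog text it only asks whether you still want to save the policy.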

When you start the HDFS namenode and secondary namenode, do you see any exceptions in the HDFS logs?

Regards,
Ramesh

On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com> wrote:

Hi all,

I successfully configured Ranger admin and usersync. Now I am trying to configure the HDFS plugin. My steps are the following:

1.Created repository testhdfs.
2.cd /usr/local
3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
5.cd ranger-hdfs-plugin
6.vi install.properties
 POLICY_MGR_URL=http://IP:6080
          REPOSITORY_NAME=testhdfs
          XAAUDIT.DB.HOSTNAME=localhost
          XAAUDIT.DB.DATABASE_NAME=ranger
          XAAUDIT.DB.USER_NAME=rangerlogger
          XAAUDIT.DB.PASSWORD=rangerlogger
7.cd /usr/local/hadoop
8.ln -s /usr/local/hadoop/etc/hadoop conf
9.export HADOOP_HOME=/usr/local/hadoop
10.cd /usr/local/ranger-hdfs-plugin
11. ./enable-hdfs-plugin.sh
12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
13.vi xasecure-audit.xml
 <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
                   <value>jdbc:mysql://localhost/ranger</value>
                   </property>
                   <property>
                   <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
                   <value>rangerlogger</value>
                   </property>
                   <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
                   <value>rangerlogger</value>
                   </property>
14.Restarted hadoop
when I look at the Ranger Admin web interface -> Audit -> Agents,
the agent is not created. Have I missed any steps?
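
Two quick sanity checks for the steps above (paths and names are taken from the steps themselves, and the jar name pattern and presence of a mysql client on this host are assumptions):

  ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -iE 'ranger|xasecure'   # jars copied in step 12
  mysql -h localhost -u rangerlogger -prangerlogger ranger -e 'SELECT 1;'    # credentials from step 13

If the mysql check fails, the plugin cannot write audit records with the configured settings.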

NOTE: I am not using HDP.

Here is my xa_portal.log:

2015-01-13 15:16:45,901 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
2015-01-13 15:16:45,932 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
2015-01-13 15:16:45,965 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
2015-01-13 15:16:45,978 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
2015-01-13 15:16:46,490 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-13 15:16:47,417 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12, requestId=10.10.10.53
2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO  org.apache.ranger.rest.UserREST (UserREST.java:186) - create:nfsnobody@bigdata
2015-01-13 15:30:32,173 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
2015-01-13 15:30:32,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
2015-01-13 15:30:33,049 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-13 15:30:34,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0, requestId=10.10.10.53
2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD, requestId=10.10.10.53
2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
javax.ws.rs.WebApplicationException
at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:744)
2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
javax.ws.rs.WebApplicationException
at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:744)
2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655, requestId=10.10.10.53
2015-01-13 16:14:23,673 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
2015-01-13 16:14:23,678 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
2015-01-13 16:14:24,064 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-13 16:14:24,666 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A, requestId=10.10.10.53
2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
javax.ws.rs.WebApplicationException
at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:744)
2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
javax.ws.rs.WebApplicationException
at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:744)
2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264, requestId=10.10.10.53

Thanks
Mahesh.S


--
Regards,
Gautam.





Re: Hdfs agent not created

Posted by Mahesh Sankaran <sa...@gmail.com>.
Hi Ramesh,
               ranger*.jar is added to the classpath; I can see it in the hadoop/lib
directory. Can you tell me the meaning of the following error?

2015-01-14 15:27:47,180 [http-bio-6080-exec-9] ERROR
org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
RangerDaoManager.getEntityManager(loggingPU)

Thanks

Mahesh.S
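
A minimal sketch for ruling out the most common cause of that error, assuming
getEntityManager(loggingPU) refers to the JPA persistence unit Ranger uses for
audit logging: the standalone check below simply tries to open a JDBC connection
with the audit-database settings. The URL, user, and password here are
placeholders; substitute the values from your own xasecure-audit.xml, and run
with the MySQL driver (mysql-connector-java.jar) on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;

public class AuditDbCheck {
    public static void main(String[] args) throws Exception {
        String url  = "jdbc:mysql://localhost/ranger"; // placeholder JDBC URL
        String user = "rangerlogger";                  // placeholder DB user
        String pass = "rangerlogger";                  // placeholder password
        // If this throws, the loggingPU error is almost certainly a
        // connectivity/credentials problem rather than a plugin problem.
        try (Connection conn = DriverManager.getConnection(url, user, pass)) {
            System.out.println("Audit DB reachable: " + conn.getMetaData().getURL());
        }
    }
}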


On Wed, Jan 14, 2015 at 1:22 PM, Ramesh Mani <rm...@hortonworks.com> wrote:

> Hi Mahesh,
>
> This exception is related to the datanode not coming up for some reason,
> but the Ranger plugin runs in the namenode.
>
> Do you see the namenode and secondary namenode running after the Ranger
> installation and after restarting them both?
>
> In the classpath of the namenode I don't see any ranger*.jar; do you have
> it in the hadoop/lib directory?
>
> Also, can I get the details of xasecure-hdfs-security.xml from the conf
> directory?
>
> Regards,
> Ramesh
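
A quick way to answer Ramesh's classpath question, as a sketch (the class name
below is hypothetical): launched with the same classpath as the namenode, it
prints every classpath entry containing "ranger", so empty output means the
plugin jars never reached the JVM.

import java.io.File;

public class RangerClasspathCheck {
    public static void main(String[] args) {
        // Print each classpath entry that contains "ranger"; no output
        // means the Ranger plugin jars are not visible to this JVM.
        String classpath = System.getProperty("java.class.path");
        for (String entry : classpath.split(File.pathSeparator)) {
            if (entry.contains("ranger")) {
                System.out.println(entry);
            }
        }
    }
}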
>
> On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <sa...@gmail.com>
> wrote:
>
> Hi Gautam,
>
>                 Now I am seeing the following exception. Is this what causes
> the problem?
>
> 2015-01-14 11:41:23,102 WARN
> org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
> java.io.EOFException: End of File Exception between local host is:
> "bigdata/10.10.10.63"; destination host is: "bigdata":9000; :
> java.io.EOFException; For more details see:
> http://wiki.apache.org/hadoop/EOFException
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
> at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
> at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
> at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
> at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
> at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
> at java.lang.Thread.run(Thread.java:744)
> Caused by: java.io.EOFException
> at java.io.DataInputStream.readInt(DataInputStream.java:392)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2015-01-14 11:41:25,981 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
> 2015-01-14 11:41:25,984 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
> ************************************************************/
> 2015-01-14 11:42:03,054 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
>
> Thanks
> Mahesh.S
>
> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <
> sankarmahesh37@gmail.com> wrote:
>
>> Hi Gautam,
>>
>>               Here is my namenode log. Kindly see it.
>>
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>> ************************************************************/
>> 2015-01-14 11:01:27,345 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting NameNode
>> STARTUP_MSG:   host = bigdata/10.10.10.63
>> STARTUP_MSG:   args = []
>> STARTUP_MSG:   version = 2.6.0
>> STARTUP_MSG:   classpath =
>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.j
ar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local
/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/loc
al/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git
>> -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>> 2014-11-13T21:10Z
>> STARTUP_MSG:   java = 1.7.0_45
>> ************************************************************/
>> 2015-01-14 11:01:27,363 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>> handlers for [TERM, HUP, INT]
>> 2015-01-14 11:01:27,368 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>> 2015-01-14 11:01:28,029 INFO
>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>> hadoop-metrics2.properties
>> 2015-01-14 11:01:28,205 INFO
>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>> period at 10 second(s).
>> 2015-01-14 11:01:28,205 INFO
>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>> started
>> 2015-01-14 11:01:28,209 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>> hdfs://bigdata:9000
>> 2015-01-14 11:01:28,209 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>> bigdata:9000 to access this namenode/service.
>> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader:
>> Unable to load native-hadoop library for your platform... using
>> builtin-java classes where applicable
>> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>> Web-server for hdfs at: http://0.0.0.0:50070
>> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to
>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>> org.mortbay.log.Slf4jLog
>> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog: Http
>> request log for http.requests.namenode is not defined
>> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added
>> global filter 'safety'
>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>> filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context hdfs
>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>> filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context static
>> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
>> filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context logs
>> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added
>> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2:
>> addJerseyResourcePackage:
>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>> pathSpec=/webhdfs/v1/*
>> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty
>> bound to port 50070
>> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
>> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$
>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>> 2015-01-14 11:01:30,093 WARN
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>> lack of redundant storage directories!
>> 2015-01-14 11:01:30,093 WARN
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>> loss due to lack of redundant storage directories!
>> 2015-01-14 11:01:30,184 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>> 2015-01-14 11:01:30,196 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>> 2015-01-14 11:01:30,262 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>> dfs.block.invalidate.limit=1000
>> 2015-01-14 11:01:30,262 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>> dfs.namenode.datanode.registration.ip-hostname-check=true
>> 2015-01-14 11:01:30,266 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>> 2015-01-14 11:01:30,268 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>> deletion will start around 2015 Jan 14 11:01:30
>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing
>> capacity for map BlocksMap
>> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type       =
>> 64-bit
>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max memory
>> 889 MB = 17.8 MB
>> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity      =
>> 2^21 = 2097152 entries
>> 2015-01-14 11:01:30,289 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> dfs.block.access.token.enable=false
>> 2015-01-14 11:01:30,289 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> defaultReplication         = 1
>> 2015-01-14 11:01:30,289 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>             = 512
>> 2015-01-14 11:01:30,289 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>             = 1
>> 2015-01-14 11:01:30,289 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> maxReplicationStreams      = 2
>> 2015-01-14 11:01:30,290 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> shouldCheckForEnoughRacks  = false
>> 2015-01-14 11:01:30,290 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> replicationRecheckInterval = 3000
>> 2015-01-14 11:01:30,290 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> encryptDataTransfer        = false
>> 2015-01-14 11:01:30,290 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> maxNumBlocksToLog          = 1000
>> 2015-01-14 11:01:30,298 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>> hadoop2 (auth:SIMPLE)
>> 2015-01-14 11:01:30,299 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>> supergroup
>> 2015-01-14 11:01:30,299 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>> true
>> 2015-01-14 11:01:30,299 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>> 2015-01-14 11:01:30,302 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing
>> capacity for map INodeMap
>> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type       =
>> 64-bit
>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max memory
>> 889 MB = 8.9 MB
>> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity      =
>> 2^20 = 1048576 entries
>> 2015-01-14 11:01:30,648 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>> occuring more than 10 times
>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing
>> capacity for map cachedBlocks
>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type       =
>> 64-bit
>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max
>> memory 889 MB = 2.2 MB
>> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity      =
>> 2^18 = 262144 entries
>> 2015-01-14 11:01:30,669 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>> 2015-01-14 11:01:30,669 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>> dfs.namenode.safemode.min.datanodes = 0
>> 2015-01-14 11:01:30,669 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>> dfs.namenode.safemode.extension     = 30000
>> 2015-01-14 11:01:30,674 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>> namenode is enabled
>> 2015-01-14 11:01:30,674 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing
>> capacity for map NameNodeRetryCache
>> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type       =
>> 64-bit
>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet:
>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity      =
>> 2^15 = 32768 entries
>> 2015-01-14 11:01:30,687 INFO
>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>> 2015-01-14 11:01:30,687 INFO
>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>> 2015-01-14 11:01:30,687 INFO
>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>> 16384
>> 2015-01-14 11:01:30,729 INFO
>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>> 11417@bigdata
>> 2015-01-14 11:01:30,963 INFO
>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>> 2015-01-14 11:01:31,065 INFO
>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>> file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094
>> ->
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>> 2015-01-14 11:01:31,210 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>> INodes.
>> 2015-01-14 11:01:31,293 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>> FSImage in 0 seconds.
>> 2015-01-14 11:01:31,293 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>> 2015-01-14 11:01:31,294 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5
>> expecting start txid #84
>> 2015-01-14 11:01:31,294 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>> 2015-01-14 11:01:31,299 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>> to transaction ID 84
>> 2015-01-14 11:01:31,303 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,303 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>> expecting start txid #86
>> 2015-01-14 11:01:31,303 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>> 2015-01-14 11:01:31,303 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>> to transaction ID 84
>> 2015-01-14 11:01:31,304 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,304 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>> expecting start txid #88
>> 2015-01-14 11:01:31,304 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>> 2015-01-14 11:01:31,304 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>> to transaction ID 84
>> 2015-01-14 11:01:31,305 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,305 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>> expecting start txid #90
>> 2015-01-14 11:01:31,305 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>> 2015-01-14 11:01:31,306 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>> to transaction ID 84
>> 2015-01-14 11:01:31,306 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,306 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>> expecting start txid #92
>> 2015-01-14 11:01:31,306 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>> 2015-01-14 11:01:31,307 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>> to transaction ID 84
>> 2015-01-14 11:01:31,307 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:01:31,307 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>> expecting start txid #94
>> 2015-01-14 11:01:31,308 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>> 2015-01-14 11:01:31,308 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>> to transaction ID 84
>> 2015-01-14 11:01:31,313 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>> of size 1048576 edits # 1 loaded in 0 seconds
>> 2015-01-14 11:01:31,317 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>> 2015-01-14 11:01:31,346 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
>> 2015-01-14 11:01:31,904 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>> entries 0 lookups
>> 2015-01-14 11:01:31,904 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>> FSImage in 1216 msecs
>> 2015-01-14 11:01:32,427 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>> bigdata:9000
>> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager:
>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting
>> Socket Reader #1 for port 9000
>> 2015-01-14 11:01:32,568 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>> FSNamesystemState MBean
>> 2015-01-14 11:01:32,588 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>> construction: 0
>> 2015-01-14 11:01:32,588 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>> construction: 0
>> 2015-01-14 11:01:32,588 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>> replication queues
>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>> Leaving safe mode after 2 secs
>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>> Network topology has 0 racks and 0 datanodes
>> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>> UnderReplicatedBlocks has 0 blocks
>> 2015-01-14 11:01:32,645 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>> blocks            = 0
>> 2015-01-14 11:01:32,645 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>> invalid blocks          = 0
>> 2015-01-14 11:01:32,645 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>> under-replicated blocks = 0
>> 2015-01-14 11:01:32,645 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>  over-replicated blocks = 0
>> 2015-01-14 11:01:32,645 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>> blocks being written    = 0
>> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>> Replication Queue initialization scan for invalid, over- and
>> under-replicated blocks completed in 52 msec
>> 2015-01-14 11:01:32,676 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>> bigdata/10.10.10.63:9000
>> 2015-01-14 11:01:32,676 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>> required for active state
>> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server
>> Responder: starting
>> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server
>> listener on 9000: starting
>> 2015-01-14 11:01:32,697 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>> 2015-01-14 11:01:32,697 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Rescanning after 4192060 milliseconds
>> 2015-01-14 11:01:32,704 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
>> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
>> registerDatanode: from DatanodeRegistration(10.10.10.63,
>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>> 2015-01-14 11:01:38,039 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>> failed storage changes from 0 to 0
>> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology:
>> Adding a new node: /default-rack/10.10.10.63:50010
>> 2015-01-14 11:01:38,557 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>> failed storage changes from 0 to 0
>> 2015-01-14 11:01:38,562 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>> 10.10.10.63:50010
>> 2015-01-14 11:01:38,692 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>> processReport: Received first block report from
>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>> starting up or becoming active. Its block contents are no longer considered
>> stale
>> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport: from
>> storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>> DatanodeRegistration(10.10.10.63,
>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>> blocks: 0, hasStaleStorages: false, processing time: 9 msecs
>> 2015-01-14 11:02:02,697 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Rescanning after 30000 milliseconds
>> 2015-01-14 11:02:02,698 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>> 2015-01-14 11:02:21,288 ERROR
>> org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
>> 2015-01-14 11:02:21,291 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
>> ************************************************************/
>> 2015-01-14 11:03:02,845 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting NameNode
>> STARTUP_MSG:   host = bigdata/10.10.10.63
>> STARTUP_MSG:   args = []
>> STARTUP_MSG:   version = 2.6.0
>> STARTUP_MSG:   classpath =
>> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.j
ar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local
/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/loc
al/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
>> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git
>> -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
>> 2014-11-13T21:10Z
>> STARTUP_MSG:   java = 1.7.0_45
>> ************************************************************/
>> 2015-01-14 11:03:02,861 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
>> handlers for [TERM, HUP, INT]
>> 2015-01-14 11:03:02,866 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>> 2015-01-14 11:03:03,521 INFO
>> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
>> hadoop-metrics2.properties
>> 2015-01-14 11:03:03,697 INFO
>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
>> period at 10 second(s).
>> 2015-01-14 11:03:03,697 INFO
>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
>> started
>> 2015-01-14 11:03:03,700 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
>> hdfs://bigdata:9000
>> 2015-01-14 11:03:03,701 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
>> bigdata:9000 to access this namenode/service.
>> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader:
>> Unable to load native-hadoop library for your platform... using
>> builtin-java classes where applicable
>> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
>> Web-server for hdfs at: http://0.0.0.0:50070
>> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to
>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>> org.mortbay.log.Slf4jLog
>> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog: Http
>> request log for http.requests.namenode is not defined
>> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added
>> global filter 'safety'
>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>> filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context hdfs
>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>> filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context logs
>> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
>> filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context static
>> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added
>> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
>> (class=org.apache.hadoop.hdfs.web.AuthFilter)
>> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2:
>> addJerseyResourcePackage:
>> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
>> pathSpec=/webhdfs/v1/*
>> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty
>> bound to port 50070
>> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
>> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$
>> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
>> 2015-01-14 11:03:05,645 WARN
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
>> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
>> lack of redundant storage directories!
>> 2015-01-14 11:03:05,645 WARN
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
>> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
>> loss due to lack of redundant storage directories!
>> 2015-01-14 11:03:05,746 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
>> 2015-01-14 11:03:05,761 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
>> 2015-01-14 11:03:05,837 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>> dfs.block.invalidate.limit=1000
>> 2015-01-14 11:03:05,837 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
>> dfs.namenode.datanode.registration.ip-hostname-check=true
>> 2015-01-14 11:03:05,841 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
>> 2015-01-14 11:03:05,843 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
>> deletion will start around 2015 Jan 14 11:03:05
>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing
>> capacity for map BlocksMap
>> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type       =
>> 64-bit
>> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max memory
>> 889 MB = 17.8 MB
>> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity      =
>> 2^21 = 2097152 entries
>> 2015-01-14 11:03:05,864 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> dfs.block.access.token.enable=false
>> 2015-01-14 11:03:05,865 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> defaultReplication         = 1
>> 2015-01-14 11:03:05,865 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>>             = 512
>> 2015-01-14 11:03:05,865 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>>             = 1
>> 2015-01-14 11:03:05,865 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> maxReplicationStreams      = 2
>> 2015-01-14 11:03:05,865 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> shouldCheckForEnoughRacks  = false
>> 2015-01-14 11:03:05,865 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> replicationRecheckInterval = 3000
>> 2015-01-14 11:03:05,865 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> encryptDataTransfer        = false
>> 2015-01-14 11:03:05,865 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
>> maxNumBlocksToLog          = 1000
>> 2015-01-14 11:03:05,874 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
>> hadoop2 (auth:SIMPLE)
>> 2015-01-14 11:03:05,874 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
>> supergroup
>> 2015-01-14 11:03:05,874 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
>> true
>> 2015-01-14 11:03:05,875 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
>> 2015-01-14 11:03:05,878 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing
>> capacity for map INodeMap
>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type       =
>> 64-bit
>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max memory
>> 889 MB = 8.9 MB
>> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity      =
>> 2^20 = 1048576 entries
>> 2015-01-14 11:03:06,284 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
>> occuring more than 10 times
>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing
>> capacity for map cachedBlocks
>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type       =
>> 64-bit
>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max
>> memory 889 MB = 2.2 MB
>> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity      =
>> 2^18 = 262144 entries
>> 2015-01-14 11:03:06,301 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
>> 2015-01-14 11:03:06,301 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>> dfs.namenode.safemode.min.datanodes = 0
>> 2015-01-14 11:03:06,301 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
>> dfs.namenode.safemode.extension     = 30000
>> 2015-01-14 11:03:06,304 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
>> namenode is enabled
>> 2015-01-14 11:03:06,304 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
>> 0.03 of total heap and retry cache entry expiry time is 600000 millis
>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing
>> capacity for map NameNodeRetryCache
>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type       =
>> 64-bit
>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet:
>> 0.029999999329447746% max memory 889 MB = 273.1 KB
>> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity      =
>> 2^15 = 32768 entries
>> 2015-01-14 11:03:06,317 INFO
>> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
>> 2015-01-14 11:03:06,318 INFO
>> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
>> 2015-01-14 11:03:06,318 INFO
>> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
>> 16384
>> 2015-01-14 11:03:06,368 INFO
>> org.apache.hadoop.hdfs.server.common.Storage: Lock on
>> /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
>> 13312@bigdata
>> 2015-01-14 11:03:06,532 INFO
>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
>> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
>> 2015-01-14 11:03:06,622 INFO
>> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
>> file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095
>> ->
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>> 2015-01-14 11:03:06,807 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
>> INodes.
>> 2015-01-14 11:03:06,888 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
>> FSImage in 0 seconds.
>> 2015-01-14 11:03:06,888 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
>> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
>> 2015-01-14 11:03:06,889 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
>> expecting start txid #84
>> 2015-01-14 11:03:06,889 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>> 2015-01-14 11:03:06,893 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
>> to transaction ID 84
>> 2015-01-14 11:03:06,897 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,897 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
>> expecting start txid #86
>> 2015-01-14 11:03:06,898 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>> 2015-01-14 11:03:06,898 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
>> to transaction ID 84
>> 2015-01-14 11:03:06,898 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,899 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
>> expecting start txid #88
>> 2015-01-14 11:03:06,899 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>> 2015-01-14 11:03:06,899 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
>> to transaction ID 84
>> 2015-01-14 11:03:06,899 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,900 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
>> expecting start txid #90
>> 2015-01-14 11:03:06,900 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>> 2015-01-14 11:03:06,900 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
>> to transaction ID 84
>> 2015-01-14 11:03:06,901 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,901 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
>> expecting start txid #92
>> 2015-01-14 11:03:06,901 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>> 2015-01-14 11:03:06,901 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
>> to transaction ID 84
>> 2015-01-14 11:03:06,902 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
>> of size 42 edits # 2 loaded in 0 seconds
>> 2015-01-14 11:03:06,902 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b
>> expecting start txid #94
>> 2015-01-14 11:03:06,902 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>> 2015-01-14 11:03:06,902 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
>> to transaction ID 84
>> 2015-01-14 11:03:06,907 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
>> of size 1048576 edits # 1 loaded in 0 seconds
>> 2015-01-14 11:03:06,908 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
>> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2
>> expecting start txid #95
>> 2015-01-14 11:03:06,908 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>> 2015-01-14 11:03:06,908 INFO
>> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
>> stream
>> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095'
>> to transaction ID 84
>> 2015-01-14 11:03:07,266 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
>> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
>> of size 1048576 edits # 1 loaded in 0 seconds
>> 2015-01-14 11:03:07,274 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
>> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
>> 2015-01-14 11:03:07,313 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
>> 2015-01-14 11:03:07,558 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
>> entries 0 lookups
>> 2015-01-14 11:03:07,559 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
>> FSImage in 1240 msecs
>> 2015-01-14 11:03:08,011 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
>> bigdata:9000
>> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager:
>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting
>> Socket Reader #1 for port 9000
>> 2015-01-14 11:03:08,151 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
>> FSNamesystemState MBean
>> 2015-01-14 11:03:08,173 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>> construction: 0
>> 2015-01-14 11:03:08,173 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
>> construction: 0
>> 2015-01-14 11:03:08,173 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
>> replication queues
>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>> Leaving safe mode after 2 secs
>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>> Network topology has 0 racks and 0 datanodes
>> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>> UnderReplicatedBlocks has 0 blocks
>> 2015-01-14 11:03:08,194 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
>> blocks            = 0
>> 2015-01-14 11:03:08,194 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>> invalid blocks          = 0
>> 2015-01-14 11:03:08,194 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>> under-replicated blocks = 0
>> 2015-01-14 11:03:08,194 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>>  over-replicated blocks = 0
>> 2015-01-14 11:03:08,194 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>> blocks being written    = 0
>> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE*
>> Replication Queue initialization scan for invalid, over- and
>> under-replicated blocks completed in 18 msec
>> 2015-01-14 11:03:08,322 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
>> bigdata/10.10.10.63:9000
>> 2015-01-14 11:03:08,322 INFO
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
>> required for active state
>> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server
>> Responder: starting
>> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server
>> listener on 9000: starting
>> 2015-01-14 11:03:08,349 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Starting CacheReplicationMonitor with interval 30000 milliseconds
>> 2015-01-14 11:03:08,349 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Rescanning after 4287712 milliseconds
>> 2015-01-14 11:03:08,350 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
>> registerDatanode: from DatanodeRegistration(10.10.10.63,
>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
>> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
>> 2015-01-14 11:03:13,244 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>> failed storage changes from 0 to 0
>> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology:
>> Adding a new node: /default-rack/10.10.10.63:50010
>> 2015-01-14 11:03:13,743 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
>> failed storage changes from 0 to 0
>> 2015-01-14 11:03:13,750 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
>> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
>> 10.10.10.63:50010
>> 2015-01-14 11:03:13,959 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
>> processReport: Received first block report from
>> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
>> starting up or becoming active. Its block contents are no longer considered
>> stale
>> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport: from
>> storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
>> DatanodeRegistration(10.10.10.63,
>> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
>> blocks: 0, hasStaleStorages: false, processing time: 11 msecs
>> 2015-01-14 11:03:38,349 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Rescanning after 30000 milliseconds
>> 2015-01-14 11:03:38,350 INFO
>> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
>> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
>> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
>>
>>
>> Thanks
>> Mahesh.S
>>
>>
>> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com> wrote:
>>
>>> Hi Mahesh,
>>>     We will need the namenode logs to debug this further. Can you
>>> restart the namenode and paste its logs somewhere for us to analyze?
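>>> For instance (a minimal sketch assuming a stock Apache Hadoop 2.x layout;
>>> HADOOP_HOME and the exact log file name are placeholders to adjust for
>>> your install):
>>>
>>>   $HADOOP_HOME/sbin/stop-dfs.sh
>>>   $HADOOP_HOME/sbin/start-dfs.sh
>>>   tail -n 500 $HADOOP_HOME/logs/hadoop-*-namenode-*.log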
>>> Thanks.
>>>
>>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <
>>> sankarmahesh37@gmail.com> wrote:
>>>
>>>> Hi Ramesh,
>>>>
>>>>                   I didn't see any exceptions in the hdfs logs. My
>>>> problem is that the agent for hdfs is not created.
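>>>>
>>>> A quick sanity check that the namenode host can reach the policy manager
>>>> (a sketch; http://IP:6080 stands in for the POLICY_MGR_URL placeholder
>>>> from install.properties):
>>>>
>>>>   curl -s -o /dev/null -w '%{http_code}\n' http://IP:6080/
>>>>
>>>> A 200 response here only confirms connectivity, not agent registration.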
>>>>
>>>> Regards,
>>>> Mahesh.S
>>>>
>>>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com>
>>>> wrote:
>>>>
>>>>> Hi Mahesh,
>>>>>
>>>>> The error you are seeing is just a notice that the parent folder of
>>>>> the resource you are creating doesn't have read permission for the
>>>>> user for whom you are creating the policy.
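>>>>>
>>>>> You can verify that from the shell (a sketch; the path and the user
>>>>> are placeholders for whatever resource the policy covers):
>>>>>
>>>>>   hdfs dfs -ls /parent/of/resource    # parent must be readable by that user
>>>>>   hdfs dfs -chmod o+rx /parent/of/resource    # one way to grant it, if appropriate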
>>>>>
>>>>> When you start the hdfs namenode and secondary namenode, do you see
>>>>> any exceptions in the hdfs logs?
>>>>>
>>>>> Regards,
>>>>> Ramesh
>>>>>
>>>>> On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Hi all,
>>>>>
>>>>> I successfully configured ranger admin,user sync.now am trying to
>>>>> configure hdfs plugin.My steps are following,
>>>>>
>>>>> 1.Created repository testhdfs.
>>>>> 2.cd /usr/local
>>>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>>>> 5.cd ranger-hdfs-plugin
>>>>> 6.vi install.properties
>>>>>   POLICY_MGR_URL=http://IP:6080
>>>>>           REPOSITORY_NAME=testhdfs
>>>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>>>> 7.cd /usr/local/hadoop
>>>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>>>> 10.cd /usr/local/ranger-hdfs-plugin
>>>>> 11../enable-hdfs-plugin.sh
>>>>> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>>> 13.vi xasecure-audit.xml
>>>>>  <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>>>                    <value>jdbc:mysql://localhost/ranger</value>
>>>>>                    </property>
>>>>>                    <property>
>>>>>
>>>>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>>>                    <value>rangerlogger</value>
>>>>>                    </property>
>>>>>                    <property>
>>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>>>                    <value>rangerlogger</value>
>>>>>                    </property>
>>>>> 14.Restarted hadoop
>>>>> when i see Ranger Admin Web interface -> Audit -> Agents
>>>>> agent is not created.Am i missed any steps.
>>>>>
>>>>> *NOTE:I am not using HDP.*
>>>>>
>>>>> *here is my xa_portal.log*
>>>>>
>>>>> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_default.properties]
>>>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_system.properties]
>>>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_custom.properties]
>>>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_ldap.properties]
>>>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>> Unable to load native-hadoop library for your platform... using
>>>>> builtin-java classes where applicable
>>>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [db_message_bundle.properties]
>>>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>> user
>>>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>>>>> requestId=10.10.10.53
>>>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>>>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>>>>> create:nfsnobody@bigdata
>>>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_default.properties]
>>>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_system.properties]
>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_custom.properties]
>>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_ldap.properties]
>>>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>> Unable to load native-hadoop library for your platform... using
>>>>> builtin-java classes where applicable
>>>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [db_message_bundle.properties]
>>>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>> user
>>>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>>>>> requestId=10.10.10.53
>>>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>> user
>>>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>>>>> requestId=10.10.10.53
>>>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>> Login: security not enabled, using username
>>>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>>> Login: security not enabled, using username
>>>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>>>>> permission on parent folder. Do you want to save this policy?
>>>>> javax.ws.rs.WebApplicationException
>>>>> at
>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>> at
>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>> at
>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>> at
>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>> at
>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>> at
>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>> at
>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>> at
>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>> at
>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>> at
>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>> at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>> at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>> at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>>>> at
>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>> at
>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>> at
>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>> at
>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>> at
>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>> at
>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>> at
>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>> at
>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>> at
>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>> at
>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>> at
>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>> at
>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>> at
>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>> at
>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>> at
>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>> at
>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at
>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>> Validation error:logMessage=null,
>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>> to save this policy?}
>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>> }
>>>>> javax.ws.rs.WebApplicationException
>>>>> at
>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>> at
>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>> at
>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>>> at
>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>> at
>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>> at
>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>> at
>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>> at
>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>> at
>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>> at
>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>> at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>> at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>> at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>> at
>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>> at
>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>> at
>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>> at
>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>> at
>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>> at
>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>> at
>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>> at
>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>> at
>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>> at
>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>> at
>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>> at
>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>> at
>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>> at
>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>> at
>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>> at
>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at
>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>> user
>>>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>>>>> requestId=10.10.10.53
>>>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_default.properties]
>>>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_system.properties]
>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_custom.properties]
>>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [xa_ldap.properties]
>>>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>>> Unable to load native-hadoop library for your platform... using
>>>>> builtin-java classes where applicable
>>>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>>> path resource [db_message_bundle.properties]
>>>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>> user
>>>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>>>>> requestId=10.10.10.53
>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>>>>> permission on parent folder. Do you want to save this policy?
>>>>> javax.ws.rs.WebApplicationException
>>>>> at
>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>>> at
>>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>>> at
>>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>>> at
>>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>>> at
>>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>>> at
>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>>> at
>>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>>> at
>>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>>> at
>>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>>> at
>>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>> at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>> at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>> at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>> at
>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>> at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>> at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>> at
>>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>>> at
>>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>>> at
>>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>>> at
>>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>>> at
>>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>>> at
>>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>>> at
>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>>> at
>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>>> at
>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>>> at
>>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>>> at
>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>>> at
>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>>> at
>>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>>> at
>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>>> at
>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>>> at
>>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>>> at
>>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>>> at
>>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>> at
>>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>>> at java.lang.Thread.run(Thread.java:744)
>>>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>>> Validation error:logMessage=null,
>>>>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
>>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>>> to save this policy?}
>>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
>>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>>> }
>>>>> javax.ws.rs.WebApplicationException
>>>>> [same stack trace as above]
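The OPER_NO_PERMISSION warning above is the Ranger admin saying the user named in the policy may lack read access on the parent of the resource path. One way to inspect the underlying HDFS permissions (illustrative path, not one from this thread):

    # show owner/group/mode of the parent directory itself
    hdfs dfs -ls -d /parent/of/resource
    # grant read+execute to others, if that is really intended
    hdfs dfs -chmod o+rx /parent/of/resource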
>>>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
>>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>>> Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>>> user
>>>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
>>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>>> loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
>>>>> requestId=10.10.10.53
>>>>>
>>>>> Thanks
>>>>> Mahesh.S
>>>
>>>
>>> --
>>> Regards,
>>> Gautam.
>>>
>>
>>
>
>

Re: Hdfs agent not created

Posted by Ramesh Mani <rm...@hortonworks.com>.
Hi Mahesh,

This exception is related to the datanode not coming up for some reason, but the Ranger plugin runs in the namenode.

Do you see the namenode and secondary namenode running after the Ranger installation and after restarting both of them?

In the classpath of the namenode I don’t see any ranger*.jar. Do you have it in the hadoop/lib directory?

Also, can I get the contents of xasecure-hdfs-security.xml from the conf directory?
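
A quick way to check all of this on the namenode host (a minimal sketch; paths assume the /usr/local/hadoop layout described earlier in this thread):

    # confirm the namenode and secondary namenode processes are up
    jps | grep -i namenode
    # look for the Ranger plugin jars on the namenode classpath
    ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -i ranger
    # show the plugin configuration in the conf directory
    cat /usr/local/hadoop/conf/xasecure-hdfs-security.xml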

Regards,
Ramesh

On Jan 13, 2015, at 10:23 PM, Mahesh Sankaran <sa...@gmail.com> wrote:

> Hi Gautam,
> 
> Now I am seeing the following exception. Is this causing the problem?
> 
> 2015-01-14 11:41:23,102 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
> java.io.EOFException: End of File Exception between local host is: "bigdata/10.10.10.63"; destination host is: "bigdata":9000; : java.io.EOFException; For more details see:  http://wiki.apache.org/hadoop/EOFException
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> 	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> 	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
> 	at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
> 	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
> 	at java.lang.Thread.run(Thread.java:744)
> Caused by: java.io.EOFException
> 	at java.io.DataInputStream.readInt(DataInputStream.java:392)
> 	at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> 	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2015-01-14 11:41:25,981 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
> 2015-01-14 11:41:25,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
> ************************************************************/
> 2015-01-14 11:42:03,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
> /************************************************************
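In case it helps to rule out basic connectivity, a quick check from the datanode host (host and port taken from the exception above; assumes netcat is installed):

    # is anything listening on the namenode RPC port?
    nc -z bigdata 9000 && echo "namenode rpc reachable"
    # are the HDFS daemons actually running?
    jps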
> 
> Thanks
> Mahesh.S
> 
> On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
> Hi Gautam,
> 
> Here is my namenode log. Kindly see it.
> 
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
> ************************************************************/
> 2015-01-14 11:01:27,345 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG: 
> /************************************************************
> STARTUP_MSG: Starting NameNode
> STARTUP_MSG:   host = bigdata/10.10.10.63
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 2.6.0
> STARTUP_MSG:   classpath = /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/c
ommon/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zook
eeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-co
re-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
> STARTUP_MSG:   java = 1.7.0_45
> ************************************************************/
> 2015-01-14 11:01:27,363 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
> 2015-01-14 11:01:27,368 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
> 2015-01-14 11:01:28,029 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
> 2015-01-14 11:01:28,205 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
> 2015-01-14 11:01:28,205 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
> 2015-01-14 11:01:28,209 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://bigdata:9000
> 2015-01-14 11:01:28,209 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use bigdata:9000 to access this namenode/service.
> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
> 2015-01-14 11:01:30,093 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
> 2015-01-14 11:01:30,093 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
> 2015-01-14 11:01:30,184 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
> 2015-01-14 11:01:30,196 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
> 2015-01-14 11:01:30,262 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
> 2015-01-14 11:01:30,262 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
> 2015-01-14 11:01:30,266 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
> 2015-01-14 11:01:30,268 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2015 Jan 14 11:01:30
> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity      = 2^21 = 2097152 entries
> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
> 2015-01-14 11:01:30,289 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
> 2015-01-14 11:01:30,290 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
> 2015-01-14 11:01:30,298 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
> 2015-01-14 11:01:30,299 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
> 2015-01-14 11:01:30,302 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity      = 2^20 = 1048576 entries
> 2015-01-14 11:01:30,648 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity      = 2^18 = 262144 entries
> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
> 2015-01-14 11:01:30,669 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
> 2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
> 2015-01-14 11:01:30,674 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
> 2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
> 2015-01-14 11:01:30,729 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename 11417@bigdata
> 2015-01-14 11:01:30,963 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
> 2015-01-14 11:01:31,065 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094 -> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
> 2015-01-14 11:01:31,210 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2 INodes.
> 2015-01-14 11:01:31,293 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
> 2015-01-14 11:01:31,293 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83 from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
> 2015-01-14 11:01:31,294 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5 expecting start txid #84
> 2015-01-14 11:01:31,294 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
> 2015-01-14 11:01:31,299 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085' to transaction ID 84
> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972 expecting start txid #86
> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
> 2015-01-14 11:01:31,303 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087' to transaction ID 84
> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b expecting start txid #88
> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
> 2015-01-14 11:01:31,304 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089' to transaction ID 84
> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe expecting start txid #90
> 2015-01-14 11:01:31,305 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091' to transaction ID 84
> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09 expecting start txid #92
> 2015-01-14 11:01:31,306 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093' to transaction ID 84
> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,307 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b expecting start txid #94
> 2015-01-14 11:01:31,308 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
> 2015-01-14 11:01:31,308 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094' to transaction ID 84
> 2015-01-14 11:01:31,313 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094 of size 1048576 edits # 1 loaded in 0 seconds
> 2015-01-14 11:01:31,317 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
> 2015-01-14 11:01:31,346 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
> 2015-01-14 11:01:31,904 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
> 2015-01-14 11:01:31,904 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 1216 msecs
> 2015-01-14 11:01:32,427 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to bigdata:9000
> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
> 2015-01-14 11:01:32,568 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState MBean
> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
> 2015-01-14 11:01:32,588 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing replication queues
> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 2 secs
> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
> 2015-01-14 11:01:32,645 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 52 msec
> 2015-01-14 11:01:32,676 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: bigdata/10.10.10.63:9000
> 2015-01-14 11:01:32,676 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
> 2015-01-14 11:01:32,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
> 2015-01-14 11:01:32,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 4192060 milliseconds
> 2015-01-14 11:01:32,704 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0) storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
> 2015-01-14 11:01:38,039 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/10.10.10.63:50010
> 2015-01-14 11:01:38,557 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
> 2015-01-14 11:01:38,562 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN 10.10.10.63:50010
> 2015-01-14 11:01:38,692 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 9 msecs
> 2015-01-14 11:02:02,697 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
> 2015-01-14 11:02:02,698 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
> 2015-01-14 11:02:21,288 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
> 2015-01-14 11:02:21,291 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG: 
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
> ************************************************************/
> 2015-01-14 11:03:02,845 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG: 
> /************************************************************
> STARTUP_MSG: Starting NameNode
> STARTUP_MSG:   host = bigdata/10.10.10.63
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 2.6.0
> STARTUP_MSG:   classpath = /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/c
ommon/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zook
eeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-co
re-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
> STARTUP_MSG:   java = 1.7.0_45
> ************************************************************/
> 2015-01-14 11:03:02,861 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
> 2015-01-14 11:03:02,866 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
> 2015-01-14 11:03:03,521 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
> 2015-01-14 11:03:03,697 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
> 2015-01-14 11:03:03,697 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
> 2015-01-14 11:03:03,700 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://bigdata:9000
> 2015-01-14 11:03:03,701 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use bigdata:9000 to access this namenode/service.
> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
> 2015-01-14 11:03:05,645 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
> 2015-01-14 11:03:05,645 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
> 2015-01-14 11:03:05,746 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
> 2015-01-14 11:03:05,761 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
> 2015-01-14 11:03:05,837 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
> 2015-01-14 11:03:05,837 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
> 2015-01-14 11:03:05,841 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
> 2015-01-14 11:03:05,843 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2015 Jan 14 11:03:05
> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max memory 889 MB = 17.8 MB
> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity      = 2^21 = 2097152 entries
> 2015-01-14 11:03:05,864 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable=false
> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication         = 1
> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication             = 512
> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication             = 1
> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams      = 2
> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: replicationRecheckInterval = 3000
> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer        = false
> 2015-01-14 11:03:05,865 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             = hadoop2 (auth:SIMPLE)
> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          = supergroup
> 2015-01-14 11:03:05,874 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
> 2015-01-14 11:03:05,875 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
> 2015-01-14 11:03:05,878 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max memory 889 MB = 8.9 MB
> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity      = 2^20 = 1048576 entries
> 2015-01-14 11:03:06,284 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times
> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max memory 889 MB = 2.2 MB
> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity      = 2^18 = 262144 entries
> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
> 2015-01-14 11:03:06,301 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
> 2015-01-14 11:03:06,304 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
> 2015-01-14 11:03:06,304 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
> 2015-01-14 11:03:06,317 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
> 2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
> 2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr: 16384
> 2015-01-14 11:03:06,368 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename 13312@bigdata
> 2015-01-14 11:03:06,532 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
> 2015-01-14 11:03:06,622 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095 -> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
> 2015-01-14 11:03:06,807 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2 INodes.
> 2015-01-14 11:03:06,888 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
> 2015-01-14 11:03:06,888 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83 from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
> 2015-01-14 11:03:06,889 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972 expecting start txid #84
> 2015-01-14 11:03:06,889 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
> 2015-01-14 11:03:06,893 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085' to transaction ID 84
> 2015-01-14 11:03:06,897 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,897 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b expecting start txid #86
> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087' to transaction ID 84
> 2015-01-14 11:03:06,898 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe expecting start txid #88
> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089' to transaction ID 84
> 2015-01-14 11:03:06,899 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09 expecting start txid #90
> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
> 2015-01-14 11:03:06,900 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091' to transaction ID 84
> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b expecting start txid #92
> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
> 2015-01-14 11:03:06,901 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093' to transaction ID 84
> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093 of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b expecting start txid #94
> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
> 2015-01-14 11:03:06,902 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094' to transaction ID 84
> 2015-01-14 11:03:06,907 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094 of size 1048576 edits # 1 loaded in 0 seconds
> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2 expecting start txid #95
> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
> 2015-01-14 11:03:06,908 INFO org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding stream '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095' to transaction ID 84
> 2015-01-14 11:03:07,266 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095 of size 1048576 edits # 1 loaded in 0 seconds
> 2015-01-14 11:03:07,274 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
> 2015-01-14 11:03:07,313 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
> 2015-01-14 11:03:07,558 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
> 2015-01-14 11:03:07,559 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 1240 msecs
> 2015-01-14 11:03:08,011 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to bigdata:9000
> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
> 2015-01-14 11:03:08,151 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState MBean
> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under construction: 0
> 2015-01-14 11:03:08,173 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing replication queues
> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 2 secs
> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 18 msec
> 2015-01-14 11:03:08,322 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: bigdata/10.10.10.63:9000
> 2015-01-14 11:03:08,322 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
> 2015-01-14 11:03:08,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
> 2015-01-14 11:03:08,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 4287712 milliseconds
> 2015-01-14 11:03:08,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0) storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
> 2015-01-14 11:03:13,244 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/10.10.10.63:50010
> 2015-01-14 11:03:13,743 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of failed storage changes from 0 to 0
> 2015-01-14 11:03:13,750 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN 10.10.10.63:50010
> 2015-01-14 11:03:13,959 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 11 msecs
> 2015-01-14 11:03:38,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
> 2015-01-14 11:03:38,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
> 
> 
> Thanks
> Mahesh.S
> 
> 
> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com> wrote:
> Hi Mahesh,
>     We will need the NameNode logs to debug this further. Can you restart the NameNode and paste its logs somewhere for us to analyze? Thanks.
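
For reference, a minimal sketch of one way to do that on a plain Apache Hadoop 2.6 install, assuming the stock sbin scripts and the default log directory $HADOOP_HOME/logs (adjust if HADOOP_LOG_DIR is set):

    # bounce only the NameNode daemon
    $HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
    $HADOOP_HOME/sbin/hadoop-daemon.sh start namenode

    # the log file name embeds the unix user and the hostname
    tail -n 500 $HADOOP_HOME/logs/hadoop-$(whoami)-namenode-$(hostname).log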
> 
> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
> Hi Ramesh,
> 
>                   I didn't see any exceptions in the HDFS logs. My problem is that the agent for HDFS is not created; a quick log check is sketched below.
> 
> Regards,
> Mahesh.S
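
A quick way to tell whether the NameNode picked up the plugin at all, sketched under two assumptions: that the Ranger 0.4 HDFS agent logs through xasecure/ranger-named classes during startup, and that http://IP:6080 is the POLICY_MGR_URL placeholder from install.properties:

    # any plugin activity should show up under these names in the NameNode log
    grep -icE 'xasecure|ranger' $HADOOP_HOME/logs/hadoop-$(whoami)-namenode-$(hostname).log

    # confirm the NameNode host can reach the policy manager at all
    curl -s -o /dev/null -w '%{http_code}\n' http://IP:6080

If the grep count is 0, the authorizer was most likely never enabled on the NameNode, which would also explain why nothing registers under Audit -> Agents.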
> 
> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com> wrote:
> Hi Mahesh,
> 
> The error you are seeing is just a notice that the parent folder of the resource you are creating doesn't have read permission for the user for whom you are creating the policy.
> 
> When you start the HDFS NameNode and Secondary NameNode, do you see any exceptions in the HDFS logs?
> 
> Regards,
> Ramesh
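
If that notice is unexpected, the parent folder can be inspected, and where appropriate opened up, with the standard HDFS CLI. A sketch, with /path/to/parent standing in for the actual parent of the policy resource:

    # show owner and permissions of the parent folder itself
    hdfs dfs -ls -d /path/to/parent

    # optionally grant read+execute to others so the policy user can traverse it
    hdfs dfs -chmod o+rx /path/to/parent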
> 
> On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com> wrote:
> 
>> Hi all,
>> 
>> I successfully configured ranger admin,user sync.now am trying to configure hdfs plugin.My steps are following,
>> 
>> 1.Created repository testhdfs.
>> 2.cd /usr/local
>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>> 5.cd ranger-hdfs-plugin
>> 6.vi install.properties
>> 	  POLICY_MGR_URL=http://IP:6080
>>           REPOSITORY_NAME=testhdfs
>>           XAAUDIT.DB.HOSTNAME=localhost
>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>           XAAUDIT.DB.PASSWORD=rangerlogger
>> 7.cd /usr/local/hadoop
>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>> 9.export HADOOP_HOME=/usr/local/hadoop
>> 10.cd /usr/local/ranger-hdfs-plugin
>> 11../enable-hdfs-plugin.sh
>> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>> 13.vi xasecure-audit.xml
>>  <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>                    <value>jdbc:mysql://localhost/ranger</value>
>>                    </property>
>>                    <property>
>>                    <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>                    <value>rangerlogger</value>
>>                    </property>
>>                    <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>                    <value>rangerlogger</value>
>>                    </property>
>> 14.Restarted hadoop
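
Since the agent writes its audit records using the JDBC settings from steps 6 and 13, one sanity check from the NameNode host is to log in with exactly those credentials; a failure here would also keep anything from appearing under Audit. A sketch, assuming the mysql client is installed and the values above are unchanged:

    # verify the rangerlogger account can actually reach the ranger audit database
    mysql -h localhost -u rangerlogger -prangerlogger ranger -e 'SHOW TABLES;'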
>> when i see Ranger Admin Web interface -> Audit -> Agents
>> agent is not created.Am i missed any steps.
>> 
>> NOTE:I am not using HDP.
>> 
>> here is my xa_portal.log
>> 
>> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12, requestId=10.10.10.53
>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO  org.apache.ranger.rest.UserREST (UserREST.java:186) - create:nfsnobody@bigdata
>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0, requestId=10.10.10.53
>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD, requestId=10.10.10.53
>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
>> javax.ws.rs.WebApplicationException
>> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>> 	at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>> 	at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>> 	at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>> 	at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>> 	at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>> 	at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>> 	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>> 	at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>> 	at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> 	at java.lang.reflect.Method.invoke(Method.java:606)
>> 	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>> 	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>> 	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> 	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> 	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>> 	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>> 	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>> 	at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>> 	at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>> 	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>> 	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>> 	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>> 	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>> 	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>> 	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>> 	at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>> 	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>> 	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>> 	at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>> 	at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>> 	at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> 	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>> 	at java.lang.Thread.run(Thread.java:744)
>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
>> javax.ws.rs.WebApplicationException
>> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>> 	at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>> 	at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>> 	at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>> 	at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>> 	at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>> 	at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>> 	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>> 	at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>> 	at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> 	at java.lang.reflect.Method.invoke(Method.java:606)
>> 	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>> 	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>> 	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> 	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> 	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>> 	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>> 	at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>> 	at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>> 	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>> 	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>> 	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>> 	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>> 	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>> 	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>> 	at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>> 	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>> 	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>> 	at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>> 	at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>> 	at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> 	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>> 	at java.lang.Thread.run(Thread.java:744)
>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655, requestId=10.10.10.53
>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A, requestId=10.10.10.53
>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
>> javax.ws.rs.WebApplicationException
>> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>> 	at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>> 	at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>> 	at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>> 	at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>> 	at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>> 	at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>> 	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>> 	at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>> 	at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> 	at java.lang.reflect.Method.invoke(Method.java:606)
>> 	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>> 	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>> 	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> 	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> 	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>> 	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>> 	at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>> 	at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>> 	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>> 	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>> 	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>> 	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>> 	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>> 	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>> 	at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>> 	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>> 	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>> 	at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>> 	at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>> 	at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> 	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>> 	at java.lang.Thread.run(Thread.java:744)
>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
>> javax.ws.rs.WebApplicationException
>> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>> 	at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>> 	at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>> 	at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>> 	at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>> 	at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>> 	at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>> 	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>> 	at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>> 	at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> 	at java.lang.reflect.Method.invoke(Method.java:606)
>> 	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>> 	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>> 	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> 	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> 	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>> 	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>> 	at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> 	at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>> 	at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>> 	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>> 	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> 	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>> 	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>> 	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>> 	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>> 	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>> 	at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>> 	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>> 	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>> 	at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>> 	at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>> 	at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> 	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>> 	at java.lang.Thread.run(Thread.java:744)
>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264, requestId=10.10.10.53
>> 
>> Thanks
>> Mahesh.S
> 
> 
> -- 
> Regards,
> Gautam.
> 
> 
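
Two things stand out in the xa_portal.log excerpt above. The ERROR from
RangerDaoManager.getEntityManager(loggingPU) appears to point at the audit
persistence unit, i.e. the admin may not be reaching the audit database,
which could also explain the agent not showing up under Audit -> Agents.
The repeated "may not have read permission on parent folder" message looks
like Ranger's pre-save check that the user named in the policy can read the
resource's parent path; per the message text it is only a prompt, and the
policy can still be saved. Both are quick to sanity-check from a shell; a
minimal sketch, where <audit_user>, <audit_host>, <audit_db>, <policy_user>
and /<parent_folder> are placeholders to be replaced with the values from
install.properties and the policy:

# Can the audit user reach the audit DB at all?
mysql -u <audit_user> -p -h <audit_host> <audit_db> -e "SHOW TABLES;"

# Does the policy user really have read access on the parent folder?
sudo -u <policy_user> hdfs dfs -ls /<parent_folder>

If the mysql call fails, the JDBC settings in xasecure-audit.xml are the
first place to look.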



Re: Hdfs agent not created

Posted by Mahesh Sankaran <sa...@gmail.com>.
Hi Gautam,

                Now I am seeing the following exception. Could this be
causing the problem?

2015-01-14 11:41:23,102 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.io.EOFException: End of File Exception between local host is: "bigdata/
10.10.10.63"; destination host is: "bigdata":9000; : java.io.EOFException;
For more details see:  http://wiki.apache.org/hadoop/EOFException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
at com.sun.proxy.$Proxy14.sendHeartbeat(Unknown Source)
at
org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:139)
at
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:582)
at
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:680)
at
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:850)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at
org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2015-01-14 11:41:25,981 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
2015-01-14 11:41:25,984 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at bigdata/10.10.10.63
************************************************************/
2015-01-14 11:42:03,054 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************

Thanks
Mahesh.S
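
For what it's worth, the EOFException above is the DataNode's heartbeat RPC
to the NameNode at bigdata:9000 getting cut off mid-read, and the SIGTERM
two seconds later is the DataNode itself being stopped, so the sequence
reads like a cluster restart rather than a plugin fault. Two generic
post-restart checks; a sketch assuming only a working hdfs client on the
PATH, nothing Ranger-specific:

# Should print hdfs://bigdata:9000 (matching the NameNode log below)
hdfs getconf -confKey fs.defaultFS

# Confirms the NameNode is reachable and the DataNode has re-registered
hdfs dfsadmin -report

If both come back clean, the EOFException was most likely transient.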

On Wed, Jan 14, 2015 at 11:16 AM, Mahesh Sankaran <sa...@gmail.com>
wrote:

> Hi Gautam,
>
>               Here is my namenode log. Kindly take a look.
>
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
> ************************************************************/
> 2015-01-14 11:01:27,345 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting NameNode
> STARTUP_MSG:   host = bigdata/10.10.10.63
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 2.6.0
> STARTUP_MSG:   classpath =
> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.ja
r:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/
hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/loca
l/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git
> -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
> 2014-11-13T21:10Z
> STARTUP_MSG:   java = 1.7.0_45
> ************************************************************/
> 2015-01-14 11:01:27,363 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
> handlers for [TERM, HUP, INT]
> 2015-01-14 11:01:27,368 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
> 2015-01-14 11:01:28,029 INFO
> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 2015-01-14 11:01:28,205 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2015-01-14 11:01:28,205 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
> started
> 2015-01-14 11:01:28,209 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
> hdfs://bigdata:9000
> 2015-01-14 11:01:28,209 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
> bigdata:9000 to access this namenode/service.
> 2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader:
> Unable to load native-hadoop library for your platform... using
> builtin-java classes where applicable
> 2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
> Web-server for hdfs at: http://0.0.0.0:50070
> 2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to
> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> org.mortbay.log.Slf4jLog
> 2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog: Http
> request log for http.requests.namenode is not defined
> 2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added
> global filter 'safety'
> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context hdfs
> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context static
> 2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context logs
> 2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added
> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
> (class=org.apache.hadoop.hdfs.web.AuthFilter)
> 2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2:
> addJerseyResourcePackage:
> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
> pathSpec=/webhdfs/v1/*
> 2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty
> bound to port 50070
> 2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
> 2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$
> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
> 2015-01-14 11:01:30,093 WARN
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
> lack of redundant storage directories!
> 2015-01-14 11:01:30,093 WARN
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
> loss due to lack of redundant storage directories!
> 2015-01-14 11:01:30,184 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
> 2015-01-14 11:01:30,196 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
> 2015-01-14 11:01:30,262 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
> dfs.block.invalidate.limit=1000
> 2015-01-14 11:01:30,262 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
> dfs.namenode.datanode.registration.ip-hostname-check=true
> 2015-01-14 11:01:30,266 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
> 2015-01-14 11:01:30,268 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
> deletion will start around 2015 Jan 14 11:01:30
> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing
> capacity for map BlocksMap
> 2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type       =
> 64-bit
> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max memory
> 889 MB = 17.8 MB
> 2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity      =
> 2^21 = 2097152 entries
> 2015-01-14 11:01:30,289 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> dfs.block.access.token.enable=false
> 2015-01-14 11:01:30,289 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> defaultReplication         = 1
> 2015-01-14 11:01:30,289 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>             = 512
> 2015-01-14 11:01:30,289 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>             = 1
> 2015-01-14 11:01:30,289 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> maxReplicationStreams      = 2
> 2015-01-14 11:01:30,290 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> shouldCheckForEnoughRacks  = false
> 2015-01-14 11:01:30,290 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> replicationRecheckInterval = 3000
> 2015-01-14 11:01:30,290 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> encryptDataTransfer        = false
> 2015-01-14 11:01:30,290 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> maxNumBlocksToLog          = 1000
> 2015-01-14 11:01:30,298 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
> hadoop2 (auth:SIMPLE)
> 2015-01-14 11:01:30,299 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
> supergroup
> 2015-01-14 11:01:30,299 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
> true
> 2015-01-14 11:01:30,299 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
> 2015-01-14 11:01:30,302 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing
> capacity for map INodeMap
> 2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type       =
> 64-bit
> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max memory
> 889 MB = 8.9 MB
> 2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity      =
> 2^20 = 1048576 entries
> 2015-01-14 11:01:30,648 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
> occuring more than 10 times
> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing
> capacity for map cachedBlocks
> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type       =
> 64-bit
> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max memory
> 889 MB = 2.2 MB
> 2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity      =
> 2^18 = 262144 entries
> 2015-01-14 11:01:30,669 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
> 2015-01-14 11:01:30,669 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
> dfs.namenode.safemode.min.datanodes = 0
> 2015-01-14 11:01:30,669 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
> dfs.namenode.safemode.extension     = 30000
> 2015-01-14 11:01:30,674 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
> namenode is enabled
> 2015-01-14 11:01:30,674 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
> 0.03 of total heap and retry cache entry expiry time is 600000 millis
> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing
> capacity for map NameNodeRetryCache
> 2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type       =
> 64-bit
> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet:
> 0.029999999329447746% max memory 889 MB = 273.1 KB
> 2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity      =
> 2^15 = 32768 entries
> 2015-01-14 11:01:30,687 INFO
> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
> 2015-01-14 11:01:30,687 INFO
> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
> 2015-01-14 11:01:30,687 INFO
> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
> 16384
> 2015-01-14 11:01:30,729 INFO org.apache.hadoop.hdfs.server.common.Storage:
> Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
> 11417@bigdata
> 2015-01-14 11:01:30,963 INFO
> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
> 2015-01-14 11:01:31,065 INFO
> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
> file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094
> ->
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
> 2015-01-14 11:01:31,210 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
> INodes.
> 2015-01-14 11:01:31,293 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
> FSImage in 0 seconds.
> 2015-01-14 11:01:31,293 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
> 2015-01-14 11:01:31,294 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5
> expecting start txid #84
> 2015-01-14 11:01:31,294 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
> 2015-01-14 11:01:31,299 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
> to transaction ID 84
> 2015-01-14 11:01:31,303 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,303 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
> expecting start txid #86
> 2015-01-14 11:01:31,303 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
> 2015-01-14 11:01:31,303 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
> to transaction ID 84
> 2015-01-14 11:01:31,304 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,304 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
> expecting start txid #88
> 2015-01-14 11:01:31,304 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
> 2015-01-14 11:01:31,304 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
> to transaction ID 84
> 2015-01-14 11:01:31,305 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,305 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
> expecting start txid #90
> 2015-01-14 11:01:31,305 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
> 2015-01-14 11:01:31,306 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
> to transaction ID 84
> 2015-01-14 11:01:31,306 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,306 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
> expecting start txid #92
> 2015-01-14 11:01:31,306 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
> 2015-01-14 11:01:31,307 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
> to transaction ID 84
> 2015-01-14 11:01:31,307 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:01:31,307 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
> expecting start txid #94
> 2015-01-14 11:01:31,308 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
> 2015-01-14 11:01:31,308 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
> to transaction ID 84
> 2015-01-14 11:01:31,313 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
> of size 1048576 edits # 1 loaded in 0 seconds
> 2015-01-14 11:01:31,317 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
> 2015-01-14 11:01:31,346 INFO
> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
> 2015-01-14 11:01:31,904 INFO
> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
> entries 0 lookups
> 2015-01-14 11:01:31,904 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
> FSImage in 1216 msecs
> 2015-01-14 11:01:32,427 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
> bigdata:9000
> 2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager: Using
> callQueue class java.util.concurrent.LinkedBlockingQueue
> 2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting Socket
> Reader #1 for port 9000
> 2015-01-14 11:01:32,568 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
> FSNamesystemState MBean
> 2015-01-14 11:01:32,588 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
> construction: 0
> 2015-01-14 11:01:32,588 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
> construction: 0
> 2015-01-14 11:01:32,588 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
> replication queues
> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
> Leaving safe mode after 2 secs
> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
> Network topology has 0 racks and 0 datanodes
> 2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
> UnderReplicatedBlocks has 0 blocks
> 2015-01-14 11:01:32,645 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
> blocks            = 0
> 2015-01-14 11:01:32,645 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
> invalid blocks          = 0
> 2015-01-14 11:01:32,645 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
> under-replicated blocks = 0
> 2015-01-14 11:01:32,645 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>  over-replicated blocks = 0
> 2015-01-14 11:01:32,645 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
> blocks being written    = 0
> 2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE*
> Replication Queue initialization scan for invalid, over- and
> under-replicated blocks completed in 52 msec
> 2015-01-14 11:01:32,676 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
> bigdata/10.10.10.63:9000
> 2015-01-14 11:01:32,676 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
> required for active state
> 2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server
> Responder: starting
> 2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server
> listener on 9000: starting
> 2015-01-14 11:01:32,697 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Starting CacheReplicationMonitor with interval 30000 milliseconds
> 2015-01-14 11:01:32,697 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Rescanning after 4192060 milliseconds
> 2015-01-14 11:01:32,704 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
> 2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
> registerDatanode: from DatanodeRegistration(10.10.10.63,
> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
> 2015-01-14 11:01:38,039 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
> failed storage changes from 0 to 0
> 2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology: Adding
> a new node: /default-rack/10.10.10.63:50010
> 2015-01-14 11:01:38,557 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
> failed storage changes from 0 to 0
> 2015-01-14 11:01:38,562 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
> 10.10.10.63:50010
> 2015-01-14 11:01:38,692 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
> processReport: Received first block report from
> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
> starting up or becoming active. Its block contents are no longer considered
> stale
> 2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport: from
> storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
> DatanodeRegistration(10.10.10.63,
> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
> blocks: 0, hasStaleStorages: false, processing time: 9 msecs
> 2015-01-14 11:02:02,697 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Rescanning after 30000 milliseconds
> 2015-01-14 11:02:02,698 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
> 2015-01-14 11:02:21,288 ERROR
> org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
> 2015-01-14 11:02:21,291 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
> ************************************************************/
> 2015-01-14 11:03:02,845 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting NameNode
> STARTUP_MSG:   host = bigdata/10.10.10.63
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 2.6.0
> STARTUP_MSG:   classpath =
> /usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.ja
r:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/
hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/loca
l/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
> STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git
> -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
> 2014-11-13T21:10Z
> STARTUP_MSG:   java = 1.7.0_45
> ************************************************************/
> 2015-01-14 11:03:02,861 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
> handlers for [TERM, HUP, INT]
> 2015-01-14 11:03:02,866 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
> 2015-01-14 11:03:03,521 INFO
> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 2015-01-14 11:03:03,697 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2015-01-14 11:03:03,697 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
> started
> 2015-01-14 11:03:03,700 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
> hdfs://bigdata:9000
> 2015-01-14 11:03:03,701 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
> bigdata:9000 to access this namenode/service.
> 2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader:
> Unable to load native-hadoop library for your platform... using
> builtin-java classes where applicable
> 2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
> Web-server for hdfs at: http://0.0.0.0:50070
> 2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to
> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> org.mortbay.log.Slf4jLog
> 2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog: Http
> request log for http.requests.namenode is not defined
> 2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added
> global filter 'safety'
> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context hdfs
> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context logs
> 2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context static
> 2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added
> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
> (class=org.apache.hadoop.hdfs.web.AuthFilter)
> 2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2:
> addJerseyResourcePackage:
> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
> pathSpec=/webhdfs/v1/*
> 2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty
> bound to port 50070
> 2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
> 2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$
> SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
> 2015-01-14 11:03:05,645 WARN
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
> directory (dfs.namenode.name.dir) configured. Beware of data loss due to
> lack of redundant storage directories!
> 2015-01-14 11:03:05,645 WARN
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
> edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
> loss due to lack of redundant storage directories!
> 2015-01-14 11:03:05,746 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
> 2015-01-14 11:03:05,761 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
> 2015-01-14 11:03:05,837 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
> dfs.block.invalidate.limit=1000
> 2015-01-14 11:03:05,837 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
> dfs.namenode.datanode.registration.ip-hostname-check=true
> 2015-01-14 11:03:05,841 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
> 2015-01-14 11:03:05,843 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
> deletion will start around 2015 Jan 14 11:03:05
> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing
> capacity for map BlocksMap
> 2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type       =
> 64-bit
> 2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max memory
> 889 MB = 17.8 MB
> 2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity      =
> 2^21 = 2097152 entries
> 2015-01-14 11:03:05,864 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> dfs.block.access.token.enable=false
> 2015-01-14 11:03:05,865 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> defaultReplication         = 1
> 2015-01-14 11:03:05,865 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
>             = 512
> 2015-01-14 11:03:05,865 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
>             = 1
> 2015-01-14 11:03:05,865 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> maxReplicationStreams      = 2
> 2015-01-14 11:03:05,865 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> shouldCheckForEnoughRacks  = false
> 2015-01-14 11:03:05,865 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> replicationRecheckInterval = 3000
> 2015-01-14 11:03:05,865 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> encryptDataTransfer        = false
> 2015-01-14 11:03:05,865 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
> maxNumBlocksToLog          = 1000
> 2015-01-14 11:03:05,874 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
> hadoop2 (auth:SIMPLE)
> 2015-01-14 11:03:05,874 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
> supergroup
> 2015-01-14 11:03:05,874 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
> true
> 2015-01-14 11:03:05,875 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
> 2015-01-14 11:03:05,878 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing
> capacity for map INodeMap
> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type       =
> 64-bit
> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max memory
> 889 MB = 8.9 MB
> 2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity      =
> 2^20 = 1048576 entries
> 2015-01-14 11:03:06,284 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
> occuring more than 10 times
> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing
> capacity for map cachedBlocks
> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type       =
> 64-bit
> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max memory
> 889 MB = 2.2 MB
> 2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity      =
> 2^18 = 262144 entries
> 2015-01-14 11:03:06,301 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
> dfs.namenode.safemode.threshold-pct = 0.9990000128746033
> 2015-01-14 11:03:06,301 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
> dfs.namenode.safemode.min.datanodes = 0
> 2015-01-14 11:03:06,301 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
> dfs.namenode.safemode.extension     = 30000
> 2015-01-14 11:03:06,304 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
> namenode is enabled
> 2015-01-14 11:03:06,304 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
> 0.03 of total heap and retry cache entry expiry time is 600000 millis
> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing
> capacity for map NameNodeRetryCache
> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type       =
> 64-bit
> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet:
> 0.029999999329447746% max memory 889 MB = 273.1 KB
> 2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity      =
> 2^15 = 32768 entries
> 2015-01-14 11:03:06,317 INFO
> org.apache.hadoop.hdfs.server.namenode.NNConf: ACLs enabled? false
> 2015-01-14 11:03:06,318 INFO
> org.apache.hadoop.hdfs.server.namenode.NNConf: XAttrs enabled? true
> 2015-01-14 11:03:06,318 INFO
> org.apache.hadoop.hdfs.server.namenode.NNConf: Maximum size of an xattr:
> 16384
> 2015-01-14 11:03:06,368 INFO org.apache.hadoop.hdfs.server.common.Storage:
> Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
> 13312@bigdata
> 2015-01-14 11:03:06,532 INFO
> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
> unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
> 2015-01-14 11:03:06,622 INFO
> org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
> file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095
> ->
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
> 2015-01-14 11:03:06,807 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
> INodes.
> 2015-01-14 11:03:06,888 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
> FSImage in 0 seconds.
> 2015-01-14 11:03:06,888 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
> from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
> 2015-01-14 11:03:06,889 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
> expecting start txid #84
> 2015-01-14 11:03:06,889 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
> 2015-01-14 11:03:06,893 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
> to transaction ID 84
> 2015-01-14 11:03:06,897 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,897 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
> expecting start txid #86
> 2015-01-14 11:03:06,898 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
> 2015-01-14 11:03:06,898 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
> to transaction ID 84
> 2015-01-14 11:03:06,898 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,899 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
> expecting start txid #88
> 2015-01-14 11:03:06,899 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
> 2015-01-14 11:03:06,899 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
> to transaction ID 84
> 2015-01-14 11:03:06,899 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,900 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
> expecting start txid #90
> 2015-01-14 11:03:06,900 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
> 2015-01-14 11:03:06,900 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
> to transaction ID 84
> 2015-01-14 11:03:06,901 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,901 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
> expecting start txid #92
> 2015-01-14 11:03:06,901 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
> 2015-01-14 11:03:06,901 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
> to transaction ID 84
> 2015-01-14 11:03:06,902 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
> of size 42 edits # 2 loaded in 0 seconds
> 2015-01-14 11:03:06,902 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b
> expecting start txid #94
> 2015-01-14 11:03:06,902 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
> 2015-01-14 11:03:06,902 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
> to transaction ID 84
> 2015-01-14 11:03:06,907 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
> of size 1048576 edits # 1 loaded in 0 seconds
> 2015-01-14 11:03:06,908 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
> org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2
> expecting start txid #95
> 2015-01-14 11:03:06,908 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
> 2015-01-14 11:03:06,908 INFO
> org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
> stream
> '/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095'
> to transaction ID 84
> 2015-01-14 11:03:07,266 INFO
> org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
> /home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
> of size 1048576 edits # 1 loaded in 0 seconds
> 2015-01-14 11:03:07,274 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
> false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
> 2015-01-14 11:03:07,313 INFO
> org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
> 2015-01-14 11:03:07,558 INFO
> org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
> entries 0 lookups
> 2015-01-14 11:03:07,559 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
> FSImage in 1240 msecs
> 2015-01-14 11:03:08,011 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
> bigdata:9000
> 2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager: Using
> callQueue class java.util.concurrent.LinkedBlockingQueue
> 2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting Socket
> Reader #1 for port 9000
> 2015-01-14 11:03:08,151 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
> FSNamesystemState MBean
> 2015-01-14 11:03:08,173 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
> construction: 0
> 2015-01-14 11:03:08,173 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
> construction: 0
> 2015-01-14 11:03:08,173 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
> replication queues
> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
> Leaving safe mode after 2 secs
> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
> Network topology has 0 racks and 0 datanodes
> 2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
> UnderReplicatedBlocks has 0 blocks
> 2015-01-14 11:03:08,194 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
> blocks            = 0
> 2015-01-14 11:03:08,194 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
> invalid blocks          = 0
> 2015-01-14 11:03:08,194 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
> under-replicated blocks = 0
> 2015-01-14 11:03:08,194 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
>  over-replicated blocks = 0
> 2015-01-14 11:03:08,194 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
> blocks being written    = 0
> 2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE*
> Replication Queue initialization scan for invalid, over- and
> under-replicated blocks completed in 18 msec
> 2015-01-14 11:03:08,322 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
> bigdata/10.10.10.63:9000
> 2015-01-14 11:03:08,322 INFO
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
> required for active state
> 2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server
> Responder: starting
> 2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server
> listener on 9000: starting
> 2015-01-14 11:03:08,349 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Starting CacheReplicationMonitor with interval 30000 milliseconds
> 2015-01-14 11:03:08,349 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Rescanning after 4287712 milliseconds
> 2015-01-14 11:03:08,350 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
> 2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
> registerDatanode: from DatanodeRegistration(10.10.10.63,
> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
> storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
> 2015-01-14 11:03:13,244 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
> failed storage changes from 0 to 0
> 2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology: Adding
> a new node: /default-rack/10.10.10.63:50010
> 2015-01-14 11:03:13,743 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
> failed storage changes from 0 to 0
> 2015-01-14 11:03:13,750 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
> new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
> 10.10.10.63:50010
> 2015-01-14 11:03:13,959 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
> processReport: Received first block report from
> DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
> starting up or becoming active. Its block contents are no longer considered
> stale
> 2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport: from
> storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
> DatanodeRegistration(10.10.10.63,
> datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
> blocks: 0, hasStaleStorages: false, processing time: 11 msecs
> 2015-01-14 11:03:38,349 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Rescanning after 30000 milliseconds
> 2015-01-14 11:03:38,350 INFO
> org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
> Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
> 2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
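>
> (One note on the log above: the STARTUP_MSG classpath does show the
> ranger-hdfs-plugin-0.4.0 and ranger-plugins-* jars being picked up, but
> there is not a single Ranger/XASecure start-up line between NameNode start
> and "Aliases are enabled". A quick check, as a sketch that assumes the
> default $HADOOP_HOME/logs location of a tarball install:
>
>   grep -iE 'ranger|xasecure' /usr/local/hadoop/logs/hadoop-*-namenode-*.log
>
> If that prints nothing, the plugin never initialized, which would be
> consistent with no agent appearing under Audit -> Agents.)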
>
>
> Thanks
> Mahesh.S
>
>
> On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com> wrote:
>
>> Hi Mahesh,
>>     We will need the namenode logs to debug this further. Can you restart
>> the namenode and paste those logs somewhere for us to analyze? Thanks.
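>>
>> A minimal sketch of that, assuming a tarball install where the namenode
>> writes under $HADOOP_HOME/logs (adjust paths to your layout):
>>
>>   $HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
>>   $HADOOP_HOME/sbin/hadoop-daemon.sh start namenode
>>   less $HADOOP_HOME/logs/hadoop-*-namenode-*.log    # this is the file to paste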
>>
>> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <
>> sankarmahesh37@gmail.com> wrote:
>>
>>> Hi Ramesh,
>>>
>>>                   I didn't see any exception in the HDFS logs. My problem
>>> is that the agent for HDFS is not created.
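>>>
>>> A quick way to double-check the install, as a sketch (paths follow the
>>> steps quoted below; the conf location of the xasecure-*.xml files is an
>>> assumption about where the enable script drops them):
>>>
>>>   ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -i ranger  # plugin jars in place?
>>>   ls /usr/local/hadoop/conf/xasecure-*.xml                      # plugin config present?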
>>>
>>> Regards,
>>> Mahesh.S
>>>
>>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com>
>>> wrote:
>>>
>>>> Hi Mahesh,
>>>>
>>>> The error you are seeing is just a notice that the parent folder of the
>>>> resource you are creating doesn't have read permission for the user for
>>>> whom you are creating the policy.
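>>>>
>>>> If that user is supposed to have read access there, one option is to
>>>> inspect and, where appropriate, open up the parent folder (a sketch;
>>>> /path/to/parent is a placeholder for your actual resource path):
>>>>
>>>>   hdfs dfs -ls /path/to/parent          # check the current owner/group/mode
>>>>   hdfs dfs -chmod o+r /path/to/parent   # grant read, if that is intended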
>>>>
>>>> When you start the HDFS namenode and secondary namenode, do you see any
>>>> exceptions in the HDFS logs?
>>>>
>>>> Regards,
>>>> Ramesh
>>>>
>>>> On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com>
>>>> wrote:
>>>>
>>>> Hi all,
>>>>
>>>> I successfully configured ranger admin,user sync.now am trying to
>>>> configure hdfs plugin.My steps are following,
>>>>
>>>> 1.Created repository testhdfs.
>>>> 2.cd /usr/local
>>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>>> 5.cd ranger-hdfs-plugin
>>>> 6.vi install.properties
>>>>   POLICY_MGR_URL=http://IP:6080
>>>>           REPOSITORY_NAME=testhdfs
>>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>>> 7.cd /usr/local/hadoop
>>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>>> 10.cd /usr/local/ranger-hdfs-plugin
>>>> 11../enable-hdfs-plugin.sh
>>>> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>>>> 13.vi xasecure-audit.xml
>>>>  <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>>                    <value>jdbc:mysql://localhost/ranger</value>
>>>>                    </property>
>>>>                    <property>
>>>>
>>>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>>                    <value>rangerlogger</value>
>>>>                    </property>
>>>>                    <property>
>>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>>                    <value>rangerlogger</value>
>>>>                    </property>
>>>> 14.Restarted hadoop
>>>> when i see Ranger Admin Web interface -> Audit -> Agents
>>>> agent is not created.Am i missed any steps.
>>>>
>>>> *NOTE:I am not using HDP.*
>>>>
>>>> *here is my xa_portal.log*
>>>>
>>>> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_default.properties]
>>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_system.properties]
>>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_custom.properties]
>>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_ldap.properties]
>>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>> Unable to load native-hadoop library for your platform... using
>>>> builtin-java classes where applicable
>>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [db_message_bundle.properties]
>>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>> user
>>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>>>> requestId=10.10.10.53
>>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>>>> create:nfsnobody@bigdata
>>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_default.properties]
>>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_system.properties]
>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_custom.properties]
>>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_ldap.properties]
>>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>> Unable to load native-hadoop library for your platform... using
>>>> builtin-java classes where applicable
>>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [db_message_bundle.properties]
>>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>> user
>>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>>>> requestId=10.10.10.53
>>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>> user
>>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>>>> requestId=10.10.10.53
>>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>> Login: security not enabled, using username
>>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>>> Login: security not enabled, using username
>>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>>>> permission on parent folder. Do you want to save this policy?
>>>> javax.ws.rs.WebApplicationException
>>>> at
>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>> at
>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>> at
>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>>> at
>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>> at
>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>> at
>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>> at
>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>> at
>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>> at
>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>> at
>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>> at
>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>> at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>> at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>> at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>> at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>> at
>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>> at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>> at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>>> at
>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>> at
>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>> at
>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>> at
>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>> at
>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at
>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>> at
>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>> at
>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>> at
>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>> at
>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>> at
>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>> at
>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>> at
>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>> at
>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>> at
>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>> at
>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at
>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>> at java.lang.Thread.run(Thread.java:744)
>>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>> Validation error:logMessage=null,
>>>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>> to save this policy?}
>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>> }
>>>> javax.ws.rs.WebApplicationException
>>>> [stack trace identical to the createXResource trace above; elided]
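
The OPER_NO_PERMISSION entries above are Ranger's pre-save check: before storing a
resource policy, the admin verifies that the policy's user ("Mahesh" here) can read
the parent folder of the resource path, and the "Do you want to save this policy?"
wording suggests a confirmation prompt rather than a hard failure. A sketch of how
the parent-folder permissions could be inspected and opened up from the HDFS side;
/data/project below is a hypothetical stand-in for the real resource path:

      # run as the HDFS superuser (hadoop2, per the namenode log further down)
      hdfs dfs -ls /data            # inspect the parent folder's mode and owner
      hdfs dfs -chmod o+rx /data    # example: grant read+execute to others
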
>>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>> user
>>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>>>> requestId=10.10.10.53
>>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_default.properties]
>>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_system.properties]
>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_custom.properties]
>>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [xa_ldap.properties]
>>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>>> Unable to load native-hadoop library for your platform... using
>>>> builtin-java classes where applicable
>>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>>> path resource [db_message_bundle.properties]
>>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>> user
>>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>>>> requestId=10.10.10.53
>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>>> RangerDaoManager.getEntityManager(loggingPU)
>>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>>>> permission on parent folder. Do you want to save this policy?
>>>> javax.ws.rs.WebApplicationException
>>>> at
>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>>> at
>>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>>> at
>>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>>> at
>>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>>> at
>>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>>> at
>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>>> at
>>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>>> at
>>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>>> at
>>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>>> at
>>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>> at
>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>> at
>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>> at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>> at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>> at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>> at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>> at
>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>> at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>> at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at
>>>> org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>>> at
>>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>>> at
>>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>>> at
>>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>>> at
>>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>>> at
>>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>>> at
>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>>> at
>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>>> at
>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>>> at
>>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>>> at
>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>>> at
>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>>> at
>>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>>> at
>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>>> at
>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>>> at
>>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>>> at
>>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>>> at
>>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>> at
>>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>>> at java.lang.Thread.run(Thread.java:744)
>>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>>> Validation error:logMessage=null,
>>>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
>>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>>> to save this policy?}
>>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
>>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>>> }
>>>> javax.ws.rs.WebApplicationException
>>>> [stack trace identical to the updateXResource trace above; elided]
>>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
>>>>  org.apache.ranger.security.listener.SpringEventListener
>>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>>> Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>>> user
>>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
>>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>>> loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
>>>> requestId=10.10.10.53
>>>>
>>>> Thanks
>>>> Mahesh.S
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>
>>
>> --
>> Regards,
>> Gautam.
>>
>
>

Re: Hdfs agent not created

Posted by Mahesh Sankaran <sa...@gmail.com>.
Hi Gautam,

              Here is my namenode log. Kindly take a look.

/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
************************************************************/
2015-01-14 11:01:27,345 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = bigdata/10.10.10.63
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.6.0
STARTUP_MSG:   classpath =
/usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:
/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/ha
doop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/local/
hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git
-r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
2014-11-13T21:10Z
STARTUP_MSG:   java = 1.7.0_45
************************************************************/
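
One detail worth pulling out of the classpath above: the Ranger plugin jars
(ranger-hdfs-plugin-0.4.0.jar, ranger-plugins-common-0.4.0.jar,
ranger-plugins-audit-0.4.0.jar, ranger-plugins-impl-0.4.0.jar,
ranger-plugins-cred-0.4.0.jar) are all present under
/usr/local/hadoop/share/hadoop/hdfs/lib/, so the jars were copied into place
correctly. A quick way to re-confirm this on the NameNode host, using the same
path that appears in the classpath:

      ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep ranger
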
2015-01-14 11:01:27,363 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
handlers for [TERM, HUP, INT]
2015-01-14 11:01:27,368 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
2015-01-14 11:01:28,029 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
loaded properties from hadoop-metrics2.properties
2015-01-14 11:01:28,205 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2015-01-14 11:01:28,205 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
started
2015-01-14 11:01:28,209 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
hdfs://bigdata:9000
2015-01-14 11:01:28,209 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
bigdata:9000 to access this namenode/service.
2015-01-14 11:01:28,433 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2015-01-14 11:01:28,950 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
Web-server for hdfs at: http://0.0.0.0:50070
2015-01-14 11:01:29,050 INFO org.mortbay.log: Logging to
org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
org.mortbay.log.Slf4jLog
2015-01-14 11:01:29,058 INFO org.apache.hadoop.http.HttpRequestLog: Http
request log for http.requests.namenode is not defined
2015-01-14 11:01:29,079 INFO org.apache.hadoop.http.HttpServer2: Added
global filter 'safety'
(class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context hdfs
2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context static
2015-01-14 11:01:29,085 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context logs
2015-01-14 11:01:29,141 INFO org.apache.hadoop.http.HttpServer2: Added
filter 'org.apache.hadoop.hdfs.web.AuthFilter'
(class=org.apache.hadoop.hdfs.web.AuthFilter)
2015-01-14 11:01:29,144 INFO org.apache.hadoop.http.HttpServer2:
addJerseyResourcePackage:
packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
pathSpec=/webhdfs/v1/*
2015-01-14 11:01:29,210 INFO org.apache.hadoop.http.HttpServer2: Jetty
bound to port 50070
2015-01-14 11:01:29,210 INFO org.mortbay.log: jetty-6.1.26
2015-01-14 11:01:29,984 INFO org.mortbay.log: Started HttpServer2$
SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
2015-01-14 11:01:30,093 WARN
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
directory (dfs.namenode.name.dir) configured. Beware of data loss due to
lack of redundant storage directories!
2015-01-14 11:01:30,093 WARN
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
loss due to lack of redundant storage directories!
2015-01-14 11:01:30,184 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
2015-01-14 11:01:30,196 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
2015-01-14 11:01:30,262 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
dfs.block.invalidate.limit=1000
2015-01-14 11:01:30,262 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
dfs.namenode.datanode.registration.ip-hostname-check=true
2015-01-14 11:01:30,266 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
2015-01-14 11:01:30,268 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
deletion will start around 2015 Jan 14 11:01:30
2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: Computing
capacity for map BlocksMap
2015-01-14 11:01:30,271 INFO org.apache.hadoop.util.GSet: VM type       =
64-bit
2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: 2.0% max memory
889 MB = 17.8 MB
2015-01-14 11:01:30,274 INFO org.apache.hadoop.util.GSet: capacity      =
2^21 = 2097152 entries
2015-01-14 11:01:30,289 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
dfs.block.access.token.enable=false
2015-01-14 11:01:30,289 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
defaultReplication         = 1
2015-01-14 11:01:30,289 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
            = 512
2015-01-14 11:01:30,289 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
            = 1
2015-01-14 11:01:30,289 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
maxReplicationStreams      = 2
2015-01-14 11:01:30,290 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
shouldCheckForEnoughRacks  = false
2015-01-14 11:01:30,290 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
replicationRecheckInterval = 3000
2015-01-14 11:01:30,290 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
encryptDataTransfer        = false
2015-01-14 11:01:30,290 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
maxNumBlocksToLog          = 1000
2015-01-14 11:01:30,298 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
hadoop2 (auth:SIMPLE)
2015-01-14 11:01:30,299 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
supergroup
2015-01-14 11:01:30,299 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
true
2015-01-14 11:01:30,299 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
2015-01-14 11:01:30,302 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: Computing
capacity for map INodeMap
2015-01-14 11:01:30,644 INFO org.apache.hadoop.util.GSet: VM type       =
64-bit
2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: 1.0% max memory
889 MB = 8.9 MB
2015-01-14 11:01:30,645 INFO org.apache.hadoop.util.GSet: capacity      =
2^20 = 1048576 entries
2015-01-14 11:01:30,648 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
occuring more than 10 times
2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: Computing
capacity for map cachedBlocks
2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: VM type       =
64-bit
2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: 0.25% max memory
889 MB = 2.2 MB
2015-01-14 11:01:30,665 INFO org.apache.hadoop.util.GSet: capacity      =
2^18 = 262144 entries
2015-01-14 11:01:30,669 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
dfs.namenode.safemode.threshold-pct = 0.9990000128746033
2015-01-14 11:01:30,669 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
dfs.namenode.safemode.min.datanodes = 0
2015-01-14 11:01:30,669 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
dfs.namenode.safemode.extension     = 30000
2015-01-14 11:01:30,674 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
namenode is enabled
2015-01-14 11:01:30,674 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
0.03 of total heap and retry cache entry expiry time is 600000 millis
2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: Computing
capacity for map NameNodeRetryCache
2015-01-14 11:01:30,679 INFO org.apache.hadoop.util.GSet: VM type       =
64-bit
2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet:
0.029999999329447746% max memory 889 MB = 273.1 KB
2015-01-14 11:01:30,680 INFO org.apache.hadoop.util.GSet: capacity      =
2^15 = 32768 entries
2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf:
ACLs enabled? false
2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf:
XAttrs enabled? true
2015-01-14 11:01:30,687 INFO org.apache.hadoop.hdfs.server.namenode.NNConf:
Maximum size of an xattr: 16384
2015-01-14 11:01:30,729 INFO org.apache.hadoop.hdfs.server.common.Storage:
Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
11417@bigdata
2015-01-14 11:01:30,963 INFO
org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
2015-01-14 11:01:31,065 INFO
org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
file
/home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000094
->
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
2015-01-14 11:01:31,210 INFO
org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
INodes.
2015-01-14 11:01:31,293 INFO
org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
FSImage in 0 seconds.
2015-01-14 11:01:31,293 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
2015-01-14 11:01:31,294 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4fd05dc5
expecting start txid #84
2015-01-14 11:01:31,294 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
2015-01-14 11:01:31,299 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
to transaction ID 84
2015-01-14 11:01:31,303 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,303 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
expecting start txid #86
2015-01-14 11:01:31,303 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
2015-01-14 11:01:31,303 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
to transaction ID 84
2015-01-14 11:01:31,304 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,304 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
expecting start txid #88
2015-01-14 11:01:31,304 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
2015-01-14 11:01:31,304 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
to transaction ID 84
2015-01-14 11:01:31,305 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,305 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
expecting start txid #90
2015-01-14 11:01:31,305 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
2015-01-14 11:01:31,306 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
to transaction ID 84
2015-01-14 11:01:31,306 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,306 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
expecting start txid #92
2015-01-14 11:01:31,306 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
2015-01-14 11:01:31,307 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
to transaction ID 84
2015-01-14 11:01:31,307 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:01:31,307 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
expecting start txid #94
2015-01-14 11:01:31,308 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
2015-01-14 11:01:31,308 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
to transaction ID 84
2015-01-14 11:01:31,313 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
of size 1048576 edits # 1 loaded in 0 seconds
2015-01-14 11:01:31,317 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
2015-01-14 11:01:31,346 INFO
org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 95
2015-01-14 11:01:31,904 INFO
org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
entries 0 lookups
2015-01-14 11:01:31,904 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
FSImage in 1216 msecs
2015-01-14 11:01:32,427 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
bigdata:9000
2015-01-14 11:01:32,443 INFO org.apache.hadoop.ipc.CallQueueManager: Using
callQueue class java.util.concurrent.LinkedBlockingQueue
2015-01-14 11:01:32,489 INFO org.apache.hadoop.ipc.Server: Starting Socket
Reader #1 for port 9000
2015-01-14 11:01:32,568 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
FSNamesystemState MBean
2015-01-14 11:01:32,588 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
construction: 0
2015-01-14 11:01:32,588 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
construction: 0
2015-01-14 11:01:32,588 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
replication queues
2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
Leaving safe mode after 2 secs
2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
Network topology has 0 racks and 0 datanodes
2015-01-14 11:01:32,592 INFO org.apache.hadoop.hdfs.StateChange: STATE*
UnderReplicatedBlocks has 0 blocks
2015-01-14 11:01:32,645 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
blocks            = 0
2015-01-14 11:01:32,645 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
invalid blocks          = 0
2015-01-14 11:01:32,645 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
under-replicated blocks = 0
2015-01-14 11:01:32,645 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
 over-replicated blocks = 0
2015-01-14 11:01:32,645 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
blocks being written    = 0
2015-01-14 11:01:32,646 INFO org.apache.hadoop.hdfs.StateChange: STATE*
Replication Queue initialization scan for invalid, over- and
under-replicated blocks completed in 52 msec
2015-01-14 11:01:32,676 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
bigdata/10.10.10.63:9000
2015-01-14 11:01:32,676 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
required for active state
2015-01-14 11:01:32,667 INFO org.apache.hadoop.ipc.Server: IPC Server
Responder: starting
2015-01-14 11:01:32,669 INFO org.apache.hadoop.ipc.Server: IPC Server
listener on 9000: starting
2015-01-14 11:01:32,697 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Starting CacheReplicationMonitor with interval 30000 milliseconds
2015-01-14 11:01:32,697 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Rescanning after 4192060 milliseconds
2015-01-14 11:01:32,704 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Scanned 0 directive(s) and 0 block(s) in 7 millisecond(s).
2015-01-14 11:01:37,967 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
registerDatanode: from DatanodeRegistration(10.10.10.63,
datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
ipcPort=50020,
storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
2015-01-14 11:01:38,039 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
failed storage changes from 0 to 0
2015-01-14 11:01:38,042 INFO org.apache.hadoop.net.NetworkTopology: Adding
a new node: /default-rack/10.10.10.63:50010
2015-01-14 11:01:38,557 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
failed storage changes from 0 to 0
2015-01-14 11:01:38,562 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
10.10.10.63:50010
2015-01-14 11:01:38,692 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
processReport: Received first block report from
DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
starting up or becoming active. Its block contents are no longer considered
stale
2015-01-14 11:01:38,692 INFO BlockStateChange: BLOCK* processReport: from
storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
DatanodeRegistration(10.10.10.63,
datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
ipcPort=50020,
storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
blocks: 0, hasStaleStorages: false, processing time: 9 msecs
2015-01-14 11:02:02,697 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Rescanning after 30000 milliseconds
2015-01-14 11:02:02,698 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
2015-01-14 11:02:21,288 ERROR
org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
2015-01-14 11:02:21,291 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at bigdata/10.10.10.63
************************************************************/
2015-01-14 11:03:02,845 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = bigdata/10.10.10.63
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.6.0
STARTUP_MSG:   classpath =
/usr/local/hadoop/conf:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:
/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-hdfs-plugin-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-cred-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.persistence-2.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-common-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/eclipselink-2.5.2-M1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-impl-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/ranger-plugins-audit-0.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/mysql-connector-java.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/ha
doop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/local/
hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git
-r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on
2014-11-13T21:10Z
STARTUP_MSG:   java = 1.7.0_45
************************************************************/
2015-01-14 11:03:02,861 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
handlers for [TERM, HUP, INT]
2015-01-14 11:03:02,866 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
2015-01-14 11:03:03,521 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
loaded properties from hadoop-metrics2.properties
2015-01-14 11:03:03,697 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2015-01-14 11:03:03,697 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
started
2015-01-14 11:03:03,700 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
hdfs://bigdata:9000
2015-01-14 11:03:03,701 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
bigdata:9000 to access this namenode/service.
2015-01-14 11:03:03,925 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2015-01-14 11:03:04,411 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
Web-server for hdfs at: http://0.0.0.0:50070
2015-01-14 11:03:04,560 INFO org.mortbay.log: Logging to
org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
org.mortbay.log.Slf4jLog
2015-01-14 11:03:04,568 INFO org.apache.hadoop.http.HttpRequestLog: Http
request log for http.requests.namenode is not defined
2015-01-14 11:03:04,590 INFO org.apache.hadoop.http.HttpServer2: Added
global filter 'safety'
(class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context hdfs
2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context logs
2015-01-14 11:03:04,596 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context static
2015-01-14 11:03:04,671 INFO org.apache.hadoop.http.HttpServer2: Added
filter 'org.apache.hadoop.hdfs.web.AuthFilter'
(class=org.apache.hadoop.hdfs.web.AuthFilter)
2015-01-14 11:03:04,705 INFO org.apache.hadoop.http.HttpServer2:
addJerseyResourcePackage:
packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
pathSpec=/webhdfs/v1/*
2015-01-14 11:03:04,755 INFO org.apache.hadoop.http.HttpServer2: Jetty
bound to port 50070
2015-01-14 11:03:04,755 INFO org.mortbay.log: jetty-6.1.26
2015-01-14 11:03:05,536 INFO org.mortbay.log: Started HttpServer2$
SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
2015-01-14 11:03:05,645 WARN
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage
directory (dfs.namenode.name.dir) configured. Beware of data loss due to
lack of redundant storage directories!
2015-01-14 11:03:05,645 WARN
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace
edits storage directory (dfs.namenode.edits.dir) configured. Beware of data
loss due to lack of redundant storage directories!
2015-01-14 11:03:05,746 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
2015-01-14 11:03:05,761 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
2015-01-14 11:03:05,837 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
dfs.block.invalidate.limit=1000
2015-01-14 11:03:05,837 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager:
dfs.namenode.datanode.registration.ip-hostname-check=true
2015-01-14 11:03:05,841 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
2015-01-14 11:03:05,843 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block
deletion will start around 2015 Jan 14 11:03:05
2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: Computing
capacity for map BlocksMap
2015-01-14 11:03:05,847 INFO org.apache.hadoop.util.GSet: VM type       =
64-bit
2015-01-14 11:03:05,849 INFO org.apache.hadoop.util.GSet: 2.0% max memory
889 MB = 17.8 MB
2015-01-14 11:03:05,850 INFO org.apache.hadoop.util.GSet: capacity      =
2^21 = 2097152 entries
2015-01-14 11:03:05,864 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
dfs.block.access.token.enable=false
2015-01-14 11:03:05,865 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
defaultReplication         = 1
2015-01-14 11:03:05,865 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication
            = 512
2015-01-14 11:03:05,865 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication
            = 1
2015-01-14 11:03:05,865 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
maxReplicationStreams      = 2
2015-01-14 11:03:05,865 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
shouldCheckForEnoughRacks  = false
2015-01-14 11:03:05,865 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
replicationRecheckInterval = 3000
2015-01-14 11:03:05,865 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
encryptDataTransfer        = false
2015-01-14 11:03:05,865 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager:
maxNumBlocksToLog          = 1000
2015-01-14 11:03:05,874 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner             =
hadoop2 (auth:SIMPLE)
2015-01-14 11:03:05,874 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup          =
supergroup
2015-01-14 11:03:05,874 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled =
true
2015-01-14 11:03:05,875 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
2015-01-14 11:03:05,878 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Append Enabled: true
2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: Computing
capacity for map INodeMap
2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: VM type       =
64-bit
2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: 1.0% max memory
889 MB = 8.9 MB
2015-01-14 11:03:06,279 INFO org.apache.hadoop.util.GSet: capacity      =
2^20 = 1048576 entries
2015-01-14 11:03:06,284 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names
occuring more than 10 times
2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: Computing
capacity for map cachedBlocks
2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: VM type       =
64-bit
2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: 0.25% max memory
889 MB = 2.2 MB
2015-01-14 11:03:06,298 INFO org.apache.hadoop.util.GSet: capacity      =
2^18 = 262144 entries
2015-01-14 11:03:06,301 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
dfs.namenode.safemode.threshold-pct = 0.9990000128746033
2015-01-14 11:03:06,301 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
dfs.namenode.safemode.min.datanodes = 0
2015-01-14 11:03:06,301 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem:
dfs.namenode.safemode.extension     = 30000
2015-01-14 11:03:06,304 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on
namenode is enabled
2015-01-14 11:03:06,304 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use
0.03 of total heap and retry cache entry expiry time is 600000 millis
2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: Computing
capacity for map NameNodeRetryCache
2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: VM type       =
64-bit
2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet:
0.029999999329447746% max memory 889 MB = 273.1 KB
2015-01-14 11:03:06,309 INFO org.apache.hadoop.util.GSet: capacity      =
2^15 = 32768 entries
2015-01-14 11:03:06,317 INFO org.apache.hadoop.hdfs.server.namenode.NNConf:
ACLs enabled? false
2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf:
XAttrs enabled? true
2015-01-14 11:03:06,318 INFO org.apache.hadoop.hdfs.server.namenode.NNConf:
Maximum size of an xattr: 16384
2015-01-14 11:03:06,368 INFO org.apache.hadoop.hdfs.server.common.Storage:
Lock on /home/hadoop2/mydata/hdfs/namenode/in_use.lock acquired by nodename
13312@bigdata
2015-01-14 11:03:06,532 INFO
org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering
unfinalized segments in /home/hadoop2/mydata/hdfs/namenode/current
2015-01-14 11:03:06,622 INFO
org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits
file
/home/hadoop2/mydata/hdfs/namenode/current/edits_inprogress_0000000000000000095
->
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
2015-01-14 11:03:06,807 INFO
org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 2
INodes.
2015-01-14 11:03:06,888 INFO
org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded
FSImage in 0 seconds.
2015-01-14 11:03:06,888 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 83
from /home/hadoop2/mydata/hdfs/namenode/current/fsimage_0000000000000000083
2015-01-14 11:03:06,889 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@78bc5972
expecting start txid #84
2015-01-14 11:03:06,889 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
2015-01-14 11:03:06,893 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085'
to transaction ID 84
2015-01-14 11:03:06,897 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000084-0000000000000000085
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,897 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1594894b
expecting start txid #86
2015-01-14 11:03:06,898 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
2015-01-14 11:03:06,898 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087'
to transaction ID 84
2015-01-14 11:03:06,898 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000086-0000000000000000087
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,899 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@4ac1a5fe
expecting start txid #88
2015-01-14 11:03:06,899 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
2015-01-14 11:03:06,899 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089'
to transaction ID 84
2015-01-14 11:03:06,899 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000088-0000000000000000089
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,900 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6f78ed09
expecting start txid #90
2015-01-14 11:03:06,900 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
2015-01-14 11:03:06,900 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091'
to transaction ID 84
2015-01-14 11:03:06,901 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000090-0000000000000000091
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,901 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@6c12230b
expecting start txid #92
2015-01-14 11:03:06,901 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
2015-01-14 11:03:06,901 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093'
to transaction ID 84
2015-01-14 11:03:06,902 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000092-0000000000000000093
of size 42 edits # 2 loaded in 0 seconds
2015-01-14 11:03:06,902 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@1abade9b
expecting start txid #94
2015-01-14 11:03:06,902 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
2015-01-14 11:03:06,902 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094'
to transaction ID 84
2015-01-14 11:03:06,907 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000094-0000000000000000094
of size 1048576 edits # 1 loaded in 0 seconds
2015-01-14 11:03:06,908 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Reading
org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@626c9fd2
expecting start txid #95
2015-01-14 11:03:06,908 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
2015-01-14 11:03:06,908 INFO
org.apache.hadoop.hdfs.server.namenode.EditLogInputStream: Fast-forwarding
stream
'/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095'
to transaction ID 84
2015-01-14 11:03:07,266 INFO
org.apache.hadoop.hdfs.server.namenode.FSImage: Edits file
/home/hadoop2/mydata/hdfs/namenode/current/edits_0000000000000000095-0000000000000000095
of size 1048576 edits # 1 loaded in 0 seconds
2015-01-14 11:03:07,274 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image?
false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
2015-01-14 11:03:07,313 INFO
org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 96
2015-01-14 11:03:07,558 INFO
org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0
entries 0 lookups
2015-01-14 11:03:07,559 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading
FSImage in 1240 msecs
2015-01-14 11:03:08,011 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to
bigdata:9000
2015-01-14 11:03:08,030 INFO org.apache.hadoop.ipc.CallQueueManager: Using
callQueue class java.util.concurrent.LinkedBlockingQueue
2015-01-14 11:03:08,074 INFO org.apache.hadoop.ipc.Server: Starting Socket
Reader #1 for port 9000
2015-01-14 11:03:08,151 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered
FSNamesystemState MBean
2015-01-14 11:03:08,173 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
construction: 0
2015-01-14 11:03:08,173 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of blocks under
construction: 0
2015-01-14 11:03:08,173 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: initializing
replication queues
2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
Leaving safe mode after 2 secs
2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
Network topology has 0 racks and 0 datanodes
2015-01-14 11:03:08,174 INFO org.apache.hadoop.hdfs.StateChange: STATE*
UnderReplicatedBlocks has 0 blocks
2015-01-14 11:03:08,194 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of
blocks            = 0
2015-01-14 11:03:08,194 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
invalid blocks          = 0
2015-01-14 11:03:08,194 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
under-replicated blocks = 0
2015-01-14 11:03:08,194 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
 over-replicated blocks = 0
2015-01-14 11:03:08,194 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of
blocks being written    = 0
2015-01-14 11:03:08,194 INFO org.apache.hadoop.hdfs.StateChange: STATE*
Replication Queue initialization scan for invalid, over- and
under-replicated blocks completed in 18 msec
2015-01-14 11:03:08,322 INFO
org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at:
bigdata/10.10.10.63:9000
2015-01-14 11:03:08,322 INFO
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services
required for active state
2015-01-14 11:03:08,316 INFO org.apache.hadoop.ipc.Server: IPC Server
Responder: starting
2015-01-14 11:03:08,319 INFO org.apache.hadoop.ipc.Server: IPC Server
listener on 9000: starting
2015-01-14 11:03:08,349 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Starting CacheReplicationMonitor with interval 30000 milliseconds
2015-01-14 11:03:08,349 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Rescanning after 4287712 milliseconds
2015-01-14 11:03:08,350 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
2015-01-14 11:03:13,237 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
registerDatanode: from DatanodeRegistration(10.10.10.63,
datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
ipcPort=50020,
storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0)
storage e3c24b88-cb98-4a74-8c5f-fee8dba99898
2015-01-14 11:03:13,244 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
failed storage changes from 0 to 0
2015-01-14 11:03:13,252 INFO org.apache.hadoop.net.NetworkTopology: Adding
a new node: /default-rack/10.10.10.63:50010
2015-01-14 11:03:13,743 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Number of
failed storage changes from 0 to 0
2015-01-14 11:03:13,750 INFO
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding
new storage ID DS-7989baef-c501-4a7a-b586-0f943444e099 for DN
10.10.10.63:50010
2015-01-14 11:03:13,959 INFO
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK*
processReport: Received first block report from
DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after
starting up or becoming active. Its block contents are no longer considered
stale
2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport: from
storage DS-7989baef-c501-4a7a-b586-0f943444e099 node
DatanodeRegistration(10.10.10.63,
datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075,
ipcPort=50020,
storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0),
blocks: 0, hasStaleStorages: false, processing time: 11 msecs
2015-01-14 11:03:38,349 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Rescanning after 30000 milliseconds
2015-01-14 11:03:38,350 INFO
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor:
Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
2015-01-14 11:03:57,100 INFO logs: Aliases are enabled
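
The startup classpath above does include the ranger-hdfs-plugin-0.4.0 jars
(ranger-plugins-common, ranger-plugins-audit, mysql-connector and friends
under share/hadoop/hdfs/lib), but I do not see any Ranger/XASecure
initialization line anywhere after the NameNode comes up. A quick way to
confirm whether the plugin logged anything at all (log path is an assumption
based on the default tarball layout under /usr/local/hadoop; adjust if
HADOOP_LOG_DIR points elsewhere):

  grep -i -e xasecure -e ranger /usr/local/hadoop/logs/hadoop-*-namenode-*.log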


Thanks
Mahesh.S


On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gb...@gmail.com> wrote:

> Hi Mahesh,
>     We will need the NameNode logs to debug this further. Can you restart
> the NameNode and paste its logs somewhere for us to analyze? Thanks.
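>
> For a plain tarball install, something like this should produce a fresh log
> to share (paths are assumptions based on HADOOP_HOME=/usr/local/hadoop from
> your earlier steps; adjust to your layout):
>
>   /usr/local/hadoop/sbin/hadoop-daemon.sh stop namenode
>   /usr/local/hadoop/sbin/hadoop-daemon.sh start namenode
>   tail -n 200 /usr/local/hadoop/logs/hadoop-*-namenode-*.log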
>
> On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <
> sankarmahesh37@gmail.com> wrote:
>
>> Hi Ramesh,
>>
>>                   I didn't see any exception in the HDFS logs. My problem
>> is that the agent for HDFS is not created.
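>>
>> One sanity check I can do on my side is to make sure the files the enable
>> script generated (the xasecure-audit.xml I edited, for instance) are in
>> the conf directory the NameNode actually reads, i.e. the symlink from
>> step 8 of my setup:
>>
>>   ls -l /usr/local/hadoop/conf/xasecure-*.xml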
>>
>> Regards,
>> Mahesh.S
>>
>> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com>
>> wrote:
>>
>>> Hi Mahesh,
>>>
>>> The error you are seeing is just a notice that the parent folder of the
>>> resource you are creating doesn’t have read permission for the user for
>>> whom you are creating the policy.
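>>>
>>> For example, a policy on /data/app1 created for a user who has no read
>>> permission on /data would produce this notice (paths here are purely
>>> illustrative). The parent's permissions can be checked with:
>>>
>>>   hdfs dfs -ls /data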
>>>
>>> When you start the HDFS NameNode and SecondaryNameNode, do you see any
>>> exceptions in the HDFS logs?
>>>
>>> Regards,
>>> Ramesh
>>>
>>> On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com>
>>> wrote:
>>>
>>> Hi all,
>>>
>>> I successfully configured ranger admin,user sync.now am trying to
>>> configure hdfs plugin.My steps are following,
>>>
>>> 1.Created repository testhdfs.
>>> 2.cd /usr/local
>>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>>> 5.cd ranger-hdfs-plugin
>>> 6.vi install.properties
>>>   POLICY_MGR_URL=http://IP:6080
>>>           REPOSITORY_NAME=testhdfs
>>>           XAAUDIT.DB.HOSTNAME=localhost
>>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>>           XAAUDIT.DB.PASSWORD=rangerlogger
>>> 7.cd /usr/local/hadoop
>>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>>> 9.export HADOOP_HOME=/usr/local/hadoop
>>> 10.cd /usr/local/ranger-hdfs-plugin
>>> 11../enable-hdfs-plugin.sh
>>> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>>> 13.vi xasecure-audit.xml
>>>  <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>>                    <value>jdbc:mysql://localhost/ranger</value>
>>>                    </property>
>>>                    <property>
>>>
>>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>>                    <value>rangerlogger</value>
>>>                    </property>
>>>                    <property>
>>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>>                    <value>rangerlogger</value>
>>>                    </property>
>>> 14.Restarted hadoop
>>> when i see Ranger Admin Web interface -> Audit -> Agents
>>> agent is not created.Am i missed any steps.
>>>
>>> *NOTE:I am not using HDP.*
>>>
>>> *here is my xa_portal.log*
>>>
>>> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_default.properties]
>>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_system.properties]
>>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_custom.properties]
>>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_ldap.properties]
>>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>> Unable to load native-hadoop library for your platform... using
>>> builtin-java classes where applicable
>>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [db_message_bundle.properties]
>>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>>  org.apache.ranger.security.listener.SpringEventListener
>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>> user
>>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>>> requestId=10.10.10.53
>>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>>> create:nfsnobody@bigdata
>>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_default.properties]
>>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_system.properties]
>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_custom.properties]
>>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_ldap.properties]
>>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>> Unable to load native-hadoop library for your platform... using
>>> builtin-java classes where applicable
>>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [db_message_bundle.properties]
>>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>>  org.apache.ranger.security.listener.SpringEventListener
>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>> user
>>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>>> requestId=10.10.10.53
>>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>>  org.apache.ranger.security.listener.SpringEventListener
>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>> user
>>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>>> requestId=10.10.10.53
>>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>> Login: security not enabled, using username
>>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>> RangerDaoManager.getEntityManager(loggingPU)
>>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>>> Login: security not enabled, using username
>>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>>> permission on parent folder. Do you want to save this policy?
>>> javax.ws.rs.WebApplicationException
>>> at
>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>> at
>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>> at
>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>> at
>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>> at
>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>> at
>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>> at
>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>> at
>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>> at
>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>> at
>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>> at
>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>> at
>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>> at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>> at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>> at
>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>> at
>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>> at
>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>> at
>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>> at
>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at
>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>> at
>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>> at
>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>> at
>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>> at
>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>> at
>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>> at
>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>> at
>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>> at
>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>> at
>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>> at
>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at
>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>> at java.lang.Thread.run(Thread.java:744)
>>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>> Validation error:logMessage=null,
>>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>> to save this policy?}
>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>> }
>>> javax.ws.rs.WebApplicationException
>>> at
>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>> at
>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>> at
>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>>> at
>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>> at
>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>> at
>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>> at
>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>> at
>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>> at
>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>> at
>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>> at
>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>> at
>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>> at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>> at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>> at
>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>> at
>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>> at
>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>> at
>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>> at
>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at
>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>> at
>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>> at
>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>> at
>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>> at
>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>> at
>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>> at
>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>> at
>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>> at
>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>> at
>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>> at
>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at
>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>> at java.lang.Thread.run(Thread.java:744)
>>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>>  org.apache.ranger.security.listener.SpringEventListener
>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>> user
>>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>>> requestId=10.10.10.53
>>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_default.properties]
>>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_system.properties]
>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_custom.properties]
>>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [xa_ldap.properties]
>>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>>> Unable to load native-hadoop library for your platform... using
>>> builtin-java classes where applicable
>>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>>  org.springframework.core.io.support.PropertiesLoaderSupport
>>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>>> path resource [db_message_bundle.properties]
>>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>>  org.apache.ranger.security.listener.SpringEventListener
>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>> user
>>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>>> requestId=10.10.10.53
>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>>> RangerDaoManager.getEntityManager(loggingPU)
>>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>>> permission on parent folder. Do you want to save this policy?
>>> javax.ws.rs.WebApplicationException
>>> at
>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>> at
>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>> at
>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>> at
>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>> at
>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>> at
>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>> at
>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>> at
>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>> at
>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>> at
>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>> at
>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>> at
>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>> at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>> at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>> at
>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>> at
>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>> at
>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>> at
>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>> at
>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at
>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>> at
>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>> at
>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>> at
>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>> at
>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>> at
>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>> at
>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>> at
>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>> at
>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>> at
>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>> at
>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at
>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>> at java.lang.Thread.run(Thread.java:744)
>>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>>> Validation error:logMessage=null,
>>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
>>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>>> to save this policy?}
>>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
>>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>>> }
>>> javax.ws.rs.WebApplicationException
>>> at
>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>>> at
>>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>>> at
>>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>>> at
>>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>>> at
>>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>>> at
>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>>> at
>>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>>> at
>>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>>> at
>>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>>> at
>>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>> at
>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>> at
>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>> at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>> at
>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>> at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>> at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>>> at
>>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>>> at
>>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>>> at
>>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>>> at
>>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>>> at
>>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>>> at
>>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>>> at
>>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>>> at
>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>>> at
>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>>> at
>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>>> at
>>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>>> at
>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>> at
>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>>> at
>>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>>> at
>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>>> at
>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>> at
>>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>>> at
>>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>>> at
>>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at
>>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>>> at java.lang.Thread.run(Thread.java:744)
>>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
>>>  org.apache.ranger.security.listener.SpringEventListener
>>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>>> Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>>> user
>>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
>>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>>> loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
>>> requestId=10.10.10.53
>>>
>>> Thanks
>>> Mahesh.S
>>>
>>>
>>>
>>
>>
>>
>
>
> --
> Regards,
> Gautam.
>

Re: Hdfs agent not created

Posted by Gautam Borad <gb...@gmail.com>.
Hi Mahesh,
    We will need the namenode logs to debug this further. Can you restart
the namenode and paste its logs somewhere for us to analyze? Thanks.
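
For reference, a minimal sketch of that restart-and-collect step on a
tarball install of Apache Hadoop 2.x (HADOOP_HOME=/usr/local/hadoop and the
log file naming follow the steps earlier in this thread; adjust for your
layout):

export HADOOP_HOME=/usr/local/hadoop
# bounce only the NameNode
$HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
$HADOOP_HOME/sbin/hadoop-daemon.sh start namenode
# the NameNode log file is named after the service user and host
tail -n 200 $HADOOP_HOME/logs/hadoop-$USER-namenode-$(hostname).log
# a NameNode that actually loaded the plugin should mention the
# ranger/xasecure classes during startup
grep -icE "ranger|xasecure" $HADOOP_HOME/logs/hadoop-$USER-namenode-$(hostname).log

If that grep returns 0, the enable-hdfs-plugin.sh changes most likely never
reached the running NameNode, which would also explain why no agent shows
up under Audit -> Agents.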

On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <sa...@gmail.com>
wrote:

> Hi Ramesh,
>
>                   I didn't see any exceptions in the hdfs logs. My problem
> is that the agent for hdfs is not created.
>
> Regards,
> Mahesh.S
>
> On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com>
> wrote:
>
>> Hi Mahesh,
>>
>> The error you are seeing is just a notice that the parent folder of the
>> resource you are creating doesn't have read permission for the user for
>> whom you are creating the policy.
>>
>> When you start the hdfs namenode and secondarynamenode, do you see any
>> exceptions in the hdfs logs?
>>
>> Regards,
>> Ramesh
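
As a concrete aside to Ramesh's point: the message is a confirmation
prompt, not a failure. A sketch of how one might inspect the parent-folder
permissions it refers to (/user/mahesh below is a hypothetical stand-in for
the resource path named in the policy):

hdfs dfs -ls /            # mode and owner of the top-level folders
hdfs dfs -ls /user        # read bits on the parent of the policy path
# if the policy user really should read the parent, an HDFS superuser
# could grant it, e.g.
hdfs dfs -chmod o+r /user

Per the wording of the notice, answering Yes should still save the policy.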
>>
>> On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com>
>> wrote:
>>
>> Hi all,
>>
>> I successfully configured ranger admin,user sync.now am trying to
>> configure hdfs plugin.My steps are following,
>>
>> 1.Created repository testhdfs.
>> 2.cd /usr/local
>> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
>> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
>> 5.cd ranger-hdfs-plugin
>> 6.vi install.properties
>>   POLICY_MGR_URL=http://IP:6080
>>           REPOSITORY_NAME=testhdfs
>>           XAAUDIT.DB.HOSTNAME=localhost
>>           XAAUDIT.DB.DATABASE_NAME=ranger
>>           XAAUDIT.DB.USER_NAME=rangerlogger
>>           XAAUDIT.DB.PASSWORD=rangerlogger
>> 7.cd /usr/local/hadoop
>> 8.ln -s /usr/local/hadoop/etc/hadoop conf
>> 9.export HADOOP_HOME=/usr/local/hadoop
>> 10.cd /usr/local/ranger-hdfs-plugin
>> 11../enable-hdfs-plugin.sh
>> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
>> 13.vi xasecure-audit.xml
>>  <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>>                    <value>jdbc:mysql://localhost/ranger</value>
>>                    </property>
>>                    <property>
>>
>>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>>                    <value>rangerlogger</value>
>>                    </property>
>>                    <property>
>> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>>                    <value>rangerlogger</value>
>>                    </property>
>> 14.Restarted hadoop
>> when i see Ranger Admin Web interface -> Audit -> Agents
>> agent is not created.Am i missed any steps.
>>
>> *NOTE:I am not using HDP.*
>>
>> *here is my xa_portal.log*
>>
>> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_default.properties]
>> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_system.properties]
>> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_custom.properties]
>> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_ldap.properties]
>> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>> Unable to load native-hadoop library for your platform... using
>> builtin-java classes where applicable
>> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [db_message_bundle.properties]
>> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>>  org.apache.ranger.security.listener.SpringEventListener
>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
>> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>> user
>> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
>> requestId=10.10.10.53
>> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
>> create:nfsnobody@bigdata
>> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_default.properties]
>> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_system.properties]
>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_custom.properties]
>> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_ldap.properties]
>> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>> Unable to load native-hadoop library for your platform... using
>> builtin-java classes where applicable
>> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [db_message_bundle.properties]
>> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>>  org.apache.ranger.security.listener.SpringEventListener
>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
>> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>> user
>> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
>> requestId=10.10.10.53
>> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>>  org.apache.ranger.security.listener.SpringEventListener
>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
>> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>> user
>> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
>> requestId=10.10.10.53
>> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>> Login: security not enabled, using username
>> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>> RangerDaoManager.getEntityManager(loggingPU)
>> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
>> Login: security not enabled, using username
>> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
>> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
>> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
>> permission on parent folder. Do you want to save this policy?
>> javax.ws.rs.WebApplicationException
>> at
>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>> at
>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>> at
>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
>> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
>> at
>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>> at
>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>> at
>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>> at
>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>> at
>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>> at
>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>> at
>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at
>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>> at
>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>> at
>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>> at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> at
>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>> at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> at
>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>> at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>> at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>> at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>> at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>> at
>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>> at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>> at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>> at
>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>> at
>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>> at
>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>> at
>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>> at
>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> at
>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>> at
>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>> at
>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>> at
>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>> at
>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>> at
>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>> at
>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>> at
>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>> at
>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>> at
>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>> at
>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at
>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>> at java.lang.Thread.run(Thread.java:744)
>> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO
>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>> Validation error:logMessage=null,
>> response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1}
>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>> to save this policy?}
>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION}
>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>> }
>> javax.ws.rs.WebApplicationException
>> [duplicate stack trace trimmed; it repeats the createXResource trace above]
>> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO
>>  org.apache.ranger.security.listener.SpringEventListener
>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>> Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
>> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>> user
>> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>> loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655,
>> requestId=10.10.10.53
>> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_default.properties]
>> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_system.properties]
>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_custom.properties]
>> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [xa_ldap.properties]
>> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN
>>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
>> Unable to load native-hadoop library for your platform... using
>> builtin-java classes where applicable
>> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO
>>  org.springframework.core.io.support.PropertiesLoaderSupport
>> (PropertiesLoaderSupport.java:177) - Loading properties file from class
>> path resource [db_message_bundle.properties]
>> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO
>>  org.apache.ranger.security.listener.SpringEventListener
>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>> Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
>> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>> user
>> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>> loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A,
>> requestId=10.10.10.53
>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
>> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO
>>  org.apache.ranger.service.filter.RangerRESTAPIFilter
>> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
>> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR
>> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
>> RangerDaoManager.getEntityManager(loggingPU)
>> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO
>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
>> failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read
>> permission on parent folder. Do you want to save this policy?
>> javax.ws.rs.WebApplicationException
>> at
>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
>> at
>> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
>> at
>> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
>> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
>> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
>> at
>> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
>> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
>> at
>> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
>> at
>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
>> at
>> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
>> at
>> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
>> at
>> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
>> at
>> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at
>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>> at
>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>> at
>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>> at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> at
>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>> at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>> at
>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>> at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>> at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>> at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>> at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>> at
>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>> at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>> at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
>> at
>> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
>> at
>> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
>> at
>> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
>> at
>> org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
>> at
>> org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
>> at
>> org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>> at
>> org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> at
>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> at
>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
>> at
>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
>> at
>> org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
>> at
>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>> at
>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>> at
>> org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
>> at
>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>> at
>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>> at
>> org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
>> at
>> org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
>> at
>> org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at
>> org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>> at java.lang.Thread.run(Thread.java:744)
>> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO
>>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) -
>> Validation error:logMessage=null,
>> response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1}
>> msgDesc={Mahesh may not have read permission on parent folder. Do you want
>> to save this policy?}
>> messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION}
>> rbKey={xa.error.oper_no_permission} message={User doesn't have permission
>> to perform this operation} objectId={null} fieldName={parentPermission} }]}
>> }
>> javax.ws.rs.WebApplicationException
>> [duplicate stack trace trimmed; it repeats the updateXResource trace above]
>> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO
>>  org.apache.ranger.security.listener.SpringEventListener
>> (SpringEventListener.java:69) - Login Successful:admin | Ip
>> Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
>> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
>> user
>> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO
>>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
>> loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264,
>> requestId=10.10.10.53
>>
>> Thanks
>> Mahesh.S
>>
>>
>>
>
>
>


-- 
Regards,
Gautam.

Re: Hdfs agent not created

Posted by Mahesh Sankaran <sa...@gmail.com>.
Hi Ramesh,

                  I didn't see any exception in the HDFS logs. My problem is
that the agent for HDFS is not created.
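
For anyone reproducing this, a quick way to confirm whether the plugin loaded
at all (the paths assume the /usr/local/hadoop layout from my first mail and
the default Hadoop log file names, so adjust as needed) would be:

  # are the ranger/xasecure plugin jars visible on the Hadoop classpath?
  hadoop classpath | tr ':' '\n' | grep -i -e ranger -e xasecure

  # did the NameNode log anything from the plugin at startup?
  grep -i -e xasecure -e ranger /usr/local/hadoop/logs/hadoop-*-namenode-*.log | head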

Regards,
Mahesh.S

On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rm...@hortonworks.com> wrote:

> Hi Mahesh,
>
> The error you are seeing is just a notice that the parent folder of the
> resource you are creating doesn’t have read permission for the user for
> whom you are creating the policy.
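>
> For example, taking /data as a purely hypothetical policy parent folder,
> you could check its permission bits and, if it suits your setup, grant
> read on it:
>
>   hdfs dfs -ls /                # the /data entry shows the parent's permission bits
>   hdfs dfs -chmod o+r /data     # one way to grant read to other users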
>
> When you start the HDFS NameNode and SecondaryNameNode, do you see any
> exception in the HDFS logs?
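>
> Something like this, assuming the default log directory under
> $HADOOP_HOME/logs (adjust the path if your layout differs), would
> surface them:
>
>   grep -i -B1 -A3 exception $HADOOP_HOME/logs/hadoop-*-namenode-*.log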
>
> Regards,
> Ramesh
>
> On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com>
> wrote:
>
> Hi all,
>
> I successfully configured ranger admin,user sync.now am trying to
> configure hdfs plugin.My steps are following,
>
> 1.Created repository testhdfs.
> 2.cd /usr/local
> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
> 5.cd ranger-hdfs-plugin
> 6.vi install.properties
>   POLICY_MGR_URL=http://IP:6080
>           REPOSITORY_NAME=testhdfs
>           XAAUDIT.DB.HOSTNAME=localhost
>           XAAUDIT.DB.DATABASE_NAME=ranger
>           XAAUDIT.DB.USER_NAME=rangerlogger
>           XAAUDIT.DB.PASSWORD=rangerlogger
> 7.cd /usr/local/hadoop
> 8.ln -s /usr/local/hadoop/etc/hadoop conf
> 9.export HADOOP_HOME=/usr/local/hadoop
> 10.cd /usr/local/ranger-hdfs-plugin
> 11../enable-hdfs-plugin.sh
> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
> 13.vi xasecure-audit.xml
>  <property> <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>                    <value>jdbc:mysql://localhost/ranger</value>
>                    </property>
>                    <property>
>
>  <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>                    <value>rangerlogger</value>
>                    </property>
>                    <property>
> <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>                    <value>rangerlogger</value>
>                    </property>
> 14.Restarted hadoop
> when i see Ranger Admin Web interface -> Audit -> Agents
> agent is not created.Am i missed any steps.
>
> *NOTE:I am not using HDP.*
>
> *here is my xa_portal.log*
>
> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [xa_default.properties]
> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [xa_system.properties]
> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [xa_custom.properties]
> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [xa_ldap.properties]
> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN
>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
> Unable to load native-hadoop library for your platform... using
> builtin-java classes where applicable
> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [db_message_bundle.properties]
> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO
>  org.apache.ranger.security.listener.SpringEventListener
> (SpringEventListener.java:69) - Login Successful:admin | Ip
> Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO
>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
> user
> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO
>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
> loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12,
> requestId=10.10.10.53
> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO
>  org.apache.ranger.service.filter.RangerRESTAPIFilter
> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO
>  org.apache.ranger.service.filter.RangerRESTAPIFilter
> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO
>  org.apache.ranger.rest.UserREST (UserREST.java:186) -
> create:nfsnobody@bigdata
> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [xa_default.properties]
> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [xa_system.properties]
> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [xa_custom.properties]
> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [xa_ldap.properties]
> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN
>  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) -
> Unable to load native-hadoop library for your platform... using
> builtin-java classes where applicable
> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO
>  org.springframework.core.io.support.PropertiesLoaderSupport
> (PropertiesLoaderSupport.java:177) - Loading properties file from class
> path resource [db_message_bundle.properties]
> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO
>  org.apache.ranger.service.filter.RangerRESTAPIFilter
> (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO
>  org.apache.ranger.service.filter.RangerRESTAPIFilter
> (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO
>  org.apache.ranger.security.listener.SpringEventListener
> (SpringEventListener.java:69) - Login Successful:admin | Ip
> Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO
>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
> user
> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO
>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
> loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0,
> requestId=10.10.10.53
> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO
>  org.apache.ranger.security.listener.SpringEventListener
> (SpringEventListener.java:69) - Login Successful:admin | Ip
> Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO
>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid
> user
> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO
>  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success:
> loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD,
> requestId=10.10.10.53
> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO
>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
> Login: security not enabled, using username
> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN
>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR
> org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) -
> RangerDaoManager.getEntityManager(loggingPU)
> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO
>  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init
> Login: security not enabled, using username
> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN
>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN
>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN
>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN
>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN
>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN
>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN
>  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is
> a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO
>  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request
> failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read
> permission on parent folder. Do you want to save this policy?
> javax.ws.rs.WebApplicationException
> at
> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
> at
> org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
> at
> org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
> at
> org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
> at
> org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
> at
> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
> at
> org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
> at
> org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
> at
> org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
> at
> org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
> at
> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
> at
> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
> at
> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> at
> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
> at
> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> at
> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
> at
> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
> at
> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
> at
> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
> at
> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
> at
> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
> at
> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
> at
> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> at
> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at
> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at
> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at
> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at
> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
> at
> org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
> at
> org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at
> org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:744)
> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
> javax.ws.rs.WebApplicationException
> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
> at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
> at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:744)
> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655, requestId=10.10.10.53
> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A, requestId=10.10.10.53
> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
> javax.ws.rs.WebApplicationException
> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:744)
> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
> javax.ws.rs.WebApplicationException
> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
> at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
> at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
> at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
> at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
> at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
> at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
> at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
> at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
> at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
> at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
> at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
> at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
> at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
> at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
> at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
> at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
> at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
> at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:744)
> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264, requestId=10.10.10.53
>
> Thanks
> Mahesh.S

Re: Hdfs agent not created

Posted by Ramesh Mani <rm...@hortonworks.com>.
Hi Mahesh,

The error you are seeing is just a notice that the parent folder of the resource you are creating doesn't have read permission for the user for whom you are creating the policy.
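
You can double-check this from the shell if you want (a minimal sketch; /path/to/parent stands in for the parent of whatever resource path your policy uses):

    # show the permission bits on the parent directory itself (-d lists the dir, not its contents)
    hdfs dfs -ls -d /path/to/parent

    # if it is appropriate in your setup, one way to grant read+execute on the parent
    hdfs dfs -chmod o+rx /path/to/parent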

When you start the HDFS NameNode and Secondary NameNode, do you see any exceptions in the HDFS logs?
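
For instance, something like this can surface plugin-related startup problems (assuming the default log location under $HADOOP_HOME/logs; adjust the path to your layout):

    # scan the NameNode log for recent errors or Ranger/XASecure messages
    grep -iE 'exception|error|xasecure|ranger' $HADOOP_HOME/logs/hadoop-*-namenode-*.log | tail -n 50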

Regards,
Ramesh

On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sa...@gmail.com> wrote:

> Hi all,
> 
> I successfully configured Ranger admin and usersync. Now I am trying to configure the HDFS plugin. My steps are the following:
> 
> 1.Created repository testhdfs.
> 2.cd /usr/local
> 3.sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
> 4.sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
> 5.cd ranger-hdfs-plugin
> 6.vi install.properties
>           POLICY_MGR_URL=http://IP:6080
>           REPOSITORY_NAME=testhdfs
>           XAAUDIT.DB.HOSTNAME=localhost
>           XAAUDIT.DB.DATABASE_NAME=ranger
>           XAAUDIT.DB.USER_NAME=rangerlogger
>           XAAUDIT.DB.PASSWORD=rangerlogger
> 7.cd /usr/local/hadoop
> 8.ln -s /usr/local/hadoop/etc/hadoop conf
> 9.export HADOOP_HOME=/usr/local/hadoop
> 10.cd /usr/local/ranger-hdfs-plugin
> 11. ./enable-hdfs-plugin.sh
> 12.cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
> 13.vi xasecure-audit.xml
>                    <property>
>                    <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
>                    <value>jdbc:mysql://localhost/ranger</value>
>                    </property>
>                    <property>
>                    <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
>                    <value>rangerlogger</value>
>                    </property>
>                    <property>
>                    <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
>                    <value>rangerlogger</value>
>                    </property>
> 14.Restarted hadoop
> When I look at Ranger Admin Web interface -> Audit -> Agents,
> the agent is not created. Did I miss any steps?
> 
> NOTE: I am not using HDP.
> 
> here is my xa_portal.log
> 
> 2015-01-13 15:16:45,901 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
> 2015-01-13 15:16:45,932 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
> 2015-01-13 15:16:45,965 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
> 2015-01-13 15:16:45,978 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
> 2015-01-13 15:16:46,490 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2015-01-13 15:16:47,417 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
> 2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
> 2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12, requestId=10.10.10.53
> 2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
> 2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
> 2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO  org.apache.ranger.rest.UserREST (UserREST.java:186) - create:nfsnobody@bigdata
> 2015-01-13 15:30:32,173 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
> 2015-01-13 15:30:32,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
> 2015-01-13 15:30:32,180 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
> 2015-01-13 15:30:33,049 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2015-01-13 15:30:34,179 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
> 2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
> 2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
> 2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
> 2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0, requestId=10.10.10.53
> 2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
> 2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD, requestId=10.10.10.53
> 2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
> 2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
> 2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO  apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
> 2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN  org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
> 2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
> javax.ws.rs.WebApplicationException
> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
> 	at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
> 	at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
> 	at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
> 	at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
> 	at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
> 	at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
> 	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
> 	at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
> 	at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
> 	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
> 	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> 	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> 	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
> 	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
> 	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
> 	at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
> 	at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
> 	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
> 	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> 	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
> 	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
> 	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
> 	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
> 	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> 	at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
> 	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> 	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
> 	at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
> 	at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
> 	at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> 	at java.lang.Thread.run(Thread.java:744)
> 2015-01-13 16:03:57,179 [http-bio-6080-exec-8] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@1ac512d2statusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@56a6b9name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
> javax.ws.rs.WebApplicationException
> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
> 	at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
> 	at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
> 	at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
> 	at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
> 	at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
> 	at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
> 	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
> 	at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
> 	at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
> 	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
> 	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> 	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> 	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
> 	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> 	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
> 	at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
> 	at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
> 	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
> 	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> 	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
> 	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
> 	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
> 	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
> 	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> 	at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
> 	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> 	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
> 	at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
> 	at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
> 	at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> 	at java.lang.Thread.run(Thread.java:744)
> 2015-01-13 16:05:21,715 [http-bio-6080-exec-2] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=75F19182D1B525A6F2CB13497730A655
> 2015-01-13 16:05:21,718 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 16:05:23,093 [http-bio-6080-exec-2] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=13, sessionId=75F19182D1B525A6F2CB13497730A655, requestId=10.10.10.53
> 2015-01-13 16:14:23,673 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
> 2015-01-13 16:14:23,678 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
> 2015-01-13 16:14:23,679 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
> 2015-01-13 16:14:24,064 [localhost-startStop-1] WARN  org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2015-01-13 16:14:24,666 [localhost-startStop-1] INFO  org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
> 2015-01-13 16:14:40,338 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A
> 2015-01-13 16:14:41,539 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 16:14:43,320 [http-bio-6080-exec-4] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=14, sessionId=EA5C57A3BE8D17A77D4163D3CE14A20A, requestId=10.10.10.53
> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
> 2015-01-13 16:14:43,602 [http-bio-6080-exec-4] INFO  org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
> 2015-01-13 16:14:47,055 [http-bio-6080-exec-6] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
> 2015-01-13 16:16:07,630 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=14, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
> javax.ws.rs.WebApplicationException
> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
> 	at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
> 	at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
> 	at org.apache.ranger.biz.AssetMgr.updateXResource(AssetMgr.java:377)
> 	at org.apache.ranger.rest.AssetREST.updateXResource(AssetREST.java:223)
> 	at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
> 	at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
> 	at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
> 	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
> 	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
> 	at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
> 	at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$9fb5361d.updateXResource(<generated>)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
> 	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
> 	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> 	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
> 	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
> 	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
> 	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
> 	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
> 	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> 	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
> 	at org.apache.ranger.security.web.filter.RangerSecurityContextFormationFilter.doFilter(RangerSecurityContextFormationFilter.java:130)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
> 	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
> 	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
> 	at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
> 	at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
> 	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
> 	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
> 	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> 	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> 	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
> 	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
> 	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
> 	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
> 	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> 	at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
> 	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> 	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
> 	at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
> 	at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
> 	at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> 	at java.lang.Thread.run(Thread.java:744)
> 2015-01-13 16:16:07,634 [http-bio-6080-exec-6] INFO  org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:265) - Validation error:logMessage=null, response=VXResponse={org.apache.ranger.view.VXResponse@42f1d50bstatusCode={1} msgDesc={Mahesh may not have read permission on parent folder. Do you want to save this policy?} messageList={[VXMessage={org.apache.ranger.view.VXMessage@12d9e783name={OPER_NO_PERMISSION} rbKey={xa.error.oper_no_permission} message={User doesn't have permission to perform this operation} objectId={null} fieldName={parentPermission} }]} }
> javax.ws.rs.WebApplicationException
> 	[stack trace identical to the one above]
> 2015-01-13 16:18:03,024 [http-bio-6080-exec-3] INFO  org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=DA9EE1C6D1C94EDACD127EA8D4503264
> 2015-01-13 16:18:03,028 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
> 2015-01-13 16:18:04,385 [http-bio-6080-exec-3] INFO  org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=15, sessionId=DA9EE1C6D1C94EDACD127EA8D4503264, requestId=10.10.10.53
> 
> Thanks
> Mahesh.S
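
From the quoted log, two separate issues are visible. The ERROR
"RangerDaoManager.getEntityManager(loggingPU)" at 16:14:47 means the admin
portal could not obtain an entity manager for its audit persistence unit,
which typically points at the audit database connection. The later
WebApplicationException is not a crash: XResourceService.checkAccess raised
OPER_NO_PERMISSION because the user named in the policy ("Mahesh") appears
to lack read permission on the resource's parent folder, and the portal
surfaces that as the confirmation prompt "Do you want to save this policy?".

Two quick checks, sketched in shell. The host, database name, user, and
HDFS paths below are placeholders, not values from this thread; substitute
your own.

  # 1) Confirm the audit DB is reachable with the JDBC credentials
  #    configured for auditing (placeholder host/user/db):
  mysql -h localhost -u <audit_user> -p <audit_db> -e 'show tables;'

  # 2) Inspect the parent folder's permission bits in HDFS and, if HDFS
  #    ACLs are enabled, grant the policy user read+execute on it before
  #    re-saving the policy (placeholder path; adjust the user as needed):
  hdfs dfs -ls -d /path/to/parent
  sudo -u hdfs hdfs dfs -setfacl -m user:mahesh:r-x /path/to/parent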

