Posted to hdfs-user@hadoop.apache.org by ch huang <ju...@gmail.com> on 2014/07/22 08:14:50 UTC
issue about run MR job use system user in CDH5
hi, maillist:
I set up a CDH5 YARN cluster and set the following option in my
mapred-site.xml file:
<property>
  <name>yarn.app.mapreduce.am.staging-dir</name>
  <value>/data</value>
</property>
The MapReduce history server puts its history directory under /data, but
if I submit an MR job as a different user, I get an error. Adding the user
to the hadoop group doesn't help either. Why? How can I fix this? Thanks.
2014-07-22 14:07:06,734 INFO [main] mapreduce.TableOutputFormat: Created table instance for test_1
2014-07-22 14:07:06,765 WARN [main] security.UserGroupInformation: PriviledgedActionException as:hbase (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=EXECUTE, inode="/data":mapred:hadoop:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:168)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5490)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3499)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:764)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:764)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1980)

Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=EXECUTE, inode="/data":mapred:hadoop:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:168)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5490)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3499)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:764)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:764)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1980)
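[Editor's note: the checkTraverse frame in the trace is the HDFS traverse check: to reach anything under /data, the calling user needs the EXECUTE bit on /data itself. With owner mapred, group hadoop and mode drwxrwx--- (770), user hbase passes only if the NameNode resolves hbase into the hadoop group. A minimal bash sketch of that check follows; the owner/group/mode values come from the error message, while the group lists passed for hbase are hypothetical examples.]

```shell
#!/usr/bin/env bash
# Sketch of the permission test behind the AccessControlException above.
# Mirrors POSIX-style evaluation: owner bits, else group bits, else other bits.
can_execute() {
  local user=$1 groups=$2 owner=$3 group=$4 mode=$5   # mode as octal, e.g. 770
  local u=$(( (8#$mode / 64) & 1 ))   # owner execute bit
  local g=$(( (8#$mode / 8)  & 1 ))   # group execute bit
  local o=$(( 8#$mode        & 1 ))   # other execute bit
  if [ "$user" = "$owner" ]; then
    [ "$u" -eq 1 ]
  elif [[ " $groups " == *" $group "* ]]; then
    [ "$g" -eq 1 ]
  else
    [ "$o" -eq 1 ]
  fi
}

# hbase is not the owner and (as the NameNode sees it) not in group hadoop:
can_execute hbase "hbase" mapred hadoop 770 && echo allowed || echo denied
# if the NameNode resolved hbase into the hadoop group, the check would pass:
can_execute hbase "hbase hadoop" mapred hadoop 770 && echo allowed || echo denied
```

Note that HDFS resolves group membership on the NameNode host, not on the client that submits the job; that is a plausible reason adding the user to the hadoop group "also no use" here. After changing groups on the NameNode host, `hdfs dfsadmin -refreshUserToGroupsMappings` tells the NameNode to pick up the new mapping.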
Re: issue about run MR job use system user in CDH5
Posted by ch huang <ju...@gmail.com>.
I changed the owner of the /data directory to hdfs; it works now.
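[Editor's note: changing the owner is one fix; two other common options are sketched below as a dry run. All of these are hypothetical commands to adapt, not a prescription: they need HDFS superuser rights, and the right choice depends on your security policy.]

```shell
# Dry-run sketch of possible fixes for the /data staging directory.
# run() only prints the command; on a real cluster, drop the echo wrapper
# and execute as the HDFS superuser.
run() { echo "+ $*"; }

# Option 1: hand /data to the hdfs superuser (what was done in this thread):
run hdfs dfs -chown hdfs:hadoop /data
# Option 2: open the staging root like /tmp (world-writable with sticky bit),
# which is how the default staging directory under /tmp is typically set up:
run hdfs dfs -chmod 1777 /data
# Option 3: keep 770 but make sure the submitting user's groups are what the
# NameNode sees (group lookup happens on the NameNode host):
run hdfs groups hbase
run hdfs dfsadmin -refreshUserToGroupsMappings
```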
On Tue, Jul 22, 2014 at 2:44 PM, Alexander Alten-Lorenz <wget.null@gmail.com> wrote:
> Please post vendor-specific questions to the vendor's mailing list:
> https://groups.google.com/a/cloudera.org/forum/#!forum/cdh-user
>
> Look closer at:
> security.UserGroupInformation: PriviledgedActionException as:hbase (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=EXECUTE, inode="/data":mapred:hadoop:drwxrwx---
>
> /data doesn't have the proper permissions.
>
> - Alex
>
> ------ Original message ------
> From: "ch huang" <ju...@gmail.com>
> To: user@hadoop.apache.org
> Sent: 22.07.2014 08:14:50
> Subject: issue about run MR job use system user in CDH5
Re: issue about run MR job use system user in CDH5
Posted by Alexander Alten-Lorenz <wg...@gmail.com>.
Please post vendor-specific questions to the vendor's mailing list:
https://groups.google.com/a/cloudera.org/forum/#!forum/cdh-user

Look closer at:
security.UserGroupInformation: PriviledgedActionException as:hbase (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=EXECUTE, inode="/data":mapred:hadoop:drwxrwx---

/data doesn't have the proper permissions.
- Alex
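[Editor's note: the quickest way to confirm this diagnosis is to look at the directory entry itself with `hdfs dfs -ls -d /data`. The sketch below parses a sample output line (hard-coded here, since the real command needs a live cluster; the sample matches the owner, group, and mode in the error) and pulls out the fields the error message refers to.]

```shell
# Sample line in the shape `hdfs dfs -ls -d /data` would print
# (hard-coded; fields match the inode details in the exception above):
line='drwxrwx---   - mapred hadoop          0 2014-07-22 14:07 /data'
set -- $line                 # split on whitespace into $1, $2, ...
mode=$1 owner=$3 group=$4
echo "mode=$mode owner=$owner group=$group"
# The trailing '---' means users outside the owner and group get no EXECUTE
# on /data, which is exactly the AccessControlException in this thread.
```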
------ Original message ------
From: "ch huang" <ju...@gmail.com>
To: user@hadoop.apache.org
Sent: 22.07.2014 08:14:50
Subject: issue about run MR job use system user in CDH5
> at
>org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1980)
>
>
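The denial in the trace above is the standard HDFS permission check: traversing into /data requires the EXECUTE bit on that directory for the caller's applicable class (owner, group, or other). A minimal sketch of that check (illustrative Python, not the actual FSPermissionChecker code; the function and names are hypothetical):

```python
# Sketch of the POSIX-style check HDFS applies when it reports
# access=EXECUTE on inode "/data":mapred:hadoop:drwxrwx---.
# Exactly one class applies: owner first, then group, then other.

def may_traverse(user, user_groups, owner, group, mode):
    """Return True if `user` may traverse a directory with `mode` (e.g. 0o770)."""
    if user == owner:
        return bool(mode & 0o100)   # owner execute bit
    if group in user_groups:
        return bool(mode & 0o010)   # group execute bit
    return bool(mode & 0o001)       # other execute bit

# /data is mapred:hadoop drwxrwx--- (0o770): user hbase, not in group
# hadoop, falls into "other" and is denied -- the error in the log above.
print(may_traverse("hbase", {"hbase"}, "mapred", "hadoop", 0o770))            # False
# If hbase is in the hadoop group *as the NameNode resolves groups*, the
# check passes -- adding the user to a group only helps if the NameNode
# host actually sees that membership.
print(may_traverse("hbase", {"hbase", "hadoop"}, "mapred", "hadoop", 0o770))  # True
```

This also suggests why "i add the user to hadoop group also no use": HDFS group membership is resolved on the NameNode, so a group change made only on the client host has no effect.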
Re: issue about run MR job use system user in CDH5
Posted by Alexander Alten-Lorenz <wg...@gmail.com>.
Please post vendor-specific questions to the vendor's mailing list:
https://groups.google.com/a/cloudera.org/forum/#!forum/cdh-user
Look closer at:
security.UserGroupInformation: PriviledgedActionException as:hbase
(auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException:
Permission denied: user=hbase, access=EXECUTE,
inode="/data":mapred:hadoop:drwxrwx---
/data does not have the proper permissions.
- Alex
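One way to follow up on this diagnosis (not from the thread; standard HDFS/POSIX commands, assuming it is acceptable to let other users traverse /data) is to grant the EXECUTE bit to "other". The local demonstration below shows the same bit change with plain `chmod`/`stat`; the analogous HDFS commands are in the comments:

```shell
# Analogous HDFS fix, run as the HDFS superuser (assumption: opening /data
# to other users is acceptable on this cluster):
#   sudo -u hdfs hdfs dfs -chmod o+x /data    # allow traversal only
# Local demonstration of the same bit change:
mkdir -p /tmp/permdemo/data
chmod 770 /tmp/permdemo/data       # drwxrwx--- : "other" cannot traverse
chmod o+x /tmp/permdemo/data       # grant EXECUTE (traversal) to others
stat -c %a /tmp/permdemo/data      # → 771
```

With mode 771, a user outside the owning group can traverse the directory but still cannot list or write it.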
------ Original message ------
From: "ch huang" <ju...@gmail.com>
To: user@hadoop.apache.org
Sent: 22.07.2014 08:14:50
Subject: issue about run MR job use system user in CDH5