Posted to common-user@hadoop.apache.org by "Ananth T. Sarathy" <an...@gmail.com> on 2009/11/19 19:28:45 UTC

Access Error

I just set up a Hadoop cluster. When I try to write to it from my Java
code, I get the error below. When using the core-site.xml, do I need to
specify a user?



org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=DrWho, access=WRITE, inode="":root:supergroup:rwxr-xr-x
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2647)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:463)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:195)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:479)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:460)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:367)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:359)
    at com.iswcorp.hadoop.HadoopDataManagerImpl.writeToFile(HadoopDataManagerImpl.java:151)
    at com.iswcorp.hadoop.HadoopDataManagerImpl.main(HadoopDataManagerImpl.java:46)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=DrWho, access=WRITE, inode="":root:supergroup:rwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:176)
    at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:157)
    at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.checkPermission(PermissionChecker.java:105)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4545)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4515)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1023)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:977)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:389)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

    at org.apache.hadoop.ipc.Client.call(Client.java:739)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy0.create(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy0.create(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2644)
    ... 8 more
Ananth T Sarathy

Re: Access Error

Posted by Ben Hardy <be...@gmail.com>.
Hi Steve! Long time no see :-)

On Wed, Oct 6, 2010 at 4:25 PM, Steve Kuo <ku...@gmail.com> wrote:

> This could be caused by different user accounts.
>
> Is the user "hadoop" when running the job on the master and "bhardy" on
> the remote client?
>

Actually, this is running as a user called bhardy on both the master (a box
running CentOS with CDH) and the remote client (a Mac).

The client would be getting the username from the local Mac environment,
while the master is set up to authenticate users against LDAP.

I believe both work by running 'whoami', right? So it shouldn't matter where
the username comes from.
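
For what it's worth, this is roughly the check I have in mind (just a
sketch; the path is my HDFS home directory):

$ whoami                      # run on both the client and the master
$ hadoop fs -ls /user/bhardy  # see which owner and group HDFS reports

If both sides report bhardy and the directory is owned by bhardy, the
source of the username shouldn't matter.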

cheers
b

Re: Access Error

Posted by Steve Kuo <ku...@gmail.com>.
This could be caused by different user accounts.

Is the user "hadoop" when running the job on the master and "bhardy" on the
remote client?

Re: Access Error

Posted by Ben Hardy <be...@gmail.com>.
I'm getting this same error when running Hadoop jobs from a remote client,
but it works fine on the master.

It looks like an HDFS permission issue, but HDFS file operations from the
remote client also work just fine. Very odd!

The error doesn't happen when running jobs from the master. I'm using CDH 3
beta.

It seems to be complaining about my user not being able to write to a
directory that is owned and writable by my user, unless I'm misreading the
error.

Example:

# copyFromLocal works...
$ export HDFS_ROOT=hdfs://hadoop0001:54310
$ hadoop fs -copyFromLocal run.counter $HDFS_ROOT/user/bhardy/
# no complaints

# normal hadoop jobs don't
$ hadoop jar Processor-jobjar-1.7.jar   com.mycorp.ProcessorJob \
>   $HDFS_ROOT/user/bhardy/pdp-intermediate \
>   $HDFS_ROOT/user/bhardy/pdp-intermediate2
10/10/06 15:08:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
Exception in thread "main" org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=bhardy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:914)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:262)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1120)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:264)
at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:573)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:730)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1249)
at com.eharmony.matching.offline.pdp.phase2.PdpPhase2Job.run(PdpPhase2Job.java:67)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.eharmony.matching.offline.pdp.phase2.PdpPhase2Job.main(PdpPhase2Job.java:85)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=bhardy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:176)


The client has the following properties set, and the cluster is pretty much
using the defaults.

 fs.default.name=hdfs://hadoop0001:54310/
 dfs.datanode.address=hdfs://hadoop0001:54310/
 mapred.job.tracker=hadoop0002:54311

My question is: where would I even start looking to diagnose this
exception?

In my hdfs-site.xml, dfs.permissions is not explicitly set to anything. In
any case, even if it defaults to true, that should be fine: I can run
whoami with no problem.

I'm confused as to why I get AccessControlExceptions for Hadoop jobs but
not DFS operations when running from the remote client. Hadoop jobs work
fine when run from the master. I'm sure this is some trivial configuration
problem.
Any suggestions?

thanks
b


On Thu, Nov 19, 2009 at 3:59 PM, Y G <gy...@gmail.com> wrote:

> You can run your MR program under the corresponding *nix account; make
> sure it is the same as the user and group on your HDFS directory.
> Or you can turn off the HDFS permission configuration.
>
> 2009/11/20, Ananth T. Sarathy <an...@gmail.com>:
> > I just set up a Hadoop cluster. When I try to write to it from my Java
> > code, I get the error below. When using the core-site.xml, do I need to
> > specify a user?
> >
> > [stack trace snipped; identical to the trace at the top of this thread]
> >
> > Ananth T Sarathy
> >
>
> --
> Sent from my mobile device
>
> -----
> Happy every day
> Good health
>

Re: Access Error

Posted by Y G <gy...@gmail.com>.
You can run your MR program under the corresponding *nix account; make sure
it is the same as the user and group on your HDFS directory.
Or you can turn off the HDFS permission configuration.
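
For example (a rough sketch only; the jar and main class names are
placeholders, and root is the owner shown in the original error):

$ # run the job as the *nix account that owns the HDFS path
$ sudo -u root hadoop jar your-job.jar com.example.YourJob ...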

2009/11/20, Ananth T. Sarathy <an...@gmail.com>:
> I just set up a Hadoop cluster. When I try to write to it from my Java
> code, I get the error below. When using the core-site.xml, do I need to
> specify a user?
>
> [stack trace snipped; identical to the trace at the top of this thread]
>
> Ananth T Sarathy
>

-- 
Sent from my mobile device

-----
Happy every day
Good health

RE: Access Error

Posted by "Habermaas, William" <Wi...@fatwire.com>.
The version of Hadoop you are using dictates where this parameter goes.

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>

On my Hadoop 0.20.1 installation, this goes in conf/hdfs-site.xml.

Make sure you make the same change on all the nodes.
After you make the change, you'll have to stop and restart the cluster
before it takes effect.
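
For example, with the stock control scripts (a sketch; assumes HADOOP_HOME
points at your installation):

$ # after editing conf/hdfs-site.xml on every node, run from the master:
$ $HADOOP_HOME/bin/stop-all.sh
$ $HADOOP_HOME/bin/start-all.sh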

Bill 

-----Original Message-----
From: Ananth T. Sarathy [mailto:ananth.t.sarathy@gmail.com] 
Sent: Thursday, November 19, 2009 2:55 PM
To: common-user@hadoop.apache.org
Subject: Re: Access Error

How do I turn it off?
Ananth T Sarathy


On Thu, Nov 19, 2009 at 1:49 PM, Habermaas, William <
William.Habermaas@fatwire.com> wrote:

> Hadoop will perform a 'whoami' to identify the user that is making the
> HDFS request.  If you have not turned off file permissions in the Hadoop
> configuration, the user name will be matched to the permission settings
> related to the path you are going after.  Think of it as a mechanism
> similar (but not exactly the same) to Unix/Linux file system permissions.
> You can use the hadoop fs commands to tweak the permissions, or you can
> simply turn checking off for the whole HDFS cluster.
>
> Bill
> -----Original Message-----
> From: Ananth T. Sarathy [mailto:ananth.t.sarathy@gmail.com]
> Sent: Thursday, November 19, 2009 1:29 PM
> To: common-user@hadoop.apache.org
> Subject: Access Error
>
> I just set up a Hadoop cluster. When I try to write to it from my Java
> code, I get the error below. When using the core-site.xml, do I need to
> specify a user?
>
> [stack trace snipped; identical to the trace at the top of this thread]
>
> Ananth T Sarathy
>

Re: Access Error

Posted by "Ananth T. Sarathy" <an...@gmail.com>.
How do I turn it off?
Ananth T Sarathy


On Thu, Nov 19, 2009 at 1:49 PM, Habermaas, William <
William.Habermaas@fatwire.com> wrote:

> Hadoop will perform a 'whoami' to identify the user that is making the
> HDFS request.  If you have not turned off file permissions in the Hadoop
> configuration, the user name will be matched to the permission settings
> related to the path you are going after.  Think of it as a mechanism
> similar (but not exactly the same) to Unix/Linux file system permissions.
> You can use the hadoop fs commands to tweak the permissions, or you can
> simply turn checking off for the whole HDFS cluster.
>
> Bill
> -----Original Message-----
> From: Ananth T. Sarathy [mailto:ananth.t.sarathy@gmail.com]
> Sent: Thursday, November 19, 2009 1:29 PM
> To: common-user@hadoop.apache.org
> Subject: Access Error
>
> I just set up a Hadoop cluster. When I try to write to it from my Java
> code, I get the error below. When using the core-site.xml, do I need to
> specify a user?
>
> [stack trace snipped; identical to the trace at the top of this thread]
>
> Ananth T Sarathy
>

RE: Access Error

Posted by "Habermaas, William" <Wi...@fatwire.com>.
Hadoop will perform a 'whoami' to identify the user that is making the
HDFS request.  If you have not turned off file permissions in the Hadoop
configuration, the user name will be matched to the permission settings
related to the path you are going after.  Think of it as a mechanism
similar (but not exactly the same) to Unix/Linux file system permissions.
You can use the hadoop fs commands to tweak the permissions, or you can
simply turn checking off for the whole HDFS cluster.
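
For example (illustrative only; "DrWho" is the user from the error in this
thread, and these assume you run them as the HDFS superuser):

$ hadoop fs -ls /                     # inspect owner, group and mode
$ hadoop fs -mkdir /user/DrWho        # create a home directory for the user
$ hadoop fs -chown DrWho /user/DrWho  # hand it to the requesting user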

Bill
-----Original Message-----
From: Ananth T. Sarathy [mailto:ananth.t.sarathy@gmail.com] 
Sent: Thursday, November 19, 2009 1:29 PM
To: common-user@hadoop.apache.org
Subject: Access Error

I just set up a Hadoop cluster. When I try to write to it from my Java
code, I get the error below. When using the core-site.xml, do I need to
specify a user?



[stack trace snipped; identical to the trace at the top of this thread]

Ananth T Sarathy