Posted to common-user@hadoop.apache.org by stephen mulcahy <st...@deri.org> on 2009/06/26 13:40:27 UTC
Permissions needed to run RandomWriter ?
Hi,
I've just installed a new test cluster and I'm trying to give it a quick
smoke test with RandomWriter and Sort.
I can run these fine with the superuser account. When I try to run them
as another user I run into problems even though I've created the output
directory and given permissions to the other user to write to this
directory. i.e.
1. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo
mkdir: org.apache.hadoop.fs.permission.AccessControlException:
Permission denied: user=smulcahy, access=WRITE,
inode="":hadoop:supergroup:rwxr-xr-x
OK - we don't have permissions anyways
2. hadoop@hadoop01:/$ hadoop fs -mkdir /foo
OK
3. hadoop fs -chown -R smulcahy /foo
OK
4. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo/test
OK
5. smulcahy@hadoop01:~$ hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar
randomwriter /foo
java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1704)
at java.io.File.createTempFile(File.java:1793)
at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
Any suggestions on why step 5 is failing even though I have write
permissions to /foo - do I need permissions on some other directory also
or ... ?
Thanks,
-stephen
--
Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
http://di2.deri.ie http://webstar.deri.ie http://sindice.com
Re: Permissions needed to run RandomWriter ?
Posted by stephen mulcahy <st...@deri.org>.
Hi,
I think I've figured this out. It may be obvious to others, but the user
I was trying to run the sort as didn't have write access to
hadoop.tmp.dir.
I did the following:
sudo adduser smulcahy hadoop
sudo chmod -R g+w /data1/hadoop-tmp/
and it seems to have started successfully now.
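The underlying fix is plain Unix group-write permissions; here is a quick local sketch, with a throwaway mktemp directory standing in for the real /data1/hadoop-tmp/:

```shell
# Grant group write on a scratch directory, then confirm the mode bits.
# (Illustrative only -- the real target was hadoop.tmp.dir.)
scratch=$(mktemp -d)       # mktemp creates the directory with mode 0700
chmod -R g+w "$scratch"    # same flag as used on /data1/hadoop-tmp/ above
ls -ld "$scratch"          # group triplet now includes 'w', e.g. drwx-w----
rm -rf "$scratch"
```

Any user in the directory's group (members of the 'hadoop' group on the real cluster) can then create files under it.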
From a configuration perspective, should dfs.permissions.supergroup
default to 'hadoop' rather than supergroup? And should hadoop-tmp be
created with default group-write access? Or is the hadoop group intended
for other purposes (are there reasons why you might not want normal
hadoop users being members of this group)?
Thanks,
-stephen
--
Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
http://di2.deri.ie http://webstar.deri.ie http://sindice.com
Re: Permissions needed to run RandomWriter ?
Posted by stephen mulcahy <st...@deri.org>.
Alex Loddengaard wrote:
> Make sure /user/smulcahy exists in HDFS. Also, make sure that
> /hadoop/mapred/system in HDFS is 733 and owned by hadoop:supergroup.
>
> Let me know if this doesn't work for you. Also, what version of Hadoop are
> you running?
Hi Alex,
Thanks for your patience!
We're actually running 0.18.3-4cloudera0.3.0~intrepid on this specific
cluster.
/user/smulcahy does not exist in HDFS. I ran the following to create it:
hadoop fs -mkdir /user/smulcahy
hadoop fs -chown -R smulcahy:smulcahy /user/smulcahy
but the sort still fails like so:
hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar randomwriter /user/smulcahy
java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1704)
at java.io.File.createTempFile(File.java:1793)
at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
The other directory you mentioned does exist, but in a slightly
different location
hadoop fs -ls /data1/hadoop-tmp/mapred
Found 1 items
drwx-wx-wx - hadoop supergroup 0 2009-06-30 11:49
/data1/hadoop-tmp/mapred/system
Is that a problem?
Thanks,
-stephen
>
> Hope this helps!
>
> Alex
>
> On Mon, Jun 29, 2009 at 1:11 AM, stephen mulcahy
> <st...@deri.org> wrote:
>
>> Alex Loddengaard wrote:
>>
>>> Have you tried to run the example job as the superuser? It seems like
>>> this
>>> might be an issue where hadoop.tmp.dir doesn't have the correct
>>> permissions. hadoop.tmp.dir and dfs.data.dir should be owned by the Unix
>>> user running your Hadoop daemons and be owner-writable and readable.
>>>
>>> Can you confirm this is the case? Thanks,
>>>
>> Hi Alex,
>>
>> The RandomWriter example runs without any problems when run as the hadoop
>> user (i.e. the superuser / user that runs the hadoop daemons).
>>
>> hadoop.tmp.dir permissions
>>
>> smulcahy@hadoop01:~$ ls -la /data1/hadoop-tmp/
>> total 16
>> drwxr-xr-x 4 hadoop hadoop 4096 2009-06-19 14:01 .
>> drwxr-xr-x 5 root root 4096 2009-06-19 10:12 ..
>> drwxr-xr-x 4 hadoop hadoop 4096 2009-06-19 10:16 dfs
>> drwxr-xr-x 3 hadoop hadoop 4096 2009-06-19 10:49 mapred
>>
>>
>>
>> smulcahy@hadoop01:~$ ls -la /data?/hdfs
>> /data1/hdfs:
>> total 8
>> drwxr-xr-x 2 hadoop hadoop 4096 2009-06-19 10:12 .
>> drwxr-xr-x 5 root root 4096 2009-06-19 10:12 ..
>>
>> /data2/hdfs:
>> total 8
>> drwxr-xr-x 2 hadoop hadoop 4096 2009-06-19 10:12 .
>> drwxr-xr-x 4 root root 4096 2009-06-19 10:12 ..
>>
>> Does hadoop.tmp.dir need to be writeable by all users running hadoop jobs?
>>
>>
>> -stephen
>>
>> --
>> Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
>> NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
>> http://di2.deri.ie http://webstar.deri.ie http://sindice.com
>>
>
--
Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
http://di2.deri.ie http://webstar.deri.ie http://sindice.com
Re: Permissions needed to run RandomWriter ?
Posted by Alex Loddengaard <al...@cloudera.com>.
Make sure /user/smulcahy exists in HDFS. Also, make sure that
/hadoop/mapred/system in HDFS is 733 and owned by hadoop:supergroup.
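For what it's worth, mode 733 means rwx for the owner and write+execute (but no read) for everyone else, so any user can create entries in the mapred system dir without being able to list it. The same mode illustrated on a local scratch directory (the real command would be `hadoop fs -chmod 733 ...` against HDFS):

```shell
# Show what mode 733 looks like on a directory: owner rwx, group/other -wx.
d=$(mktemp -d)
chmod 733 "$d"
stat -c '%a %A' "$d"    # 733 drwx-wx-wx
rm -rf "$d"
```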
Let me know if this doesn't work for you. Also, what version of Hadoop are
you running?
Hope this helps!
Alex
On Mon, Jun 29, 2009 at 1:11 AM, stephen mulcahy
<st...@deri.org> wrote:
> Alex Loddengaard wrote:
>
>> Have you tried to run the example job as the superuser? It seems like
>> this
>> might be an issue where hadoop.tmp.dir doesn't have the correct
>> permissions. hadoop.tmp.dir and dfs.data.dir should be owned by the Unix
>> user running your Hadoop daemons and be owner-writable and readable.
>>
>> Can you confirm this is the case? Thanks,
>>
>
> Hi Alex,
>
> The RandomWriter example runs without any problems when run as the hadoop
> user (i.e. the superuser / user that runs the hadoop daemons).
>
> hadoop.tmp.dir permissions
>
> smulcahy@hadoop01:~$ ls -la /data1/hadoop-tmp/
> total 16
> drwxr-xr-x 4 hadoop hadoop 4096 2009-06-19 14:01 .
> drwxr-xr-x 5 root root 4096 2009-06-19 10:12 ..
> drwxr-xr-x 4 hadoop hadoop 4096 2009-06-19 10:16 dfs
> drwxr-xr-x 3 hadoop hadoop 4096 2009-06-19 10:49 mapred
>
>
>
> smulcahy@hadoop01:~$ ls -la /data?/hdfs
> /data1/hdfs:
> total 8
> drwxr-xr-x 2 hadoop hadoop 4096 2009-06-19 10:12 .
> drwxr-xr-x 5 root root 4096 2009-06-19 10:12 ..
>
> /data2/hdfs:
> total 8
> drwxr-xr-x 2 hadoop hadoop 4096 2009-06-19 10:12 .
> drwxr-xr-x 4 root root 4096 2009-06-19 10:12 ..
>
> Does hadoop.tmp.dir need to be writeable by all users running hadoop jobs?
>
>
> -stephen
>
> --
> Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
> NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
> http://di2.deri.ie http://webstar.deri.ie http://sindice.com
>
Re: Permissions needed to run RandomWriter ?
Posted by stephen mulcahy <st...@deri.org>.
Alex Loddengaard wrote:
> Have you tried to run the example job as the superuser? It seems like this
> might be an issue where hadoop.tmp.dir doesn't have the correct
> permissions. hadoop.tmp.dir and dfs.data.dir should be owned by the Unix
> user running your Hadoop daemons and be owner-writable and readable.
>
> Can you confirm this is the case? Thanks,
Hi Alex,
The RandomWriter example runs without any problems when run as the
hadoop user (i.e. the superuser / user that runs the hadoop daemons).
hadoop.tmp.dir permissions
smulcahy@hadoop01:~$ ls -la /data1/hadoop-tmp/
total 16
drwxr-xr-x 4 hadoop hadoop 4096 2009-06-19 14:01 .
drwxr-xr-x 5 root root 4096 2009-06-19 10:12 ..
drwxr-xr-x 4 hadoop hadoop 4096 2009-06-19 10:16 dfs
drwxr-xr-x 3 hadoop hadoop 4096 2009-06-19 10:49 mapred
smulcahy@hadoop01:~$ ls -la /data?/hdfs
/data1/hdfs:
total 8
drwxr-xr-x 2 hadoop hadoop 4096 2009-06-19 10:12 .
drwxr-xr-x 5 root root 4096 2009-06-19 10:12 ..
/data2/hdfs:
total 8
drwxr-xr-x 2 hadoop hadoop 4096 2009-06-19 10:12 .
drwxr-xr-x 4 root root 4096 2009-06-19 10:12 ..
Does hadoop.tmp.dir need to be writeable by all users running hadoop jobs?
-stephen
--
Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
http://di2.deri.ie http://webstar.deri.ie http://sindice.com
Re: Permissions needed to run RandomWriter ?
Posted by Alex Loddengaard <al...@cloudera.com>.
Have you tried to run the example job as the superuser? It seems like this
might be an issue where hadoop.tmp.dir doesn't have the correct
permissions. hadoop.tmp.dir and dfs.data.dir should be owned by the Unix
user running your Hadoop daemons and be owner-writable and readable.
Can you confirm this is the case? Thanks,
Alex
On Fri, Jun 26, 2009 at 1:29 PM, Mulcahy, Stephen
> <st...@deri.org> wrote:
> [Apologies for the top-post, sending this from a dodgy webmail client]
>
> Hi Alex,
>
> My hadoop-site.xml is as follows,
>
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>
> <!-- Put site-specific property overrides in this file. -->
>
> <configuration>
> <property>
> <name>mapred.job.tracker</name>
> <value>hadoop01:9001</value>
> </property>
>
> <property>
> <name>fs.default.name</name>
> <value>hdfs://hadoop01:9000</value>
> </property>
>
> <property>
> <name>hadoop.tmp.dir</name>
> <value>/data1/hadoop-tmp/</value>
> </property>
>
> <property>
> <name>dfs.data.dir</name>
> <value>/data1/hdfs,/data2/hdfs</value>
> </property>
> </configuration>
>
> Any comments welcome,
>
> -stephen
>
>
>
> -----Original Message-----
> From: Alex Loddengaard [mailto:alex@cloudera.com]
> Sent: Fri 26/06/2009 18:32
> To: core-user@hadoop.apache.org
> Subject: Re: Permissions needed to run RandomWriter ?
>
> Hey Stephen,
>
> What does your hadoop-site.xml look like? The Exception is in
> java.io.UnixFileSystem, which makes me think that you're actually creating
> and modifying directories on your local file system instead of HDFS. Make
> sure "fs.default.name" looks like "hdfs://your-namenode.domain.com:PORT".
>
> Alex
>
> On Fri, Jun 26, 2009 at 4:40 AM, stephen mulcahy
> <st...@deri.org> wrote:
>
> > Hi,
> >
> > I've just installed a new test cluster and I'm trying to give it a quick
> > smoke test with RandomWriter and Sort.
> >
> > I can run these fine with the superuser account. When I try to run them
> as
> > another user I run into problems even though I've created the output
> > directory and given permissions to the other user to write to this
> > directory. i.e.
> >
> > 1. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo
> > mkdir: org.apache.hadoop.fs.permission.AccessControlException: Permission
> > denied: user=smulcahy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
> >
> > OK - we don't have permissions anyways
> >
> > 2. hadoop@hadoop01:/$ hadoop fs -mkdir /foo
> >
> > OK
> >
> > 3. hadoop fs -chown -R smulcahy /foo
> >
> > OK
> >
> > 4. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo/test
> >
> > OK
> >
> > 5. smulcahy@hadoop01:~$ hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar
> > randomwriter /foo
> > java.io.IOException: Permission denied
> > at java.io.UnixFileSystem.createFileExclusively(Native Method)
> > at java.io.File.checkAndCreate(File.java:1704)
> > at java.io.File.createTempFile(File.java:1793)
> > at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
> > at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> > at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> >
> > Any suggestions on why step 5 is failing even though I have write
> > permissions to /foo - do I need permissions on some other directory also
> or
> > ... ?
> >
> > Thanks,
> >
> > -stephen
> >
> > --
> > Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
> > NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
> > http://di2.deri.ie http://webstar.deri.ie http://sindice.com
> >
>
>
RE: Permissions needed to run RandomWriter ?
Posted by "Mulcahy, Stephen" <st...@deri.org>.
[Apologies for the top-post, sending this from a dodgy webmail client]
Hi Alex,
My hadoop-site.xml is as follows,
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>mapred.job.tracker</name>
<value>hadoop01:9001</value>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://hadoop01:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/data1/hadoop-tmp/</value>
</property>
<property>
<name>dfs.data.dir</name>
<value>/data1/hdfs,/data2/hdfs</value>
</property>
</configuration>
Any comments welcome,
-stephen
-----Original Message-----
From: Alex Loddengaard [mailto:alex@cloudera.com]
Sent: Fri 26/06/2009 18:32
To: core-user@hadoop.apache.org
Subject: Re: Permissions needed to run RandomWriter ?
Hey Stephen,
What does your hadoop-site.xml look like? The Exception is in
java.io.UnixFileSystem, which makes me think that you're actually creating
and modifying directories on your local file system instead of HDFS. Make
sure "fs.default.name" looks like "hdfs://your-namenode.domain.com:PORT".
Alex
On Fri, Jun 26, 2009 at 4:40 AM, stephen mulcahy
<st...@deri.org> wrote:
> Hi,
>
> I've just installed a new test cluster and I'm trying to give it a quick
> smoke test with RandomWriter and Sort.
>
> I can run these fine with the superuser account. When I try to run them as
> another user I run into problems even though I've created the output
> directory and given permissions to the other user to write to this
> directory. i.e.
>
> 1. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo
> mkdir: org.apache.hadoop.fs.permission.AccessControlException: Permission
> denied: user=smulcahy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
>
> OK - we don't have permissions anyways
>
> 2. hadoop@hadoop01:/$ hadoop fs -mkdir /foo
>
> OK
>
> 3. hadoop fs -chown -R smulcahy /foo
>
> OK
>
> 4. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo/test
>
> OK
>
> 5. smulcahy@hadoop01:~$ hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar
> randomwriter /foo
> java.io.IOException: Permission denied
> at java.io.UnixFileSystem.createFileExclusively(Native Method)
> at java.io.File.checkAndCreate(File.java:1704)
> at java.io.File.createTempFile(File.java:1793)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
> Any suggestions on why step 5 is failing even though I have write
> permissions to /foo - do I need permissions on some other directory also or
> ... ?
>
> Thanks,
>
> -stephen
>
> --
> Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
> NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
> http://di2.deri.ie http://webstar.deri.ie http://sindice.com
>
Re: Permissions needed to run RandomWriter ?
Posted by Alex Loddengaard <al...@cloudera.com>.
Hey Stephen,
What does your hadoop-site.xml look like? The Exception is in
java.io.UnixFileSystem, which makes me think that you're actually creating
and modifying directories on your local file system instead of HDFS. Make
sure "fs.default.name" looks like "hdfs://your-namenode.domain.com:PORT".
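Also worth checking: the stack trace goes through RunJar.main and java.io.File.createTempFile, so the failing write appears to be the job jar being staged on the *local* disk before anything reaches HDFS, which means local permissions matter too. A quick sketch of that check (HADOOP_TMP is a placeholder for your configured hadoop.tmp.dir; /tmp is used here only so the sketch runs):

```shell
# Check that the submitting user can write to the local temp location
# used when the job jar is staged. Substitute your real hadoop.tmp.dir,
# e.g. /data1/hadoop-tmp on the cluster in this thread.
HADOOP_TMP=${HADOOP_TMP:-/tmp}
if [ -w "$HADOOP_TMP" ] && [ -x "$HADOOP_TMP" ]; then
  echo "ok: $HADOOP_TMP is writable"
else
  echo "problem: $HADOOP_TMP is not writable by $(id -un)"
fi
```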
Alex
On Fri, Jun 26, 2009 at 4:40 AM, stephen mulcahy
<st...@deri.org> wrote:
> Hi,
>
> I've just installed a new test cluster and I'm trying to give it a quick
> smoke test with RandomWriter and Sort.
>
> I can run these fine with the superuser account. When I try to run them as
> another user I run into problems even though I've created the output
> directory and given permissions to the other user to write to this
> directory. i.e.
>
> 1. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo
> mkdir: org.apache.hadoop.fs.permission.AccessControlException: Permission
> denied: user=smulcahy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
>
> OK - we don't have permissions anyways
>
> 2. hadoop@hadoop01:/$ hadoop fs -mkdir /foo
>
> OK
>
> 3. hadoop fs -chown -R smulcahy /foo
>
> OK
>
> 4. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo/test
>
> OK
>
> 5. smulcahy@hadoop01:~$ hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar
> randomwriter /foo
> java.io.IOException: Permission denied
> at java.io.UnixFileSystem.createFileExclusively(Native Method)
> at java.io.File.checkAndCreate(File.java:1704)
> at java.io.File.createTempFile(File.java:1793)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
> Any suggestions on why step 5 is failing even though I have write
> permissions to /foo - do I need permissions on some other directory also or
> ... ?
>
> Thanks,
>
> -stephen
>
> --
> Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
> NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
> http://di2.deri.ie http://webstar.deri.ie http://sindice.com
>