Posted to hdfs-user@hadoop.apache.org by David Greer <da...@davidgreer.ca> on 2009/10/12 18:52:18 UTC

Security error running hadoop with MaxTemperature example

Hi Everyone,

I'm trying to get my first MapReduce example to work. Background:

RedHat ES 5.2
Sun Java 1.6.0_16-b01
Hadoop 0.20.1+133 (Cloudera distro)

I've started the hadoop daemons, created an HDFS locally, and checked that
basic operations in HDFS appear to work.
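
(For reference, "started the daemons" and "created an HDFS locally" here mean
roughly the standard pseudo-distributed bring-up; the exact commands differ a
bit for the Cloudera packages, so treat this as a sketch. Note that whichever
account starts the namenode becomes the HDFS superuser.)

hadoop namenode -format    # format the local HDFS
start-dfs.sh               # namenode + datanode
start-mapred.sh            # jobtracker + tasktracker
hadoop fs -mkdir /user/david
hadoop fs -put docnotes.txt /user/david/docnotes.txt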

I'm trying to get the first and most basic example from Tom White's book
"Hadoop: The Definitive Guide" to work. I'm running Hadoop in
pseudo-distributed mode under an ordinary user account. Basic hadoop commands
appear to work:

hadoop fs -ls
Found 1 items
-rw-r--r--   1 david supergroup      28081 2009-10-06 23:27 /user/david/docnotes.txt

I compiled the examples in chapter 2 "by hand" (why is a separate thread). I
then try to see whether I can invoke MaxTemperature with non-existent files
(at this point I'm just checking that everything loads and initializes):

export HADOOP_CLASSPATH="./"
hadoop MaxTemperature foo bar
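
For reference, the driver I compiled is essentially the chapter-2 one from the
book. The sketch below is from memory, so class and method details may not
match the printed code exactly, and it assumes the book's MaxTemperatureMapper
and MaxTemperatureReducer classes are compiled alongside it:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MaxTemperature {
    public static void main(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.println("Usage: MaxTemperature <input path> <output path>");
            System.exit(-1);
        }
        Job job = new Job();                  // picks up the cluster config from the classpath
        job.setJarByClass(MaxTemperature.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        job.setMapperClass(MaxTemperatureMapper.class);    // from the book's chapter 2
        job.setReducerClass(MaxTemperatureReducer.class);   // from the book's chapter 2
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // waitForCompletion() submits the job. The submission client creates the
        // job's directories in HDFS (under mapred.system.dir, which by default
        // lives under hadoop.tmp.dir) apparently before the input paths are
        // checked, which would explain why a permission error shows up below
        // even though "foo" does not exist.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}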

I get the error message:

Exception in thread "main"
org.apache.hadoop.security.AccessControlException:
org.apache.hadoop.security.AccessControlException: Permission denied:
user=david, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x

There's a long stack trace the start of which looks like:

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:909)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:262)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1162)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:306)
        .........

I'm out of ideas at this point. Any suggestions for where I should look to
solve this?

Cheers,

David

Re: Security error running hadoop with MaxTemperature example

Posted by Jason Venner <ja...@gmail.com>.
My first guess is that, for whatever reason, the user you are running the job
as cannot write to the directory specified by hadoop.tmp.dir in your
configuration. This directory is usually in the system temporary area, so it
normally isn't an issue.
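
Since the inode in your error is owned by root:supergroup, it looks like the
daemons (or at least /tmp in HDFS) were created as root, so david has no write
access there. As a rough check and workaround for a single-user
pseudo-distributed setup, something like the following (run the chmod as
whatever user the namenode runs as, i.e. the HDFS superuser, which appears to
be root here):

hadoop fs -ls /                  # see who owns /tmp in HDFS
hadoop fs -chmod -R 777 /tmp     # blunt, but fine for a single-user test box

Alternatively, point hadoop.tmp.dir (and hence the job system directory) at a
location your own user can write.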

-- 
Pro Hadoop, a book to guide you from beginner to hadoop mastery,
http://www.amazon.com/dp/1430219424?tag=jewlerymall
www.prohadoopbook.com a community for Hadoop Professionals