Posted to common-user@hadoop.apache.org by Thomas Weise <th...@yahoo-inc.com> on 2011/08/31 03:08:11 UTC

Starting datanode in secure mode

I'm configuring a local Hadoop cluster in secure mode for development/experimental purposes on Ubuntu 11.04 with the hadoop-0.20.203.0 distribution from an Apache mirror.

I have the basic Kerberos setup working: I can start the namenode in secure mode and connect to it with hadoop fs -ls.
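
For reference, the namenode check looks roughly like this (the principal and keytab path are just the ones from my local VM):

$ kinit -k -t /opt/hadoop/conf/nn.keytab hduser/hdev-vm@HADOOP.LOCALDOMAIN
$ hadoop fs -ls /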

I'm not able to get the datanode to start in secure mode - what do I have to do to make that happen?

The error I get:

11/08/30 18:01:57 INFO security.UserGroupInformation: Login successful for user hduser/hdev-vm@HADOOP.LOCALDOMAIN using keytab file /opt/hadoop/conf/nn.keytab
11/08/30 18:01:57 ERROR datanode.DataNode: java.lang.RuntimeException: Cannot start secure cluster without privileged resources.
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:293)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:268)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1480)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1419)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1437)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1563)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1573)

11/08/30 18:01:57 INFO datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at hdev-vm/127.0.1.1

I have not configured the system to use port numbers that require root (yet). All I want is for the datanode to run in secure mode with Kerberos authentication.

Any pointers would be greatly appreciated!

Thomas


Re: Starting datanode in secure mode

Posted by Thomas Weise <th...@yahoo-inc.com>.
Thanks Ravi. This has brought my local Hadoop cluster to life!

The two things I was missing:

1) The datanode has to use privileged ports:

<!-- secure setup requires privileged ports -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:1004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:1006</value>
</property>

2) Implied by 1): sudo is required to launch the datanode (see the launch sketch below)
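
For the record, the launch sequence that works for me now looks roughly like this (hduser is just my local account; the datanode binds the privileged ports as root and then drops to that user):

$ sudo bash
# in the root shell: name the unprivileged account the datanode should run as,
# then launch it; it binds ports 1004/1006 as root and drops privileges
export HADOOP_SECURE_DN_USER=hduser
hadoop-daemon.sh start datanode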

Clearly, this is geared towards production systems. For development, the ability to run with Kerberos but without the need for privileged resources would be desirable.


On Aug 30, 2011, at 9:00 PM, Ravi Prakash wrote:

> In short: you MUST use privileged resources.
> 
> In more detail:
> 
> Here's what I did to set up a secure single-node cluster. I'm sure there are
> other ways, but this is how I did it.
> 
>    1.    Install krb5-server
>    2.    Set up the Kerberos configuration (files attached):
>          /var/kerberos/krb5kdc/kdc.conf and /etc/krb5.conf
>          http://yahoo.github.com/hadoop-common/installing.html
>    3.    To clean up everything:
>          http://mailman.mit.edu/pipermail/kerberos/2003-June/003312.html
>    4.    Create the Kerberos database: $ sudo kdb5_util create -s
>    5.    Start Kerberos: $ sudo /etc/rc.d/init.d/kadmin start and
>          $ sudo /etc/rc.d/init.d/krb5kdc start
>    6.    Create the principal raviprak/localhost.localdomain@localdomain
>          http://web.mit.edu/kerberos/krb5-1.5/krb5-1.5.4/doc/krb5-admin/Adding-or-Modifying-Principals.html
>    7.    Create the keytab file using “xst -k /home/raviprak/raviprak.keytab
>          raviprak/localhost.localdomain@localdomain”
>    8.    Set up hdfs-site.xml and core-site.xml (files attached)
>    9.    sudo hostname localhost.localdomain
>    10.   hadoop-daemon.sh start namenode
>    11.   sudo bash, then export HADOOP_SECURE_DN_USER=raviprak, then
>          hadoop-daemon.sh start datanode
> 
> 
> 
> CORE-SITE.XML
> ========================================
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> 
> <!-- Put site-specific property overrides in this file. -->
> 
> <configuration>
>    <property>
>        <name>fs.default.name</name>
>        <value>hdfs://localhost:9001</value>
>    </property>
>    <property>
>        <name>hadoop.security.authorization</name>
>        <value>true</value>
>    </property>
>    <property>
>        <name>hadoop.security.authentication</name>
>        <value>kerberos</value>
>    </property>
>    <property>
>        <name>dfs.namenode.kerberos.principal</name>
>        <value>raviprak/localhost.localdomain</value>
>    </property>
>    <property>
>        <name>dfs.datanode.kerberos.principal</name>
>        <value>raviprak/localhost.localdomain</value>
>    </property>
>    <property>
>        <name>dfs.secondary.namenode.kerberos.principal</name>
>        <value>raviprak/localhost.localdomain</value>
>    </property>
> </configuration>
> 
> =========================================================
> 
> 
> 
> HDFS-SITE.XML
> =========================================================
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> 
> <!-- Put site-specific property overrides in this file. -->
> 
> <configuration>
>    <property>
>        <name>dfs.replication</name>
>        <value>1</value>
>    </property>
> 
>    <property>
>        <name>dfs.name.dir.restore</name>
>        <value>false</value>
>    </property>
> 
>    <property>
>        <name>dfs.namenode.checkpoint.period</name>
>        <value>10</value>
>    </property>
> 
>    <property>
>        <name>dfs.namenode.keytab.file</name>
>        <value>/home/raviprak/raviprak.keytab</value>
>    </property>
> 
>    <property>
>        <name>dfs.secondary.namenode.keytab.file</name>
>        <value>/home/raviprak/raviprak.keytab</value>
>    </property>
> 
>    <property>
>        <name>dfs.datanode.keytab.file</name>
>        <value>/home/raviprak/raviprak.keytab</value>
>    </property>
> 
>    <property>
>        <name>dfs.datanode.address</name>
>        <value>0.0.0.0:1004</value>
>    </property>
> 
>    <property>
>        <name>dfs.datanode.http.address</name>
>        <value>0.0.0.0:1006</value>
>    </property>
> 
>    <property>
>        <name>dfs.namenode.kerberos.principal</name>
>        <value>raviprak/localhost.localdomain@localdomain</value>
>    </property>
> 
>    <property>
>        <name>dfs.secondary.namenode.kerberos.principal</name>
>        <value>raviprak/localhost.localdomain@localdomain</value>
>    </property>
> 
>    <property>
>        <name>dfs.datanode.kerberos.principal</name>
>        <value>raviprak/localhost.localdomain@localdomain</value>
>    </property>
> 
>    <property>
>        <name>dfs.namenode.kerberos.https.principal</name>
>        <value>raviprak/localhost.localdomain@localdomain</value>
>    </property>
> 
>    <property>
>        <name>dfs.secondary.namenode.kerberos.https.principal</name>
>        <value>raviprak/localhost.localdomain@localdomain</value>
>    </property>
> 
>    <property>
>        <name>dfs.datanode.kerberos.https.principal</name>
>        <value>raviprak/localhost.localdomain@localdomain</value>
>    </property>
> 
> </configuration>
> =========================================================
> 
> 
> On Tue, Aug 30, 2011 at 8:08 PM, Thomas Weise <th...@yahoo-inc.com> wrote:
> 
>> I'm configuring a local Hadoop cluster in secure mode for
>> development/experimental purposes on Ubuntu 11.04 with the hadoop-0.20.203.0
>> distribution from an Apache mirror.
>> 
>> I have the basic Kerberos setup working: I can start the namenode in secure
>> mode and connect to it with hadoop fs -ls.
>> 
>> I'm not able to get the datanode to start in secure mode - what do I have to
>> do to make that happen?
>> 
>> The error I get:
>> 
>> 11/08/30 18:01:57 INFO security.UserGroupInformation: Login successful for
>> user hduser/hdev-vm@HADOOP.LOCALDOMAIN using keytab file
>> /opt/hadoop/conf/nn.keytab
>> 11/08/30 18:01:57 ERROR datanode.DataNode: java.lang.RuntimeException:
>> Cannot start secure cluster without privileged resources.
>>       at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:293)
>>       at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:268)
>>       at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1480)
>>       at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1419)
>>       at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1437)
>>       at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1563)
>>       at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1573)
>> 
>> 11/08/30 18:01:57 INFO datanode.DataNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down DataNode at hdev-vm/127.0.1.1
>> 
>> I have not configured the system to use port numbers that require root
>> (yet). All I want is for the datanode to run in secure mode with Kerberos
>> authentication.
>> 
>> Any pointers would be greatly appreciated!
>> 
>> Thomas
>> 
>> 


Re: Starting datanode in secure mode

Posted by Ravi Prakash <ra...@gmail.com>.
In short: you MUST use privileged resources.

In more detail:

Here's what I did to set up a secure single-node cluster. I'm sure there are
other ways, but this is how I did it.

    1.    Install krb5-server
    2.    Set up the Kerberos configuration (files attached):
          /var/kerberos/krb5kdc/kdc.conf and /etc/krb5.conf
          http://yahoo.github.com/hadoop-common/installing.html
    3.    To clean up everything:
          http://mailman.mit.edu/pipermail/kerberos/2003-June/003312.html
    4.    Create the Kerberos database: $ sudo kdb5_util create -s
    5.    Start Kerberos: $ sudo /etc/rc.d/init.d/kadmin start and
          $ sudo /etc/rc.d/init.d/krb5kdc start
    6.    Create the principal raviprak/localhost.localdomain@localdomain
          http://web.mit.edu/kerberos/krb5-1.5/krb5-1.5.4/doc/krb5-admin/Adding-or-Modifying-Principals.html
    7.    Create the keytab file using “xst -k /home/raviprak/raviprak.keytab
          raviprak/localhost.localdomain@localdomain”
    8.    Set up hdfs-site.xml and core-site.xml (files attached)
    9.    sudo hostname localhost.localdomain
    10.   hadoop-daemon.sh start namenode
    11.   sudo bash, then export HADOOP_SECURE_DN_USER=raviprak, then
          hadoop-daemon.sh start datanode (commands condensed below)
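
Condensed into commands, steps 4-7 and 10-11 look roughly like this on my box (the principal, realm, and paths are from my setup; adjust to yours):

$ sudo kdb5_util create -s
$ sudo /etc/rc.d/init.d/kadmin start
$ sudo /etc/rc.d/init.d/krb5kdc start
$ sudo kadmin.local
kadmin.local:  addprinc raviprak/localhost.localdomain@localdomain
kadmin.local:  xst -k /home/raviprak/raviprak.keytab raviprak/localhost.localdomain@localdomain
kadmin.local:  quit
$ hadoop-daemon.sh start namenode
$ sudo bash
# as root: name the unprivileged user the datanode should drop to after it
# binds the privileged ports, then start it
export HADOOP_SECURE_DN_USER=raviprak
hadoop-daemon.sh start datanode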



CORE-SITE.XML
========================================
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9001</value>
    </property>
    <property>
        <name>hadoop.security.authorization</name>
        <value>true</value>
    </property>
    <property>
        <name>hadoop.security.authentication</name>
        <value>kerberos</value>
    </property>
    <property>
        <name>dfs.namenode.kerberos.principal</name>
        <value>raviprak/localhost.localdomain</value>
    </property>
    <property>
        <name>dfs.datanode.kerberos.principal</name>
        <value>raviprak/localhost.localdomain</value>
    </property>
    <property>
        <name>dfs.secondary.namenode.kerberos.principal</name>
        <value>raviprak/localhost.localdomain</value>
    </property>
</configuration>

=========================================================



HDFS-SITE.XML
=========================================================
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>

    <property>
        <name>dfs.name.dir.restore</name>
        <value>false</value>
    </property>

    <property>
        <name>dfs.namenode.checkpoint.period</name>
        <value>10</value>
    </property>

    <property>
        <name>dfs.namenode.keytab.file</name>
        <value>/home/raviprak/raviprak.keytab</value>
    </property>

    <property>
        <name>dfs.secondary.namenode.keytab.file</name>
        <value>/home/raviprak/raviprak.keytab</value>
    </property>

    <property>
        <name>dfs.datanode.keytab.file</name>
        <value>/home/raviprak/raviprak.keytab</value>
    </property>

    <property>
        <name>dfs.datanode.address</name>
        <value>0.0.0.0:1004</value>
    </property>

    <property>
        <name>dfs.datanode.http.address</name>
        <value>0.0.0.0:1006</value>
    </property>

    <property>
        <name>dfs.namenode.kerberos.principal</name>
        <value>raviprak/localhost.localdomain@localdomain</value>
    </property>

    <property>
        <name>dfs.secondary.namenode.kerberos.principal</name>
        <value>raviprak/localhost.localdomain@localdomain</value>
    </property>

    <property>
        <name>dfs.datanode.kerberos.principal</name>
        <value>raviprak/localhost.localdomain@localdomain</value>
    </property>

    <property>
        <name>dfs.namenode.kerberos.https.principal</name>
        <value>raviprak/localhost.localdomain@localdomain</value>
    </property>

    <property>
        <name>dfs.secondary.namenode.kerberos.https.principal</name>
        <value>raviprak/localhost.localdomain@localdomain</value>
    </property>

    <property>
        <name>dfs.datanode.kerberos.https.principal</name>
        <value>raviprak/localhost.localdomain@localdomain</value>
    </property>

</configuration>
=========================================================


On Tue, Aug 30, 2011 at 8:08 PM, Thomas Weise <th...@yahoo-inc.com> wrote:

> I'm configuring a local Hadoop cluster in secure mode for
> development/experimental purposes on Ubuntu 11.04 with the hadoop-0.20.203.0
> distribution from an Apache mirror.
>
> I have the basic Kerberos setup working: I can start the namenode in secure
> mode and connect to it with hadoop fs -ls.
>
> I'm not able to get the datanode to start in secure mode - what do I have to
> do to make that happen?
>
> The error I get:
>
> 11/08/30 18:01:57 INFO security.UserGroupInformation: Login successful for
> user hduser/hdev-vm@HADOOP.LOCALDOMAIN using keytab file
> /opt/hadoop/conf/nn.keytab
> 11/08/30 18:01:57 ERROR datanode.DataNode: java.lang.RuntimeException:
> Cannot start secure cluster without privileged resources.
>        at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:293)
>        at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:268)
>        at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1480)
>        at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1419)
>        at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1437)
>        at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1563)
>        at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1573)
>
> 11/08/30 18:01:57 INFO datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at hdev-vm/127.0.1.1
>
> I have not configured the system to use port numbers that require root
> (yet). All I want is for the datanode to run in secure mode with Kerberos
> authentication.
>
> Any pointers would be greatly appreciated!
>
> Thomas
>
>