Posted to hdfs-user@hadoop.apache.org by shixing <pa...@gmail.com> on 2012/05/10 16:50:51 UTC

How to start up datanode with kerberos?

Hi, all:
    Now I want to set up security for HBase with Kerberos.
    As I understand it, after 0.20.2 HBase's UGI is based on Hadoop's
UserGroupInformation, without the "hadoop.job.ugi" parameter. So when I use cdh3u3,
the UGI can be generated with either of two authentication methods: simple or kerberos.
    First I need to set up HDFS with Kerberos. After setting up the KDC and
configuring the Kerberos accounts, I cannot start the datanode; it fails with this
message:

    12/05/10 22:41:10 INFO security.UserGroupInformation: Login successful
for user shubao.sx/dwbasis130001.sqa.cm4.tbsite.net@TBSITE.NET using keytab
file /home/shubao.sx/hadoop-0.20.2-cdh3u3/conf/kadm5.keytab
12/05/10 22:41:10 ERROR datanode.DataNode: java.lang.RuntimeException:
Cannot start secure cluster without privileged resources. In a secure
cluster, the DataNode must be started from within jsvc. If using Cloudera
packages, please install the hadoop-0.20-sbin package.

For development purposes ONLY you may override this check by setting
dfs.datanode.require.secure.ports to false. *** THIS WILL OPEN A SECURITY
HOLE AND MUST NOT BE USED FOR A REAL CLUSTER ***.
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:331)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:305)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1606)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1546)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1564)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1690)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1707)
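
As the error text itself says, the check can be bypassed for development only. A
sketch of what that hdfs-site.xml entry would look like (the property name is taken
from the error message above; as it warns, never use this on a real cluster):

```xml
<!-- DEVELOPMENT ONLY: disables the privileged-resource check for the
     DataNode. The error message above warns that this opens a security
     hole and must not be used on a real cluster. -->
<property>
  <name>dfs.datanode.require.secure.ports</name>
  <value>false</value>
</property>
```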

Then I installed the jsvc command and used it to start the datanode as
follows (maybe this is wrong?); there is no log output and no error:
    /home/shubao.sx/commons-daemon-1.0.10/jsvc -Dproc_datanode -Xmx1000m
-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote
-Dhadoop.log.dir=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../logs
-Dhadoop.log.file=hadoop.log
-Dhadoop.home.dir=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/..
-Dhadoop.id.str=shubao.sx -Dhadoop.root.logger=INFO,console
-Djava.library.path=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/native/Linux-amd64-64
-Dhadoop.policy.file=hadoop-policy.xml -classpath
/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../conf:/home/shubao.sx/java6_64/lib/tools.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../hadoop-core-0.20.2-cdh3u3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/ant-contrib-1.0b3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/aspectjrt-1.6.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/aspectjtools-1.6.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-cli-1.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-codec-1.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-daemon-1.0.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-el-1.0.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-httpclient-3.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-lang-2.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-logging-1.0.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-logging-api-1.0.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-net-1.4.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/core-3.1.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/guava-r09-jarjar.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/hadoop-fairscheduler-0.20.2-cdh3u3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/hsqldb-1.8.0.10.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jackson-core-asl-1.5.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jackson-mapper-asl-1.5.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jasper-compiler-5.5.12.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jasper-runtime-5.5.12.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jets3t-0.6.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-util-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsch-0.1.42.jar:/home/shubao.sx/hado
op-0.20.2-cdh3u3/bin/../lib/junit-4.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/kfs-0.2.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/log4j-1.2.15.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/mockito-all-1.8.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/oro-2.0.8.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/servlet-api-2.5-20081211.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/servlet-api-2.5-6.1.14.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/slf4j-api-1.4.3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/slf4j-log4j12-1.4.3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/xmlenc-0.52.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsp-2.1/jsp-2.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsp-2.1/jsp-api-2.1.jar:/home/shubao.sx/commons-daemon-1.0.10/commons-daemon-1.0.10.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/conf:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2-test.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/zookeeper-3.3.3.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/guava-r06.jar:/home/shubao.sx/commons-daemon-1.0.10/commons-daemon-1.0.10.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/conf:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2-test.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/zookeeper-3.3.3.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/guava-r06.jar:
org.apache.hadoop.hdfs.server.datanode.DataNode
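
For comparison, my reading of how the stock secure-startup path (Hadoop 1.x /
CDH3) drives jsvc: it runs as root, drops privileges to a configured user via
`-user`, and launches the wrapper class `SecureDataNodeStarter` rather than
`DataNode` directly — the wrapper binds the privileged ports first and hands the
resulting SecureResources to secureMain(). All paths, the user name, and the jsvc
location below are illustrative assumptions; the script only prints the assembled
command so its shape can be inspected:

```shell
#!/bin/sh
# Sketch only. Assumptions: HADOOP_HOME, JSVC, and the hdfs user are
# illustrative placeholders, not values from this cluster.
# Key differences from launching DataNode directly:
#   1. jsvc runs as root and drops to $HADOOP_SECURE_DN_USER via -user;
#   2. the main class is SecureDataNodeStarter, which binds the
#      privileged ports and passes SecureResources into secureMain().
HADOOP_HOME=${HADOOP_HOME:-/usr/lib/hadoop-0.20}
HADOOP_SECURE_DN_USER=${HADOOP_SECURE_DN_USER:-hdfs}
JSVC=${JSVC:-$HADOOP_HOME/sbin/Linux-amd64-64/jsvc}

CMD="$JSVC \
  -user $HADOOP_SECURE_DN_USER \
  -outfile $HADOOP_HOME/logs/jsvc.out \
  -errfile $HADOOP_HOME/logs/jsvc.err \
  -pidfile /var/run/hadoop/hadoop-datanode.pid \
  -cp $HADOOP_HOME/conf:$HADOOP_HOME/hadoop-core.jar \
  org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter"

# Print instead of executing, so the command shape is visible.
echo "$CMD"
```

The command above would have to be run as root; jsvc needs root to bind the
ports below 1024 before it drops to the unprivileged user.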


I also looked at the source code of both cdh3u3 and hadoop-1.0.0; the DataNode's
main() function is:
   public static void main(String args[]) {
    secureMain(args, null);
  }
  Here the SecureResources argument is null, and startDataNode() contains this check:
    void startDataNode(Configuration conf,
                     AbstractList<File> dataDirs, SecureResources resources
                     ) throws IOException {
    if(UserGroupInformation.isSecurityEnabled() && resources == null)
      throw new RuntimeException("Cannot start secure cluster without " +
                "privileged resources.");
    .........

    The resources argument is always null on this path, so when security is
enabled via Kerberos it will always throw the RuntimeException.
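
For reference, the configuration that the secure startup path expects — privileged
ports in hdfs-site.xml — looks roughly like this. The port numbers are the
conventional choices from the CDH3 Security Guide and are given as an illustration,
not taken from this cluster:

```xml
<!-- hdfs-site.xml: with security on, the DataNode must bind
     privileged (<1024) ports; 1004/1006 are the conventional picks. -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:1004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:1006</value>
</property>
```

In addition, hadoop-env.sh (or /etc/default/hadoop-0.20, depending on packaging)
would set something like `export HADOOP_SECURE_DN_USER=hdfs` so the startup script
takes the jsvc path at all — the user name here is illustrative.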

    Am I missing something?
-- 
Best wishes!
My Friend~

Re: How to start up datanode with kerberos?

Posted by Allan Yan <ha...@gmail.com>.
Did you start the datanode as root? Running as root is required for binding the
privileged ports (<1024).


On Sun, May 13, 2012 at 6:23 AM, shixing <pa...@gmail.com> wrote:

> I downloaded the sbin rpm package and found that it contains 2 programs:
>
> jsvc and task-controller. I also ran the start command, replacing java with jsvc.
>
> Maybe I am using jsvc wrongly? Are there special settings in CDH's
> hadoop package for jsvc?
>
> On Fri, May 11, 2012 at 1:57 PM, Harsh J <ha...@cloudera.com> wrote:
>
> > (For the CDH question, am moving this to cdh-user@cloudera.org lists,
> > bcc'd hdfs-user@apache)
> >
> > As the message goes, you're missing the sbin package in your cdh3u3
> > install at least. I have secured DN working just fine here after
> > following https://ccp.cloudera.com/display/CDHDOC/CDH3+Security+Guide
> >
> > On Thu, May 10, 2012 at 8:20 PM, shixing <pa...@gmail.com> wrote:
> > > Hi,all:
> > >     Now I want to setup the security with hbase by kerberos.
> > >     As I know, the hbase's ugi is based on the hadoop
> > UserGroupInformation
> > > without parameter "hadoop.job.ugi" after 0.20.2. So when I use the
> > cdh3u3,
> > > the ugi can be generated by two authentication : simple or kerberos.
> > >     Firstly I should setup the hdfs based on kerberos. After I setup
> KDC,
> > > and configuration account for kerberos, I can't start datanode for this
> > > message:
> > >
> > >     12/05/10 22:41:10 INFO security.UserGroupInformation: Login
> > successful
> > > for user shubao.sx/dwbasis130001.sqa.cm4.tbsite.net@TBSITE.NET using
> > keytab
> > > file /home/shubao.sx/hadoop-0.20.2-cdh3u3/conf/kadm5.keytab
> > > 12/05/10 22:41:10 ERROR datanode.DataNode: java.lang.RuntimeException:
> > > Cannot start secure cluster without privileged resources. In a secure
> > > cluster, the DataNode must be started from within jsvc. If using
> Cloudera
> > > packages, please install the hadoop-0.20-sbin package.
> > >
> > > For development purposes ONLY you may override this check by setting
> > > dfs.datanode.require.secure.ports to false. *** THIS WILL OPEN A
> SECURITY
> > > HOLE AND MUST NOT BE USED FOR A REAL CLUSTER ***.
> > >         at
> > >
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:331)
> > >         at
> > >
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:305)
> > >         at
> > >
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1606)
> > >         at
> > >
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1546)
> > >         at
> > >
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1564)
> > >         at
> > >
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1690)
> > >         at
> > >
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1707)
> > >
> > > And I install the jsvc command to start the datanode as followed(May be
> > it
> > > is wrong?), there is no log or error:
> > >     /home/shubao.sx/commons-daemon-1.0.10/jsvc -Dproc_datanode
> -Xmx1000m
> > > -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote
> > > -Dhadoop.log.dir=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../logs
> > > -Dhadoop.log.file=hadoop.log
> > > -Dhadoop.home.dir=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/..
> > > -Dhadoop.id.str=shubao.sx -Dhadoop.root.logger=INFO,console
> > >
> >
> -Djava.library.path=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/native/Linux-amd64-64
> > > -Dhadoop.policy.file=hadoop-policy.xml -classpath
> > >
> >
> /home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../conf:/home/shubao.sx/java6_64/lib/tools.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../hadoop-core-0.20.2-cdh3u3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/ant-contrib-1.0b3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/aspectjrt-1.6.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/aspectjtools-1.6.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-cli-1.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-codec-1.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-daemon-1.0.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-el-1.0.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-httpclient-3.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-lang-2.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-logging-1.0.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-logging-api-1.0.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-net-1.4.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/core-3.1.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/guava-r09-jarjar.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/hadoop-fairscheduler-0.20.2-cdh3u3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/hsqldb-1.8.0.10.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jackson-core-asl-1.5.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jackson-mapper-asl-1.5.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jasper-compiler-5.5.12.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jasper-runtime-5.5.12.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jets3t-0.6.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-util-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsch-0.1.42.jar:/home/shubao.sx/ha
doop-0.20.2-cdh3u3/bin/../lib/junit-4.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/kfs-0.2.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/log4j-1.2.15.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/mockito-all-1.8.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/oro-2.0.8.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/servlet-api-2.5-20081211.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/servlet-api-2.5-6.1.14.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/slf4j-api-1.4.3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/slf4j-log4j12-1.4.3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/xmlenc-0.52.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsp-2.1/jsp-2.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsp-2.1/jsp-api-2.1.jar:/home/shubao.sx/commons-daemon-1.0.10/commons-daemon-1.0.10.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/conf:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2-test.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/zookeeper-3.3.3.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/guava-r06.jar:/home/shubao.sx/commons-daemon-1.0.10/commons-daemon-1.0.10.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/conf:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2-test.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/zookeeper-3.3.3.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/guava-r06.jar:
> > > org.apache.hadoop.hdfs.server.datanode.DataNode
> > >
> > >
> > > And I see the source code both cdh3u3 and hadoop-1.0.0, that the
> > DataNode's
> > > main() function:
> > >    public static void main(String args[]) {
> > >     secureMain(args, null);
> > >   }
> > >   Here the SecureResources is null, and there is such code:
> > >     void startDataNode(Configuration conf,
> > >                      AbstractList<File> dataDirs, SecureResources
> > resources
> > >                      ) throws IOException {
> > >     if(UserGroupInformation.isSecurityEnabled() && resources == null)
> > >       throw new RuntimeException("Cannot start secure cluster without
> " +
> > >                 "privileged resources.");
> > >     .........
> > >
> > >     The resources is always null, and if we setup security by kerberos,
> > it
> > > will always  throw new RuntimeException
> > >
> > >     Am I missing something?
> > > --
> > > Best wishes!
> > > My Friend~
> >
> >
> >
> > --
> > Harsh J
> >
>
>
>
> --
> Best wishes!
> My Friend~
>

Re: How to start up datanode with kerberos?

Posted by shixing <pa...@gmail.com>.
I downloaded the sbin rpm package and found that it contains 2 programs:

jsvc and task-controller. I also ran the start command, replacing java with jsvc.

Maybe I am using jsvc wrongly? Are there special settings in CDH's
hadoop package for jsvc?

On Fri, May 11, 2012 at 1:57 PM, Harsh J <ha...@cloudera.com> wrote:

> (For the CDH question, am moving this to cdh-user@cloudera.org lists,
> bcc'd hdfs-user@apache)
>
> As the message goes, you're missing the sbin package in your cdh3u3
> install at least. I have secured DN working just fine here after
> following https://ccp.cloudera.com/display/CDHDOC/CDH3+Security+Guide
>
> On Thu, May 10, 2012 at 8:20 PM, shixing <pa...@gmail.com> wrote:
> > Hi,all:
> >     Now I want to setup the security with hbase by kerberos.
> >     As I know, the hbase's ugi is based on the hadoop
> UserGroupInformation
> > without parameter "hadoop.job.ugi" after 0.20.2. So when I use the
> cdh3u3,
> > the ugi can be generated by two authentication : simple or kerberos.
> >     Firstly I should setup the hdfs based on kerberos. After I setup KDC,
> > and configuration account for kerberos, I can't start datanode for this
> > message:
> >
> >     12/05/10 22:41:10 INFO security.UserGroupInformation: Login
> successful
> > for user shubao.sx/dwbasis130001.sqa.cm4.tbsite.net@TBSITE.NET using
> keytab
> > file /home/shubao.sx/hadoop-0.20.2-cdh3u3/conf/kadm5.keytab
> > 12/05/10 22:41:10 ERROR datanode.DataNode: java.lang.RuntimeException:
> > Cannot start secure cluster without privileged resources. In a secure
> > cluster, the DataNode must be started from within jsvc. If using Cloudera
> > packages, please install the hadoop-0.20-sbin package.
> >
> > For development purposes ONLY you may override this check by setting
> > dfs.datanode.require.secure.ports to false. *** THIS WILL OPEN A SECURITY
> > HOLE AND MUST NOT BE USED FOR A REAL CLUSTER ***.
> >         at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:331)
> >         at
> > org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:305)
> >         at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1606)
> >         at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1546)
> >         at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1564)
> >         at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1690)
> >         at
> > org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1707)
> >
> > And I install the jsvc command to start the datanode as followed(May be
> it
> > is wrong?), there is no log or error:
> >     /home/shubao.sx/commons-daemon-1.0.10/jsvc -Dproc_datanode -Xmx1000m
> > -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote
> > -Dhadoop.log.dir=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../logs
> > -Dhadoop.log.file=hadoop.log
> > -Dhadoop.home.dir=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/..
> > -Dhadoop.id.str=shubao.sx -Dhadoop.root.logger=INFO,console
> >
> -Djava.library.path=/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/native/Linux-amd64-64
> > -Dhadoop.policy.file=hadoop-policy.xml -classpath
> >
> /home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../conf:/home/shubao.sx/java6_64/lib/tools.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../hadoop-core-0.20.2-cdh3u3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/ant-contrib-1.0b3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/aspectjrt-1.6.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/aspectjtools-1.6.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-cli-1.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-codec-1.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-daemon-1.0.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-el-1.0.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-httpclient-3.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-lang-2.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-logging-1.0.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-logging-api-1.0.4.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/commons-net-1.4.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/core-3.1.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/guava-r09-jarjar.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/hadoop-fairscheduler-0.20.2-cdh3u3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/hsqldb-1.8.0.10.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jackson-core-asl-1.5.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jackson-mapper-asl-1.5.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jasper-compiler-5.5.12.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jasper-runtime-5.5.12.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jets3t-0.6.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jetty-util-6.1.26.cloudera.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsch-0.1.42.jar:/home/shubao.sx/ha
doop-0.20.2-cdh3u3/bin/../lib/junit-4.5.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/kfs-0.2.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/log4j-1.2.15.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/mockito-all-1.8.2.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/oro-2.0.8.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/servlet-api-2.5-20081211.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/servlet-api-2.5-6.1.14.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/slf4j-api-1.4.3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/slf4j-log4j12-1.4.3.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/xmlenc-0.52.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsp-2.1/jsp-2.1.jar:/home/shubao.sx/hadoop-0.20.2-cdh3u3/bin/../lib/jsp-2.1/jsp-api-2.1.jar:/home/shubao.sx/commons-daemon-1.0.10/commons-daemon-1.0.10.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/conf:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2-test.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/zookeeper-3.3.3.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/guava-r06.jar:/home/shubao.sx/commons-daemon-1.0.10/commons-daemon-1.0.10.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/conf:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/hbase-0.90.2-test.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/zookeeper-3.3.3.jar:/home/shubao.s/hbase-0.90.2-taobao-1.0RC5/lib/guava-r06.jar:
> > org.apache.hadoop.hdfs.server.datanode.DataNode
> >
> >
> > And I see the source code both cdh3u3 and hadoop-1.0.0, that the
> DataNode's
> > main() function:
> >    public static void main(String args[]) {
> >     secureMain(args, null);
> >   }
> >   Here the SecureResources is null, and there is such code:
> >     void startDataNode(Configuration conf,
> >                      AbstractList<File> dataDirs, SecureResources
> resources
> >                      ) throws IOException {
> >     if(UserGroupInformation.isSecurityEnabled() && resources == null)
> >       throw new RuntimeException("Cannot start secure cluster without " +
> >                 "privileged resources.");
> >     .........
> >
> >     The resources is always null, and if we setup security by kerberos,
> it
> > will always  throw new RuntimeException
> >
> >     Am I missing something?
> > --
> > Best wishes!
> > My Friend~
>
>
>
> --
> Harsh J
>



-- 
Best wishes!
My Friend~

Re: How to start up datanode with kerberos?

Posted by shixing <pa...@gmail.com>.
I downloaded the sbin rpm package, and just fine there are 2 program:

jsvc and task-controller, and I also run command replcat java to jsvc.

Maybe I use the jsvc wrongly? And there are special settings in cdh's
hadoop package for jsvc?

On Fri, May 11, 2012 at 1:57 PM, Harsh J <ha...@cloudera.com> wrote:

> (For the CDH question, am moving this to cdh-user@cloudera.org lists,
> bcc'd hdfs-user@apache)
>
> As the message goes, you're missing the sbin package in your cdh3u3
> install at least. I have secured DN working just fine here after
> following https://ccp.cloudera.com/display/CDHDOC/CDH3+Security+Guide
>
> On Thu, May 10, 2012 at 8:20 PM, shixing <pa...@gmail.com> wrote:
>
>
> --
> Harsh J
>



-- 
Best wishes!
My Friend~

Re: How to start up datanode with kerberos?

Posted by Harsh J <ha...@cloudera.com>.
(For the CDH question, am moving this to cdh-user@cloudera.org lists,
bcc'd hdfs-user@apache)

As the message goes, you're missing the sbin package in your cdh3u3
install at least. I have secured DN working just fine here after
following https://ccp.cloudera.com/display/CDHDOC/CDH3+Security+Guide

On Thu, May 10, 2012 at 8:20 PM, shixing <pa...@gmail.com> wrote:



-- 
Harsh J