Posted to user@knox.apache.org by Juan Carlos <jc...@redoop.org> on 2014/02/12 16:54:50 UTC

Having problems to start knox

I have a secured HDFS cluster running, and now I need to set up the
perimeter security. I have been following the user guide.
If I execute
kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl
--negotiate -i -k -u : -X GET '
http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS'
the output looks fine
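For comparison, a healthy LISTSTATUS response is WebHDFS JSON rather than an HTML error page. A minimal sketch (the file-status values below are invented; the envelope shape follows the WebHDFS REST API):

```shell
# Write an invented-but-representative LISTSTATUS response to a temp file.
cat <<'EOF' > /tmp/liststatus-sample.json
{"FileStatuses":{"FileStatus":[
  {"pathSuffix":"jcfernandez","type":"DIRECTORY","permission":"755",
   "owner":"jcfernandez","group":"hadoop","length":0}
]}}
EOF

# A quick check that a response body is WebHDFS JSON and not a Jetty error page:
grep -q '"FileStatuses"' /tmp/liststatus-sample.json && echo "looks like WebHDFS JSON"
```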

But if I try to do it through Knox, I receive some errors:
When executing:
kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl -i -k -u
jcfernandez -X GET '
https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS
'
 I receive
HTTP/1.1 500 Server Error
Set-Cookie:
JSESSIONID=rxsvzwqdzo1uv5852zeoqjrr;Path=/gateway/hdfscluster;Secure
Content-Type: text/html;charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-store
Content-Length: 21864
Server: Jetty(8.1.12.v20130726)

<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
<title>Error 500 Server Error</title>
</head>
<body><h2>HTTP ERROR 500</h2>
<p>Problem accessing /gateway/hdfscluster/webhdfs/v1/user. Reason:
<pre>    Server Error</pre></p><h3>Caused
by:</h3><pre>org.apache.shiro.subject.ExecutionException:
java.security.PrivilegedActionException: java.io.IOException: Service
connectivity error.
        at
org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:385)
.......
</pre>
<hr /><i><small>Powered by Jetty://</small></i><br/>

</body>
</html>

And on the server side:
14/02/12 15:12:43 DEBUG hadoop.gateway: Received request: GET
/webhdfs/v1/user?op=LISTSTATUS
14/02/12 15:12:43 DEBUG hadoop.gateway: Rewrote URL:
https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS,
direction: IN via explicit rule: WEBHDFS/webhdfs/inbound/namenode/file to
URL: http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS
14/02/12 15:12:43 DEBUG hadoop.gateway: Dispatch request: GET
http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
14/02/12 15:12:43 WARN protocol.RequestTargetAuthentication: NEGOTIATE
authentication error: No valid credentials provided (Mechanism level: No
valid credentials provided (Mechanism level: Attempt to obtain new INITIATE
credentials failed! (null)))
14/02/12 15:12:43 ERROR hadoop.gateway: Failed Knox->Hadoop SPNegotiation
authentication for URL:
http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
14/02/12 15:12:43 WARN hadoop.gateway: Connection exception dispatching
request:
http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
java.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
java.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
.....


And executing
java -jar bin/shell.jar ~/ExampleWebHdfsLs.groovy

Caught: org.apache.hadoop.gateway.shell.HadoopException:
org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
org.apache.hadoop.gateway.shell.HadoopException:
org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
        at
org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
        at org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown
Source)
        at ExampleWebHdfsLs.run(ExampleWebHdfsLs.groovy:28)
        at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
        at
org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
        at
org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
        at org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
        at org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
        at
org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500
Server Error
        at
org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
        at
org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
        at org.apache.hadoop.gateway.shell.hdfs.Ls
$Request.access$200(Ls.java:31)
        at org.apache.hadoop.gateway.shell.hdfs.Ls
$Request$1.call(Ls.java:51)
        at org.apache.hadoop.gateway.shell.hdfs.Ls
$Request$1.call(Ls.java:45)
        at
org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
        ... 8 more
......

Here are my config files:
*conf/gateway.xml*
<configuration>

    <property>
        <name>gateway.port</name>
        <value>8443</value>
        <description>The HTTP port for the Gateway.</description>
    </property>

    <property>
        <name>gateway.path</name>
        <value>gateway</value>
        <description>The default context path for the gateway.</description>
    </property>

    <property>
        <name>gateway.gateway.conf.dir</name>
        <value>deployments</value>
        <description>The directory within GATEWAY_HOME that contains
gateway topology files and deployments.</description>
    </property>

    <property>
        <name>gateway.hadoop.kerberos.secured</name>
        <value>true</value>
        <description>Boolean flag indicating whether the Hadoop cluster
protected by Gateway is secured with Kerberos</description>
    </property>

    <property>
        <name>java.security.krb5.conf</name>
        <value>/etc/krb5.conf</value>
        <description>Absolute path to krb5.conf file</description>
    </property>

    <property>
        <name>java.security.auth.login.config</name>
        <value>/etc/knox/config/krb5JAASLogin.conf</value>
        <description>Absolute path to JAAS login config file</description>
    </property>

    <property>
        <name>sun.security.krb5.debug</name>
        <value>false</value>
        <description>Boolean flag indicating whether to enable debug
messages for krb5 authentication</description>
    </property>

</configuration>


*conf/hdfscluster.xml:*
<topology>

    <gateway>

        <provider>
            <role>authentication</role>
            <name>ShiroProvider</name>
            <enabled>true</enabled>
            <param>
                <name>sessionTimeout</name>
                <value>30</value>
            </param>
            <param>
                <name>main.ldapRealm</name>
                <value>org.apache.shiro.realm.ldap.JndiLdapRealm</value>
            </param>
            <param>
                <name>main.ldapRealm.userDnTemplate</name>

<value>cn={0},ou=People,dc=jcfernandez,dc=cediant,dc=es</value>
            </param>
            <param>
                <name>main.ldapRealm.contextFactory.url</name>
                <value>ldap://jcr1.jcfernandez.cediant.es:389</value>
            </param>
            <param>

<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
                <value>simple</value>
            </param>
            <param>
                <name>urls./**</name>
                <value>authcBasic</value>
            </param>
        </provider>

<provider>
            <role>identity-assertion</role>
            <name>Pseudo</name>
            <enabled>true</enabled>
            <!--enabled>true</enabled-->
            <param>
                <name>group.principal.mapping</name>
                <value>*=hadoop;</value>
            </param>
        </provider>

        <provider>
            <role>authorization</role>
            <name>AclsAuthz</name>
            <enabled>False</enabled>
        </provider>
    </gateway>

    <service>
        <role>NAMENODE</role>
        <url>hdfs://jcr1.jcfernandez.cediant.es:8020</url>
    </service>

    <service>
        <role>JOBTRACKER</role>
        <url>rpc://jcr1.jcfernandez.cediant.es:8050</url>
    </service>

    <service>
        <role>WEBHDFS</role>
        <url>http://jcr1.jcfernandez.cediant.es:50070/webhdfs</url>

    </service>
</topology>

*/etc/knox/config/krb5JAASLogin.conf*
com.sun.security.jgss.initiate {
    com.sun.security.auth.module.Krb5LoginModule required
    renewTGT=true
    doNotPrompt=true
    useKeyTab=true
    keyTab="/opt/hadoop/security/knox.service.keytab"
    principal="knox/knox@JCFERNANDEZ.CEDIANT.ES"
    isInitiator=true
    storeKey=true
    useTicketCache=true
    client=true;
};
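As a sanity check on this login config, it may help to confirm the keytab actually contains the configured principal before starting the gateway. A hedged sketch (keytab path and principal copied from the config above; the commands are standard MIT Kerberos tools, guarded so they are skipped where unavailable):

```shell
KEYTAB=/opt/hadoop/security/knox.service.keytab
PRINCIPAL=knox/knox@JCFERNANDEZ.CEDIANT.ES

# List the principals stored in the keytab; the configured principal must appear.
if command -v klist >/dev/null 2>&1; then
    klist -kt "$KEYTAB" || echo "could not read keytab: $KEYTAB"
fi

# Try a non-interactive login with the keytab; a failure here points at the
# keytab/principal pair rather than at Knox itself.
if command -v kinit >/dev/null 2>&1; then
    kinit -kt "$KEYTAB" "$PRINCIPAL" && kdestroy || echo "keytab login failed"
fi
```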

Any help?
Regards.

Re: Having problems to start knox

Posted by Dilli Arumugam <da...@hortonworks.com>.
Thanks Juan for the update.
This is consistent with what I have seen and expected to see.
Dilli


On Wed, Feb 12, 2014 at 12:40 PM, Juan Carlos <jc...@redoop.org>wrote:

> The issue only happens when I start the gateway with the knox principal
> cached. As soon as I execute kdestroy it works perfectly, without restarting,
> and if I execute kinit -kt .... knox again it continues working perfectly.
> If I start the gateway with another principal cached, or none, it works fine.
> On the client side I don't need to destroy any principal to see this
> behavior.
> I hope this is helpful.
> Regards
>
>
> 2014-02-12 20:48 GMT+01:00 Dilli Arumugam <da...@hortonworks.com>:
>
> Hi Juan,
>>
>> Good to know you are unblocked.
>>
>> The following information would help:
>>
>>
>>    1. What was the principal cached when you hit the problem
>>    2. What was the principal cached when you do not hit the problem
>>
>> Thanks
>> Dilli
>>
>>
>>
>> On Wed, Feb 12, 2014 at 11:43 AM, Juan Carlos <jc...@redoop.org>wrote:
>>
>>> Thanks so much,
>>> that was the problem. After destroying the cached principal it worked fine. I'll
>>> try to reproduce the error again if it is interesting for you; I tried to
>>> start the gateway with another principal cached but now it doesn't cause
>>> any problem. Shall I try to reproduce the issue again, or is it useless for
>>> you?
>>> Regards
>>>
>>>
>>> 2014-02-12 19:38 GMT+01:00 Dilli Arumugam <da...@hortonworks.com>:
>>>
>>> Thanks Kevin.
>>>>
>>>> Thanks Juan for trying out Knox.
>>>>
>>>> Clearly Knox to Hadoop SPNego has failed.
>>>>
>>>> How did you start Knox? (meaning the command used at the terminal)
>>>>
>>>> Could you try the following:
>>>>
>>>>
>>>>    1. Stop Knox Server.
>>>>    2. From the same terminal window,  issue command: kdestroy
>>>>    3. Start Knox
>>>>    4. Repeat the test
>>>>    5. Share the observations
>>>>
>>>> Thanks
>>>> Dilli
>>>>
>>>>
>>>> On Wed, Feb 12, 2014 at 10:34 AM, Kevin Minder <
>>>> kevin.minder@hortonworks.com> wrote:
>>>>
>>>>>  Hi Juan,
>>>>> We look forward to helping you. I'm responding so that one of our
>>>>> other committers can see the thread, as he had an issue with his user@knox subscription. Please expect him to engage once he has the thread.
>>>>> Kevin.
>>>>>
>>>>>
>>>>> On 2/12/14 11:41 AM, Vinay Shukla wrote:
>>>>>
>>>>> Juan,
>>>>>
>>>>>
>>>>>  Thanks for the detailed email. Can you verify that the REALM name
>>>>> matches? (that the KDC REALM property in the Kerberos conf matches)
>>>>>
>>>>>
>>>>> On Wed, Feb 12, 2014 at 9:24 PM, Juan Carlos <jc...@redoop.org>wrote:
>>>>>> [Juan's original message quoted in full; elided as a verbatim duplicate of the first message above]
>>>>>
>>>>>
>>>>>
>>>>> CONFIDENTIALITY NOTICE
>>>>> NOTICE: This message is intended for the use of the individual or
>>>>> entity to which it is addressed and may contain information that is
>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>> notified that any printing, copying, dissemination, distribution,
>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>> you have received this communication in error, please contact the sender
>>>>> immediately and delete it from your system. Thank You.
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>
>


Re: Having problems to start knox

Posted by Juan Carlos <jc...@redoop.org>.
The issue only happens when I start the gateway with the knox principal
cached. As soon as I execute kdestroy it works perfectly, without restarting,
and if I execute kinit -kt .... knox again it continues working perfectly.
If I start the gateway with another principal cached, or none, it works fine.
On the client side I don't need to destroy any principal to see this
behavior.
I hope this is helpful.
Regards
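Condensing the fix that emerged in this thread into commands (a sketch; the gateway start/stop invocations are assumptions about the install layout, so they are shown commented out):

```shell
# 1. Stop the gateway if it is running.
# bin/gateway.sh stop             # assumed launcher path

# 2. Destroy any cached Kerberos ticket in this terminal's credential cache, so
#    the gateway's JAAS login relies only on the keytab (guarded in case the
#    Kerberos tools are not installed).
command -v kdestroy >/dev/null 2>&1 && kdestroy

# 3. Start the gateway from the same terminal.
# bin/gateway.sh start            # assumed launcher path

# 4. Re-caching a principal afterwards is harmless, per the report above.
# kinit -kt /opt/hadoop/security/knox.service.keytab knox/knox

MARKER="workaround applied"
echo "$MARKER"
```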


2014-02-12 20:48 GMT+01:00 Dilli Arumugam <da...@hortonworks.com>:

> Hi Juan,
>
> Good to know you are unblocked.
>
> The following information would help:
>
>
>    1. What was the principal cached when you hit the problem
>    2. What was the principal cached when you do not hit the problem
>
> Thanks
> Dilli
>
>
>
> On Wed, Feb 12, 2014 at 11:43 AM, Juan Carlos <jc...@redoop.org>wrote:
>
>> Thanks so much,
>> it was the problem. After destroying any principal it worked fine. I'll
>> try to reproduce the error again if it is interesting for you, I tried to
>> start the gateway with another principal cached but now it doesn't cause
>> any problem. Shall I try to reproduce the issue again or it is useless for
>> you?
>> Regards
>>
>>
>> 2014-02-12 19:38 GMT+01:00 Dilli Arumugam <da...@hortonworks.com>:
>>
>> Thanks Kevin.
>>>
>>> Thanks Juan for trying out Knox.
>>>
>>> Clearly Knox to Hadoop SPNego has failed.
>>>
>>> How did you  start Knox? (meaning the command used at terminal)
>>>
>>> Could you try the follwoing
>>>
>>>
>>>    1. Stop Knox Server.
>>>    2. From the same terminal window,  issue command: kdestroy
>>>    3. Start Knox
>>>    4. Repeat the test
>>>    5. Share the observations
>>>
>>> Thanks
>>> Dilli
>>>
>>>
>>> On Wed, Feb 12, 2014 at 10:34 AM, Kevin Minder <
>>> kevin.minder@hortonworks.com> wrote:
>>>
>>>>  Hi Juan,
>>>> We look forward to helping you.  I'm responding so that one of our
>>>> other committers can see the thread as he had an issue with his user@knoxsubscription.  Please expect him to engage once he has the thread.
>>>> Kevin.
>>>>
>>>>
>>>> On 2/12/14 11:41 AM, Vinay Shukla wrote:
>>>>
>>>> Juan,
>>>>
>>>>
>>>>  Thanks for the detailed email. Can you verify that the REALM name
>>>> matches? (KDC REALM property in kerberos conf matches )
>>>>
>>>>
>>>> On Wed, Feb 12, 2014 at 9:24 PM, Juan Carlos <jc...@redoop.org>wrote:
>>>>
>>>>> I have running a secured hdfs cluster, and now I need to set the
>>>>> peripherical security. I have been following the user guide.
>>>>> If I execute
>>>>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl
>>>>> --negotiate -i -k -u : -X GET '
>>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS
>>>>> '
>>>>> the output looks fine
>>>>>
>>>>>  But If I try to do it throw knox....I receive some errors:
>>>>> When executing:
>>>>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl -i
>>>>> -k -u jcfernandez -X GET '
>>>>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS
>>>>> '
>>>>>   I receive
>>>>>  HTTP/1.1 500 Server Error
>>>>> Set-Cookie:
>>>>> JSESSIONID=rxsvzwqdzo1uv5852zeoqjrr;Path=/gateway/hdfscluster;Secure
>>>>> Content-Type: text/html;charset=ISO-8859-1
>>>>> Cache-Control: must-revalidate,no-cache,no-store
>>>>> Content-Length: 21864
>>>>> Server: Jetty(8.1.12.v20130726)
>>>>>
>>>>>  <html>
>>>>> <head>
>>>>> <meta http-equiv="Content-Type" content="text/html;
>>>>> charset=ISO-8859-1"/>
>>>>> <title>Error 500 Server Error</title>
>>>>> </head>
>>>>> <body><h2>HTTP ERROR 500</h2>
>>>>> <p>Problem accessing /gateway/hdfscluster/webhdfs/v1/user. Reason:
>>>>> <pre>    Server Error</pre></p><h3>Caused
>>>>> by:</h3><pre>org.apache.shiro.subject.ExecutionException:
>>>>> java.security.PrivilegedActionException: java.io.IOException: Service
>>>>> connectivity error.
>>>>>         at
>>>>> org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:385)
>>>>> .......
>>>>> </pre>
>>>>> <hr /><i><small>Powered by Jetty://</small></i><br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>>  <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>> <br/>
>>>>>
>>>>>  </body>
>>>>> </html>
>>>>>
>>>>>  And in server side:
>>>>>  14/02/12 15:12:43 DEBUG hadoop.gateway: Received request: GET
>>>>> /webhdfs/v1/user?op=LISTSTATUS
>>>>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Rewrote URL:
>>>>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS,
>>>>> direction: IN via explicit rule: WEBHDFS/webhdfs/inbound/namenode/file to
>>>>> URL:
>>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS
>>>>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Dispatch request: GET
>>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>>>>> 14/02/12 15:12:43 WARN protocol.RequestTargetAuthentication: NEGOTIATE
>>>>> authentication error: No valid credentials provided (Mechanism level: No
>>>>> valid credentials provided (Mechanism level: Attempt to obtain new INITIATE
>>>>> credentials failed! (null)))
>>>>> 14/02/12 15:12:43 ERROR hadoop.gateway: Failed Knox->Hadoop
>>>>> SPNegotiation authentication for URL:
>>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>>>>> 14/02/12 15:12:43 WARN hadoop.gateway: Connection exception
>>>>> dispatching request:
>>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUSjava.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>>>>> java.io.IOException: SPNego authn failed, can not get hadoop.auth
>>>>> cookie
>>>>> .....
>>>>>
>>>>>
>>>>>  And executing
>>>>> java -jar bin/shell.jar ~/ExampleWebHdfsLs.groovy
>>>>>
>>>>>  Caught: org.apache.hadoop.gateway.shell.HadoopException:
>>>>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>>>>> org.apache.hadoop.gateway.shell.HadoopException:
>>>>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>>>>>         at
>>>>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
>>>>>         at
>>>>> org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown Source)
>>>>>         at ExampleWebHdfsLs.run(ExampleWebHdfsLs.groovy:28)
>>>>>         at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
>>>>>         at
>>>>> org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
>>>>>         at
>>>>> org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
>>>>>         at
>>>>> org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
>>>>>         at
>>>>> org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
>>>>>         at
>>>>> org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
>>>>> Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500
>>>>> Server Error
>>>>>         at
>>>>> org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
>>>>>         at
>>>>> org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
>>>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>>>> $Request.access$200(Ls.java:31)
>>>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>>>> $Request$1.call(Ls.java:51)
>>>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>>>> $Request$1.call(Ls.java:45)
>>>>>         at
>>>>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
>>>>>         ... 8 more
>>>>>  ......
>>>>>
>>>>>  Here are my config files:
>>>>> *conf/gateway.xml*
>>>>>  <configuration>
>>>>>
>>>>>      <property>
>>>>>          <name>gateway.port</name>
>>>>>         <value>8443</value>
>>>>>         <description>The HTTP port for the Gateway.</description>
>>>>>     </property>
>>>>>
>>>>>      <property>
>>>>>         <name>gateway.path</name>
>>>>>         <value>gateway</value>
>>>>>         <description>The default context path for the
>>>>> gateway.</description>
>>>>>     </property>
>>>>>
>>>>>      <property>
>>>>>         <name>gateway.gateway.conf.dir</name>
>>>>>         <value>deployments</value>
>>>>>         <description>The directory within GATEWAY_HOME that contains
>>>>> gateway topology files and deployments.</description>
>>>>>     </property>
>>>>>
>>>>>      <property>
>>>>>         <name>gateway.hadoop.kerberos.secured</name>
>>>>>         <value>true</value>
>>>>>         <description>Boolean flag indicating whether the Hadoop
>>>>> cluster protected by Gateway is secured with Kerberos</description>
>>>>>     </property>
>>>>>
>>>>>      <property>
>>>>>         <name>java.security.krb5.conf</name>
>>>>>         <value>/etc/krb5.conf</value>
>>>>>         <description>Absolute path to krb5.conf file</description>
>>>>>     </property>
>>>>>
>>>>>      <property>
>>>>>         <name>java.security.auth.login.config</name>
>>>>>         <value>/etc/knox/config/krb5JAASLogin.conf</value>
>>>>>         <description>Absolute path to JAAS login config
>>>>> file</description>
>>>>>     </property>
>>>>>
>>>>>      <property>
>>>>>         <name>sun.security.krb5.debug</name>
>>>>>         <value>false</value>
>>>>>         <description>Boolean flag indicating whether to enable debug
>>>>> messages for krb5 authentication</description>
>>>>>     </property>
>>>>>
>>>>>  </configuration>
>>>>>
>>>>>
>>>>>  *conf/hdfscluster.xml:*
>>>>> <topology>
>>>>>
>>>>>      <gateway>
>>>>>
>>>>>          <provider>
>>>>>             <role>authentication</role>
>>>>>             <name>ShiroProvider</name>
>>>>>             <enabled>true</enabled>
>>>>>             <param>
>>>>>                 <name>sessionTimeout</name>
>>>>>                  <value>30</value>
>>>>>             </param>
>>>>>             <param>
>>>>>                 <name>main.ldapRealm</name>
>>>>>
>>>>> <value>org.apache.shiro.realm.ldap.JndiLdapRealm</value>
>>>>>             </param>
>>>>>             <param>
>>>>>                 <name>main.ldapRealm.userDnTemplate</name>
>>>>>
>>>>> <value>cn={0},ou=People,dc=jcfernandez,dc=cediant,dc=es</value>
>>>>>             </param>
>>>>>             <param>
>>>>>                 <name>main.ldapRealm.contextFactory.url</name>
>>>>>                 <value>ldap://jcr1.jcfernandez.cediant.es:389</value>
>>>>>             </param>
>>>>>             <param>
>>>>>
>>>>> <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
>>>>>                 <value>simple</value>
>>>>>             </param>
>>>>>             <param>
>>>>>                 <name>urls./**</name>
>>>>>                 <value>authcBasic</value>
>>>>>             </param>
>>>>>         </provider>
>>>>>
>>>>>  <provider>
>>>>>             <role>identity-assertion</role>
>>>>>             <name>Pseudo</name>
>>>>>             <enabled>true</enabled>
>>>>>             <!--enabled>true</enabled-->
>>>>>             <param>
>>>>>                 <name>group.principal.mapping</name>
>>>>>                 <value>*=hadoop;</value>
>>>>>             </param>
>>>>>         </provider>
>>>>>
>>>>>          <provider>
>>>>>              <role>authorization</role>
>>>>>             <name>AclsAuthz</name>
>>>>>             <enabled>false</enabled>
>>>>>         </provider>
>>>>>     </gateway>
>>>>>
>>>>>      <service>
>>>>>         <role>NAMENODE</role>
>>>>>         <url>hdfs://jcr1.jcfernandez.cediant.es:8020</url>
>>>>>     </service>
>>>>>
>>>>>      <service>
>>>>>         <role>JOBTRACKER</role>
>>>>>         <url>rpc://jcr1.jcfernandez.cediant.es:8050</url>
>>>>>     </service>
>>>>>
>>>>>      <service>
>>>>>         <role>WEBHDFS</role>
>>>>>         <url>http://jcr1.jcfernandez.cediant.es:50070/webhdfs</url>
>>>>>
>>>>>      </service>
>>>>>  </topology>
>>>>>
>>>>>  */etc/knox/config/krb5JAASLogin.conf*
>>>>>  com.sun.security.jgss.initiate {
>>>>>     com.sun.security.auth.module.Krb5LoginModule required
>>>>>     renewTGT=true
>>>>>     doNotPrompt=true
>>>>>     useKeyTab=true
>>>>>     keyTab="/opt/hadoop/security/knox.service.keytab"
>>>>>     principal="knox/knox@JCFERNANDEZ.CEDIANT.ES"
>>>>>     isInitiator=true
>>>>>     storeKey=true
>>>>>     useTicketCache=true
>>>>>     client=true;
>>>>> };
>>>>>
>>>>>  Any help?
>>>>> Regards.
>>>>>
>>>>
>>>>
>>>>
>>>> CONFIDENTIALITY NOTICE
>>>> NOTICE: This message is intended for the use of the individual or
>>>> entity to which it is addressed and may contain information that is
>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>> If the reader of this message is not the intended recipient, you are hereby
>>>> notified that any printing, copying, dissemination, distribution,
>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>> you have received this communication in error, please contact the sender
>>>> immediately and delete it from your system. Thank You.
>>>
>>>
>>>
>>>
>>
>>
>
>

Re: Having problems to start knox

Posted by Dilli Arumugam <da...@hortonworks.com>.
Hi Juan,

Good to know you are unblocked.

The following information would help:


   1. Which principal was cached when you hit the problem?
   2. Which principal was cached when you did not hit the problem?

Thanks
Dilli
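
For reference, both questions can be answered by running `klist` in the terminal used to start the gateway. A minimal sketch follows; the `klist` output embedded below is simulated for illustration, not taken from this thread:

```shell
#!/bin/sh
# Simulated `klist` output -- on a real host, run `klist` directly in the
# terminal that starts the gateway to see which principal is cached.
klist_output='Ticket cache: FILE:/tmp/krb5cc_1000
Default principal: jcfernandez@JCFERNANDEZ.CEDIANT.ES'

# Extract the default principal so it can be reported back to the list.
principal=$(printf '%s\n' "$klist_output" | sed -n 's/^Default principal: //p')
echo "cached principal: $principal"
```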



On Wed, Feb 12, 2014 at 11:43 AM, Juan Carlos <jc...@redoop.org>wrote:

> Thanks so much,
> that was the problem. After destroying the cached principals it worked fine.
> I can try to reproduce the error again if it is useful for you; I tried to
> start the gateway with another principal cached, but now it doesn't cause
> any problem. Shall I try to reproduce the issue again, or is that not
> needed?
> Regards
>
>
> 2014-02-12 19:38 GMT+01:00 Dilli Arumugam <da...@hortonworks.com>:
>
> Thanks Kevin.
>>
>> Thanks Juan for trying out Knox.
>>
>> Clearly Knox to Hadoop SPNego has failed.
>>
>> How did you  start Knox? (meaning the command used at terminal)
>>
>> Could you try the following:
>>
>>
>>    1. Stop Knox Server.
>>    2. From the same terminal window,  issue command: kdestroy
>>    3. Start Knox
>>    4. Repeat the test
>>    5. Share the observations
>>
>> Thanks
>> Dilli
>>
>>
>> On Wed, Feb 12, 2014 at 10:34 AM, Kevin Minder <
>> kevin.minder@hortonworks.com> wrote:
>>
>>>  Hi Juan,
>>> We look forward to helping you.  I'm responding so that one of our other
>>> committers can see the thread, as he had an issue with his user@knox subscription.  Please expect him to engage once he has the thread.
>>> Kevin.
>>>
>>>
>>> On 2/12/14 11:41 AM, Vinay Shukla wrote:
>>>
>>> Juan,
>>>
>>>
>>>  Thanks for the detailed email. Can you verify that the REALM name
>>> matches? (KDC REALM property in kerberos conf matches )
>>>
>>>
>>> On Wed, Feb 12, 2014 at 9:24 PM, Juan Carlos <jc...@redoop.org>wrote:
>>>
>>>> I have a secured HDFS cluster running, and now I need to set up the
>>>> perimeter security. I have been following the user guide.
>>>> If I execute
>>>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl
>>>> --negotiate -i -k -u : -X GET '
>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS'
>>>> the output looks fine
>>>>
>>>>  But if I try to do it through Knox, I receive some errors:
>>>> When executing:
>>>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl -i
>>>> -k -u jcfernandez -X GET '
>>>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS
>>>> '
>>>>   I receive
>>>>  HTTP/1.1 500 Server Error
>>>> Set-Cookie:
>>>> JSESSIONID=rxsvzwqdzo1uv5852zeoqjrr;Path=/gateway/hdfscluster;Secure
>>>> Content-Type: text/html;charset=ISO-8859-1
>>>> Cache-Control: must-revalidate,no-cache,no-store
>>>> Content-Length: 21864
>>>> Server: Jetty(8.1.12.v20130726)
>>>>
>>>>  <html>
>>>> <head>
>>>> <meta http-equiv="Content-Type" content="text/html;
>>>> charset=ISO-8859-1"/>
>>>> <title>Error 500 Server Error</title>
>>>> </head>
>>>> <body><h2>HTTP ERROR 500</h2>
>>>> <p>Problem accessing /gateway/hdfscluster/webhdfs/v1/user. Reason:
>>>> <pre>    Server Error</pre></p><h3>Caused
>>>> by:</h3><pre>org.apache.shiro.subject.ExecutionException:
>>>> java.security.PrivilegedActionException: java.io.IOException: Service
>>>> connectivity error.
>>>>         at
>>>> org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:385)
>>>> .......
>>>> </pre>
>>>> <hr /><i><small>Powered by Jetty://</small></i><br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>>  <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>> <br/>
>>>>
>>>>  </body>
>>>> </html>
>>>>
>>>>  And in server side:
>>>>  14/02/12 15:12:43 DEBUG hadoop.gateway: Received request: GET
>>>> /webhdfs/v1/user?op=LISTSTATUS
>>>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Rewrote URL:
>>>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS,
>>>> direction: IN via explicit rule: WEBHDFS/webhdfs/inbound/namenode/file to
>>>> URL:
>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS
>>>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Dispatch request: GET
>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>>>> 14/02/12 15:12:43 WARN protocol.RequestTargetAuthentication: NEGOTIATE
>>>> authentication error: No valid credentials provided (Mechanism level: No
>>>> valid credentials provided (Mechanism level: Attempt to obtain new INITIATE
>>>> credentials failed! (null)))
>>>> 14/02/12 15:12:43 ERROR hadoop.gateway: Failed Knox->Hadoop
>>>> SPNegotiation authentication for URL:
>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>>>> 14/02/12 15:12:43 WARN hadoop.gateway: Connection exception dispatching
>>>> request:
>>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUSjava.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>>>> java.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>>>> .....
>>>>
>>>>
>>>>  And executing
>>>> java -jar bin/shell.jar ~/ExampleWebHdfsLs.groovy
>>>>
>>>>  Caught: org.apache.hadoop.gateway.shell.HadoopException:
>>>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>>>> org.apache.hadoop.gateway.shell.HadoopException:
>>>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>>>>         at
>>>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
>>>>         at
>>>> org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown Source)
>>>>         at ExampleWebHdfsLs.run(ExampleWebHdfsLs.groovy:28)
>>>>         at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
>>>>         at
>>>> org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
>>>>         at
>>>> org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
>>>>         at
>>>> org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
>>>>         at
>>>> org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
>>>>         at
>>>> org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
>>>> Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500
>>>> Server Error
>>>>         at
>>>> org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
>>>>         at
>>>> org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
>>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>>> $Request.access$200(Ls.java:31)
>>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>>> $Request$1.call(Ls.java:51)
>>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>>> $Request$1.call(Ls.java:45)
>>>>         at
>>>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
>>>>         ... 8 more
>>>>  ......
>>>>
>>>>  Here are my config files:
>>>> *conf/gateway.xml*
>>>>  <configuration>
>>>>
>>>>      <property>
>>>>          <name>gateway.port</name>
>>>>         <value>8443</value>
>>>>         <description>The HTTP port for the Gateway.</description>
>>>>     </property>
>>>>
>>>>      <property>
>>>>         <name>gateway.path</name>
>>>>         <value>gateway</value>
>>>>         <description>The default context path for the
>>>> gateway.</description>
>>>>     </property>
>>>>
>>>>      <property>
>>>>         <name>gateway.gateway.conf.dir</name>
>>>>         <value>deployments</value>
>>>>         <description>The directory within GATEWAY_HOME that contains
>>>> gateway topology files and deployments.</description>
>>>>     </property>
>>>>
>>>>      <property>
>>>>         <name>gateway.hadoop.kerberos.secured</name>
>>>>         <value>true</value>
>>>>         <description>Boolean flag indicating whether the Hadoop cluster
>>>> protected by Gateway is secured with Kerberos</description>
>>>>     </property>
>>>>
>>>>      <property>
>>>>         <name>java.security.krb5.conf</name>
>>>>         <value>/etc/krb5.conf</value>
>>>>         <description>Absolute path to krb5.conf file</description>
>>>>     </property>
>>>>
>>>>      <property>
>>>>         <name>java.security.auth.login.config</name>
>>>>         <value>/etc/knox/config/krb5JAASLogin.conf</value>
>>>>         <description>Absolute path to JAAS login config
>>>> file</description>
>>>>     </property>
>>>>
>>>>      <property>
>>>>         <name>sun.security.krb5.debug</name>
>>>>         <value>false</value>
>>>>         <description>Boolean flag indicating whether to enable debug
>>>> messages for krb5 authentication</description>
>>>>     </property>
>>>>
>>>>  </configuration>
>>>>
>>>>
>>>>  *conf/hdfscluster.xml:*
>>>> <topology>
>>>>
>>>>      <gateway>
>>>>
>>>>          <provider>
>>>>             <role>authentication</role>
>>>>             <name>ShiroProvider</name>
>>>>             <enabled>true</enabled>
>>>>             <param>
>>>>                 <name>sessionTimeout</name>
>>>>                  <value>30</value>
>>>>             </param>
>>>>             <param>
>>>>                 <name>main.ldapRealm</name>
>>>>                 <value>org.apache.shiro.realm.ldap.JndiLdapRealm</value>
>>>>             </param>
>>>>             <param>
>>>>                 <name>main.ldapRealm.userDnTemplate</name>
>>>>
>>>> <value>cn={0},ou=People,dc=jcfernandez,dc=cediant,dc=es</value>
>>>>             </param>
>>>>             <param>
>>>>                 <name>main.ldapRealm.contextFactory.url</name>
>>>>                 <value>ldap://jcr1.jcfernandez.cediant.es:389</value>
>>>>             </param>
>>>>             <param>
>>>>
>>>> <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
>>>>                 <value>simple</value>
>>>>             </param>
>>>>             <param>
>>>>                 <name>urls./**</name>
>>>>                 <value>authcBasic</value>
>>>>             </param>
>>>>         </provider>
>>>>
>>>>  <provider>
>>>>             <role>identity-assertion</role>
>>>>             <name>Pseudo</name>
>>>>             <enabled>true</enabled>
>>>>             <!--enabled>true</enabled-->
>>>>             <param>
>>>>                 <name>group.principal.mapping</name>
>>>>                 <value>*=hadoop;</value>
>>>>             </param>
>>>>         </provider>
>>>>
>>>>          <provider>
>>>>              <role>authorization</role>
>>>>             <name>AclsAuthz</name>
>>>>             <enabled>false</enabled>
>>>>         </provider>
>>>>     </gateway>
>>>>
>>>>      <service>
>>>>         <role>NAMENODE</role>
>>>>         <url>hdfs://jcr1.jcfernandez.cediant.es:8020</url>
>>>>     </service>
>>>>
>>>>      <service>
>>>>         <role>JOBTRACKER</role>
>>>>         <url>rpc://jcr1.jcfernandez.cediant.es:8050</url>
>>>>     </service>
>>>>
>>>>      <service>
>>>>         <role>WEBHDFS</role>
>>>>         <url>http://jcr1.jcfernandez.cediant.es:50070/webhdfs</url>
>>>>
>>>>      </service>
>>>>  </topology>
>>>>
>>>>  */etc/knox/config/krb5JAASLogin.conf*
>>>>  com.sun.security.jgss.initiate {
>>>>     com.sun.security.auth.module.Krb5LoginModule required
>>>>     renewTGT=true
>>>>     doNotPrompt=true
>>>>     useKeyTab=true
>>>>     keyTab="/opt/hadoop/security/knox.service.keytab"
>>>>     principal="knox/knox@JCFERNANDEZ.CEDIANT.ES"
>>>>     isInitiator=true
>>>>     storeKey=true
>>>>     useTicketCache=true
>>>>     client=true;
>>>> };
>>>>
>>>>  Any help?
>>>> Regards.
>>>>
>>>
>>>
>>>
>>
>>
>>
>>
>
>


Re: Having problems to start knox

Posted by Juan Carlos <jc...@redoop.org>.
Thanks so much,
that was the problem. After destroying the cached principals it worked fine.
I can try to reproduce the error again if it is useful for you; I tried to
start the gateway with another principal cached, but now it doesn't cause any
problem. Shall I try to reproduce the issue again, or is that not needed?
Regards
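
The workaround described above can be captured in a small start wrapper. This is a hedged sketch; GATEWAY_HOME and the commented-out start command are assumptions, not taken from this thread:

```shell
#!/bin/sh
# Destroy any Kerberos tickets cached in this terminal before starting Knox,
# so the gateway authenticates only from its keytab via the JAAS config.
GATEWAY_HOME=${GATEWAY_HOME:-/opt/knox}     # assumed install location
if command -v kdestroy >/dev/null 2>&1; then
  kdestroy 2>/dev/null || true              # ignore "no credentials cache" errors
fi
echo "starting gateway from $GATEWAY_HOME"
# java -jar "$GATEWAY_HOME/bin/gateway.jar"   # uncomment on a real host
```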


2014-02-12 19:38 GMT+01:00 Dilli Arumugam <da...@hortonworks.com>:

> Thanks Kevin.
>
> Thanks Juan for trying out Knox.
>
> Clearly Knox to Hadoop SPNego has failed.
>
> How did you  start Knox? (meaning the command used at terminal)
>
> Could you try the following:
>
>
>    1. Stop Knox Server.
>    2. From the same terminal window,  issue command: kdestroy
>    3. Start Knox
>    4. Repeat the test
>    5. Share the observations
>
> Thanks
> Dilli
>
>
> On Wed, Feb 12, 2014 at 10:34 AM, Kevin Minder <
> kevin.minder@hortonworks.com> wrote:
>
>>  Hi Juan,
>> We look forward to helping you.  I'm responding so that one of our other
>> committers can see the thread, as he had an issue with his user@knox subscription.  Please expect him to engage once he has the thread.
>> Kevin.
>>
>>
>> On 2/12/14 11:41 AM, Vinay Shukla wrote:
>>
>> Juan,
>>
>>
>>  Thanks for the detailed email. Can you verify that the REALM name
>> matches? (KDC REALM property in kerberos conf matches )
>>
>>
>> On Wed, Feb 12, 2014 at 9:24 PM, Juan Carlos <jc...@redoop.org>wrote:
>>
>>> I have a secured HDFS cluster running, and now I need to set up the
>>> perimeter security. I have been following the user guide.
>>> If I execute
>>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl
>>> --negotiate -i -k -u : -X GET '
>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS'
>>> the output looks fine
>>>
>>>  But if I try to do it through Knox, I receive some errors:
>>> When executing:
>>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl -i -k
>>> -u jcfernandez -X GET '
>>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS
>>> '
>>>   I receive
>>>  HTTP/1.1 500 Server Error
>>> Set-Cookie:
>>> JSESSIONID=rxsvzwqdzo1uv5852zeoqjrr;Path=/gateway/hdfscluster;Secure
>>> Content-Type: text/html;charset=ISO-8859-1
>>> Cache-Control: must-revalidate,no-cache,no-store
>>> Content-Length: 21864
>>> Server: Jetty(8.1.12.v20130726)
>>>
>>>  <html>
>>> <head>
>>> <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
>>> <title>Error 500 Server Error</title>
>>> </head>
>>> <body><h2>HTTP ERROR 500</h2>
>>> <p>Problem accessing /gateway/hdfscluster/webhdfs/v1/user. Reason:
>>> <pre>    Server Error</pre></p><h3>Caused
>>> by:</h3><pre>org.apache.shiro.subject.ExecutionException:
>>> java.security.PrivilegedActionException: java.io.IOException: Service
>>> connectivity error.
>>>         at
>>> org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:385)
>>> .......
>>> </pre>
>>> <hr /><i><small>Powered by Jetty://</small></i><br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>>  <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>> <br/>
>>>
>>>  </body>
>>> </html>
>>>
>>>  And in server side:
>>>  14/02/12 15:12:43 DEBUG hadoop.gateway: Received request: GET
>>> /webhdfs/v1/user?op=LISTSTATUS
>>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Rewrote URL:
>>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS,
>>> direction: IN via explicit rule: WEBHDFS/webhdfs/inbound/namenode/file to
>>> URL:
>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS
>>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Dispatch request: GET
>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>>> 14/02/12 15:12:43 WARN protocol.RequestTargetAuthentication: NEGOTIATE
>>> authentication error: No valid credentials provided (Mechanism level: No
>>> valid credentials provided (Mechanism level: Attempt to obtain new INITIATE
>>> credentials failed! (null)))
>>> 14/02/12 15:12:43 ERROR hadoop.gateway: Failed Knox->Hadoop
>>> SPNegotiation authentication for URL:
>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>>> 14/02/12 15:12:43 WARN hadoop.gateway: Connection exception dispatching
>>> request:
>>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUSjava.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>>> java.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>>> .....
>>>
>>>
>>>  And executing
>>> java -jar bin/shell.jar ~/ExampleWebHdfsLs.groovy
>>>
>>>  Caught: org.apache.hadoop.gateway.shell.HadoopException:
>>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>>> org.apache.hadoop.gateway.shell.HadoopException:
>>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>>>         at
>>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
>>>         at
>>> org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown Source)
>>>         at ExampleWebHdfsLs.run(ExampleWebHdfsLs.groovy:28)
>>>         at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
>>>         at
>>> org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
>>>         at
>>> org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
>>>         at
>>> org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
>>>         at
>>> org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
>>>         at
>>> org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
>>> Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500
>>> Server Error
>>>         at
>>> org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
>>>         at
>>> org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>> $Request.access$200(Ls.java:31)
>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>> $Request$1.call(Ls.java:51)
>>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>>> $Request$1.call(Ls.java:45)
>>>         at
>>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
>>>         ... 8 more
>>>  ......
>>>
>>>  Here are my config files:
>>> *conf/gateway.xml*
>>>  <configuration>
>>>
>>>      <property>
>>>          <name>gateway.port</name>
>>>         <value>8443</value>
>>>         <description>The HTTP port for the Gateway.</description>
>>>     </property>
>>>
>>>      <property>
>>>         <name>gateway.path</name>
>>>         <value>gateway</value>
>>>         <description>The default context path for the
>>> gateway.</description>
>>>     </property>
>>>
>>>      <property>
>>>         <name>gateway.gateway.conf.dir</name>
>>>         <value>deployments</value>
>>>         <description>The directory within GATEWAY_HOME that contains
>>> gateway topology files and deployments.</description>
>>>     </property>
>>>
>>>      <property>
>>>         <name>gateway.hadoop.kerberos.secured</name>
>>>         <value>true</value>
>>>         <description>Boolean flag indicating whether the Hadoop cluster
>>> protected by Gateway is secured with Kerberos</description>
>>>     </property>
>>>
>>>      <property>
>>>         <name>java.security.krb5.conf</name>
>>>         <value>/etc/krb5.conf</value>
>>>         <description>Absolute path to krb5.conf file</description>
>>>     </property>
>>>
>>>      <property>
>>>         <name>java.security.auth.login.config</name>
>>>         <value>/etc/knox/config/krb5JAASLogin.conf</value>
>>>         <description>Absolute path to JAAS login config
>>> file</description>
>>>     </property>
>>>
>>>      <property>
>>>         <name>sun.security.krb5.debug</name>
>>>         <value>false</value>
>>>         <description>Boolean flag indicating whether to enable debug
>>> messages for krb5 authentication</description>
>>>     </property>
>>>
>>>  </configuration>
>>>
>>>
>>>  *conf/hdfscluster.xml:*
>>> <topology>
>>>
>>>      <gateway>
>>>
>>>          <provider>
>>>             <role>authentication</role>
>>>             <name>ShiroProvider</name>
>>>             <enabled>true</enabled>
>>>             <param>
>>>                 <name>sessionTimeout</name>
>>>                  <value>30</value>
>>>             </param>
>>>             <param>
>>>                 <name>main.ldapRealm</name>
>>>                 <value>org.apache.shiro.realm.ldap.JndiLdapRealm</value>
>>>             </param>
>>>             <param>
>>>                 <name>main.ldapRealm.userDnTemplate</name>
>>>
>>> <value>cn={0},ou=People,dc=jcfernandez,dc=cediant,dc=es</value>
>>>             </param>
>>>             <param>
>>>                 <name>main.ldapRealm.contextFactory.url</name>
>>>                 <value>ldap://jcr1.jcfernandez.cediant.es:389</value>
>>>             </param>
>>>             <param>
>>>
>>> <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
>>>                 <value>simple</value>
>>>             </param>
>>>             <param>
>>>                 <name>urls./**</name>
>>>                 <value>authcBasic</value>
>>>             </param>
>>>         </provider>
>>>
>>>  <provider>
>>>             <role>identity-assertion</role>
>>>             <name>Pseudo</name>
>>>             <enabled>true</enabled>
>>>             <!--enabled>true</enabled-->
>>>             <param>
>>>                 <name>group.principal.mapping</name>
>>>                 <value>*=hadoop;</value>
>>>             </param>
>>>         </provider>
>>>
>>>          <provider>
>>>              <role>authorization</role>
>>>             <name>AclsAuthz</name>
>>>             <enabled>false</enabled>
>>>         </provider>
>>>     </gateway>
>>>
>>>      <service>
>>>         <role>NAMENODE</role>
>>>         <url>hdfs://jcr1.jcfernandez.cediant.es:8020</url>
>>>     </service>
>>>
>>>      <service>
>>>         <role>JOBTRACKER</role>
>>>         <url>rpc://jcr1.jcfernandez.cediant.es:8050</url>
>>>     </service>
>>>
>>>      <service>
>>>         <role>WEBHDFS</role>
>>>         <url>http://jcr1.jcfernandez.cediant.es:50070/webhdfs</url>
>>>
>>>      </service>
>>>  </topology>
>>>
>>>  */etc/knox/config/krb5JAASLogin.conf*
>>>  com.sun.security.jgss.initiate {
>>>     com.sun.security.auth.module.Krb5LoginModule required
>>>     renewTGT=true
>>>     doNotPrompt=true
>>>     useKeyTab=true
>>>     keyTab="/opt/hadoop/security/knox.service.keytab"
>>>     principal="knox/knox@JCFERNANDEZ.CEDIANT.ES"
>>>     isInitiator=true
>>>     storeKey=true
>>>     useTicketCache=true
>>>     client=true;
>>> };
>>>
>>>  Any help?
>>> Regards.
>>>
>>
>>
>>
>
>
>
>

Re: Having problems to start knox

Posted by Dilli Arumugam <da...@hortonworks.com>.
Thanks Kevin.

Thanks Juan for trying out Knox.

Clearly the Knox-to-Hadoop SPNEGO handshake has failed.

How did you start Knox? (meaning the exact command used at the terminal)

Could you try the following:


   1. Stop Knox Server.
   2. From the same terminal window,  issue command: kdestroy
   3. Start Knox
   4. Repeat the test
   5. Share the observations
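
While repeating the test, it is also worth ruling out a JAAS syntax problem: an unbalanced quote in krb5JAASLogin.conf makes the com.sun.security.jgss.initiate login fail with exactly the "Attempt to obtain new INITIATE credentials failed" error shown in the gateway log, and the principal line in the posted file does appear to be missing its closing quote. A rough sketch of a check for such lines; the heredoc simply mirrors the config from the thread, and the odd-quote-count heuristic is not a real JAAS parser:

```shell
# Scratch copy of the JAAS config as posted in the thread (note the
# principal line, which lacks a closing double quote).
jaas=$(mktemp)
cat > "$jaas" <<'EOF'
com.sun.security.jgss.initiate {
   com.sun.security.auth.module.Krb5LoginModule required
   keyTab="/opt/hadoop/security/knox.service.keytab"
   principal="knox/knox@JCFERNANDEZ.CEDIANT.ES
   storeKey=true;
};
EOF
# Flag any line containing an odd number of double quotes.
awk '{ l = $0; n = gsub(/"/, "", l); if (n % 2) print "odd quote count, line " NR ": " $0 }' "$jaas"
rm -f "$jaas"
```

Run against the posted config this flags the principal line, which would be the first thing to fix before restarting Knox.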

Thanks
Dilli


On Wed, Feb 12, 2014 at 10:34 AM, Kevin Minder <kevin.minder@hortonworks.com
> wrote:

>  Hi Juan,
> We look forward to helping you.  I'm responding so that one of our other
> committers can see the thread as he had an issue with his user@knox
> subscription.  Please expect him to engage once he has the thread.
> Kevin.
>
>
> On 2/12/14 11:41 AM, Vinay Shukla wrote:
>
> Juan,
>
>
>  Thanks for the detailed email. Can you verify that the REALM name
> matches? (KDC REALM property in kerberos conf matches )
>
>
> On Wed, Feb 12, 2014 at 9:24 PM, Juan Carlos <jc...@redoop.org>wrote:
>
>> I have running a secured hdfs cluster, and now I need to set the
>> peripherical security. I have been following the user guide.
>> If I execute
>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl
>> --negotiate -i -k -u : -X GET '
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS'
>> the output looks fine
>>
>>  But If I try to do it throw knox....I receive some errors:
>> When executing:
>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl -i -k
>> -u jcfernandez -X GET '
>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS
>> '
>>   I receive
>>  HTTP/1.1 500 Server Error
>> Set-Cookie:
>> JSESSIONID=rxsvzwqdzo1uv5852zeoqjrr;Path=/gateway/hdfscluster;Secure
>> Content-Type: text/html;charset=ISO-8859-1
>> Cache-Control: must-revalidate,no-cache,no-store
>> Content-Length: 21864
>> Server: Jetty(8.1.12.v20130726)
>>
>>  <html>
>> <head>
>> <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
>> <title>Error 500 Server Error</title>
>> </head>
>> <body><h2>HTTP ERROR 500</h2>
>> <p>Problem accessing /gateway/hdfscluster/webhdfs/v1/user. Reason:
>> <pre>    Server Error</pre></p><h3>Caused
>> by:</h3><pre>org.apache.shiro.subject.ExecutionException:
>> java.security.PrivilegedActionException: java.io.IOException: Service
>> connectivity error.
>>         at
>> org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:385)
>> .......
>> </pre>
>> <hr /><i><small>Powered by Jetty://</small></i><br/>
>>
>>  </body>
>> </html>
>>
>>  And in server side:
>>  14/02/12 15:12:43 DEBUG hadoop.gateway: Received request: GET
>> /webhdfs/v1/user?op=LISTSTATUS
>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Rewrote URL:
>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS,
>> direction: IN via explicit rule: WEBHDFS/webhdfs/inbound/namenode/file to
>> URL:
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS
>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Dispatch request: GET
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>> 14/02/12 15:12:43 WARN protocol.RequestTargetAuthentication: NEGOTIATE
>> authentication error: No valid credentials provided (Mechanism level: No
>> valid credentials provided (Mechanism level: Attempt to obtain new INITIATE
>> credentials failed! (null)))
>> 14/02/12 15:12:43 ERROR hadoop.gateway: Failed Knox->Hadoop SPNegotiation
>> authentication for URL:
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>> 14/02/12 15:12:43 WARN hadoop.gateway: Connection exception dispatching
>> request:
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>> java.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>> java.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>> .....
>>
>>
>>  And executing
>> java -jar bin/shell.jar ~/ExampleWebHdfsLs.groovy
>>
>>  Caught: org.apache.hadoop.gateway.shell.HadoopException:
>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>> org.apache.hadoop.gateway.shell.HadoopException:
>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>>         at
>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
>>         at
>> org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown Source)
>>         at ExampleWebHdfsLs.run(ExampleWebHdfsLs.groovy:28)
>>         at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
>>         at
>> org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
>>         at
>> org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
>>         at
>> org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
>>         at
>> org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
>>         at
>> org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
>> Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500
>> Server Error
>>         at
>> org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
>>         at
>> org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>> $Request.access$200(Ls.java:31)
>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>> $Request$1.call(Ls.java:51)
>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>> $Request$1.call(Ls.java:45)
>>         at
>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
>>         ... 8 more
>>  ......
>>
>>  Here are my config files:
>> *conf/gateway.xml*
>>  <configuration>
>>
>>      <property>
>>          <name>gateway.port</name>
>>         <value>8443</value>
>>         <description>The HTTP port for the Gateway.</description>
>>     </property>
>>
>>      <property>
>>         <name>gateway.path</name>
>>         <value>gateway</value>
>>         <description>The default context path for the
>> gateway.</description>
>>     </property>
>>
>>      <property>
>>         <name>gateway.gateway.conf.dir</name>
>>         <value>deployments</value>
>>         <description>The directory within GATEWAY_HOME that contains
>> gateway topology files and deployments.</description>
>>     </property>
>>
>>      <property>
>>         <name>gateway.hadoop.kerberos.secured</name>
>>         <value>true</value>
>>         <description>Boolean flag indicating whether the Hadoop cluster
>> protected by Gateway is secured with Kerberos</description>
>>     </property>
>>
>>      <property>
>>         <name>java.security.krb5.conf</name>
>>         <value>/etc/krb5.conf</value>
>>         <description>Absolute path to krb5.conf file</description>
>>     </property>
>>
>>      <property>
>>         <name>java.security.auth.login.config</name>
>>         <value>/etc/knox/config/krb5JAASLogin.conf</value>
>>         <description>Absolute path to JASS login config file</description>
>>     </property>
>>
>>      <property>
>>         <name>sun.security.krb5.debug</name>
>>         <value>false</value>
>>         <description>Boolean flag indicating whether to enable debug
>> messages for krb5 authentication</description>
>>     </property>
>>
>>  </configuration>
>>
>>
>>  *conf/hdfscluster.xml:*
>> <topology>
>>
>>      <gateway>
>>
>>          <provider>
>>             <role>authentication</role>
>>             <name>ShiroProvider</name>
>>             <enabled>true</enabled>
>>             <param>
>>                 <name>sessionTimeout</name>
>>                  <value>30</value>
>>             </param>
>>             <param>
>>                 <name>main.ldapRealm</name>
>>                 <value>org.apache.shiro.realm.ldap.JndiLdapRealm</value>
>>             </param>
>>             <param>
>>                 <name>main.ldapRealm.userDnTemplate</name>
>>
>> <value>cn={0},ou=People,dc=jcfernandez,dc=cediant,dc=es</value>
>>             </param>
>>             <param>
>>                 <name>main.ldapRealm.contextFactory.url</name>
>>                 <value>ldap://jcr1.jcfernandez.cediant.es:389</value>
>>             </param>
>>             <param>
>>
>> <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
>>                 <value>simple</value>
>>             </param>
>>             <param>
>>                 <name>urls./**</name>
>>                 <value>authcBasic</value>
>>             </param>
>>         </provider>
>>
>>  <provider>
>>             <role>identity-assertion</role>
>>             <name>Pseudo</name>
>>             <enabled>true</enabled>
>>             <!--enabled>true</enabled-->
>>             <param>
>>                 <name>group.principal.mapping</name>
>>                 <value>*=hadoop;</value>
>>             </param>
>>         </provider>
>>
>>          <provider>
>>              <role>authorization</role>
>>             <name>AclsAuthz</name>
>>             <enabled>False</enabled>
>>         </provider>
>>     </gateway>
>>
>>      <service>
>>         <role>NAMENODE</role>
>>         <url>hdfs://jcr1.jcfernandez.cediant.es:8020</url>
>>     </service>
>>
>>      <service>
>>         <role>JOBTRACKER</role>
>>         <url>rpc://jcr1.jcfernandez.cediant.es:8050</url>
>>     </service>
>>
>>      <service>
>>         <role>WEBHDFS</role>
>>         <url>http://jcr1.jcfernandez.cediant.es:50070/webhdfs</url>
>>
>>      </service>
>>  </topology>
>>
>>  */etc/knox/config/krb5JAASLogin.conf*
>>  com.sun.security.jgss.initiate {
>>     com.sun.security.auth.module.Krb5LoginModule required
>>     renewTGT=true
>>     doNotPrompt=true
>>     useKeyTab=true
>>     keyTab="/opt/hadoop/security/knox.service.keytab"
>>     principal="knox/knox@JCFERNANDEZ.CEDIANT.ES
>>     isInitiator=true
>>     storeKey=true
>>     useTicketCache=true
>>     client=true;
>> };
>>
>>  Any help?
>> Regards.
>>
>


Re: Having problems to start knox

Posted by Kevin Minder <ke...@hortonworks.com>.
Hi Juan,
We look forward to helping you.  I'm responding so that one of our other 
committers can see the thread as he had an issue with his user@knox 
subscription.  Please expect him to engage once he has the thread.
Kevin.

On 2/12/14 11:41 AM, Vinay Shukla wrote:
> Juan,
>
>
> Thanks for the detailed email. Can you verify that the REALM name 
> matches? (KDC REALM property in kerberos conf matches )
>
>
> On Wed, Feb 12, 2014 at 9:24 PM, Juan Carlos <jcfernandez@redoop.org 
> <ma...@redoop.org>> wrote:
>
>     [Juan's original message quoted in full; snipped]
>
>



Re: Having problems to start knox

Posted by Juan Carlos <jc...@redoop.org>.
This is my krb5.conf; I think it is fine:

[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 default_realm = JCFERNANDEZ.CEDIANT.ES
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true

[realms]
JCFERNANDEZ.CEDIANT.ES = {
  database_module = LDAP
  kdc = jcr1.jcfernandez.cediant.es
  admin_server = jcr1.jcfernandez.cediant.es
  default_domain=jcfernandez.cediant.es
 }

[domain_realm]
 .jcfernandez.cediant.es = JCFERNANDEZ.CEDIANT.ES
 jcfernandez.cediant.es = JCFERNANDEZ.CEDIANT.ES

[dbdefaults]
  ldap_kerberos_container_dn =
cn=krbcontainer,dc=jcfernandez,dc=cediant,dc=es

[dbmodules]
  LDAP = {
        db_library = kldap
        ldap_kdc_dn = "cn=Manager,dc=jcfernandez,dc=cediant,dc=es"
        ldap_kadmind_dn = "cn=Manager,dc=jcfernandez,dc=cediant,dc=es"
        ldap_service_password_file = /etc/kerberos/admin.stash
        ldap_servers = ldaps://jcr1.jcfernandez.cediant.es
        ldap_conns_per_server = 5
  }
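
Following up on Vinay's realm question, here is a rough sketch of how the default_realm in krb5.conf can be compared with the realm embedded in the Knox principal. The values are hard-coded from the files posted in the thread; a real check would read the live /etc/krb5.conf and JAAS config instead:

```shell
# default_realm as it appears in the posted krb5.conf
conf=$(mktemp)
cat > "$conf" <<'EOF'
[libdefaults]
 default_realm = JCFERNANDEZ.CEDIANT.ES
 dns_lookup_realm = false
EOF
# principal from the posted krb5JAASLogin.conf
principal='knox/knox@JCFERNANDEZ.CEDIANT.ES'
conf_realm=$(awk -F'= *' '/^[[:space:]]*default_realm/ { gsub(/[[:space:]]/, "", $2); print $2 }' "$conf")
princ_realm=${principal#*@}   # everything after the @
if [ "$conf_realm" = "$princ_realm" ]; then
    echo "realms match: $conf_realm"
else
    echo "realm mismatch: conf=$conf_realm principal=$princ_realm"
fi
rm -f "$conf"
```

Here both sides come out as JCFERNANDEZ.CEDIANT.ES, so the realm name itself looks consistent.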



2014-02-12 17:41 GMT+01:00 Vinay Shukla <vi...@gmail.com>:

> Juan,
>
>
> Thanks for the detailed email. Can you verify that the REALM name matches?
> (KDC REALM property in kerberos conf matches )
>
>
> On Wed, Feb 12, 2014 at 9:24 PM, Juan Carlos <jc...@redoop.org>wrote:
>
>> I have running a secured hdfs cluster, and now I need to set the
>> peripherical security. I have been following the user guide.
>> If I execute
>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl
>> --negotiate -i -k -u : -X GET '
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS'
>> the output looks fine
>>
>> But If I try to do it throw knox....I receive some errors:
>> When executing:
>> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl -i -k
>> -u jcfernandez -X GET '
>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS
>> '
>>  I receive
>> HTTP/1.1 500 Server Error
>> Set-Cookie:
>> JSESSIONID=rxsvzwqdzo1uv5852zeoqjrr;Path=/gateway/hdfscluster;Secure
>> Content-Type: text/html;charset=ISO-8859-1
>> Cache-Control: must-revalidate,no-cache,no-store
>> Content-Length: 21864
>> Server: Jetty(8.1.12.v20130726)
>>
>> <html>
>> <head>
>> <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
>> <title>Error 500 Server Error</title>
>> </head>
>> <body><h2>HTTP ERROR 500</h2>
>> <p>Problem accessing /gateway/hdfscluster/webhdfs/v1/user. Reason:
>> <pre>    Server Error</pre></p><h3>Caused
>> by:</h3><pre>org.apache.shiro.subject.ExecutionException:
>> java.security.PrivilegedActionException: java.io.IOException: Service
>> connectivity error.
>>         at
>> org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:385)
>> .......
>> </pre>
>> <hr /><i><small>Powered by Jetty://</small></i><br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>>  <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>> <br/>
>>
>> </body>
>> </html>
>>
>> And in server side:
>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Received request: GET
>> /webhdfs/v1/user?op=LISTSTATUS
>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Rewrote URL:
>> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS,
>> direction: IN via explicit rule: WEBHDFS/webhdfs/inbound/namenode/file to
>> URL:
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS
>> 14/02/12 15:12:43 DEBUG hadoop.gateway: Dispatch request: GET
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>> 14/02/12 15:12:43 WARN protocol.RequestTargetAuthentication: NEGOTIATE
>> authentication error: No valid credentials provided (Mechanism level: No
>> valid credentials provided (Mechanism level: Attempt to obtain new INITIATE
>> credentials failed! (null)))
>> 14/02/12 15:12:43 ERROR hadoop.gateway: Failed Knox->Hadoop SPNegotiation
>> authentication for URL:
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
>> 14/02/12 15:12:43 WARN hadoop.gateway: Connection exception dispatching
>> request:
>> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUSjava.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>> java.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
>> .....
>>
>>
>> And executing
>> java -jar bin/shell.jar ~/ExampleWebHdfsLs.groovy
>>
>> Caught: org.apache.hadoop.gateway.shell.HadoopException:
>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>> org.apache.hadoop.gateway.shell.HadoopException:
>> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>>         at
>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
>>         at
>> org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown Source)
>>         at ExampleWebHdfsLs.run(ExampleWebHdfsLs.groovy:28)
>>         at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
>>         at
>> org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
>>         at
>> org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
>>         at
>> org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
>>         at
>> org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
>>         at
>> org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
>> Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500
>> Server Error
>>         at
>> org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
>>         at
>> org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>> $Request.access$200(Ls.java:31)
>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>> $Request$1.call(Ls.java:51)
>>         at org.apache.hadoop.gateway.shell.hdfs.Ls
>> $Request$1.call(Ls.java:45)
>>         at
>> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
>>         ... 8 more
>> ......
>>
>> Here are my config files:
>> *conf/gateway.xml*
>> <configuration>
>>
>>     <property>
>>          <name>gateway.port</name>
>>         <value>8443</value>
>>         <description>The HTTP port for the Gateway.</description>
>>     </property>
>>
>>     <property>
>>         <name>gateway.path</name>
>>         <value>gateway</value>
>>         <description>The default context path for the
>> gateway.</description>
>>     </property>
>>
>>     <property>
>>         <name>gateway.gateway.conf.dir</name>
>>         <value>deployments</value>
>>         <description>The directory within GATEWAY_HOME that contains
>> gateway topology files and deployments.</description>
>>     </property>
>>
>>     <property>
>>         <name>gateway.hadoop.kerberos.secured</name>
>>         <value>true</value>
>>         <description>Boolean flag indicating whether the Hadoop cluster
>> protected by Gateway is secured with Kerberos</description>
>>     </property>
>>
>>     <property>
>>         <name>java.security.krb5.conf</name>
>>         <value>/etc/krb5.conf</value>
>>         <description>Absolute path to krb5.conf file</description>
>>     </property>
>>
>>     <property>
>>         <name>java.security.auth.login.config</name>
>>         <value>/etc/knox/config/krb5JAASLogin.conf</value>
>>         <description>Absolute path to JASS login config file</description>
>>     </property>
>>
>>     <property>
>>         <name>sun.security.krb5.debug</name>
>>         <value>false</value>
>>         <description>Boolean flag indicating whether to enable debug
>> messages for krb5 authentication</description>
>>     </property>
>>
>> </configuration>
>>
>>
>> *conf/hdfscluster.xml:*
>> <topology>
>>
>>     <gateway>
>>
>>         <provider>
>>             <role>authentication</role>
>>             <name>ShiroProvider</name>
>>             <enabled>true</enabled>
>>             <param>
>>                 <name>sessionTimeout</name>
>>                 <value>30</value>
>>             </param>
>>             <param>
>>                 <name>main.ldapRealm</name>
>>                 <value>org.apache.shiro.realm.ldap.JndiLdapRealm</value>
>>             </param>
>>             <param>
>>                 <name>main.ldapRealm.userDnTemplate</name>
>>
>> <value>cn={0},ou=People,dc=jcfernandez,dc=cediant,dc=es</value>
>>             </param>
>>             <param>
>>                 <name>main.ldapRealm.contextFactory.url</name>
>>                 <value>ldap://jcr1.jcfernandez.cediant.es:389</value>
>>             </param>
>>             <param>
>>
>> <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
>>                 <value>simple</value>
>>             </param>
>>             <param>
>>                 <name>urls./**</name>
>>                 <value>authcBasic</value>
>>             </param>
>>         </provider>
>>
>> <provider>
>>             <role>identity-assertion</role>
>>             <name>Pseudo</name>
>>             <enabled>true</enabled>
>>             <!--enabled>true</enabled-->
>>             <param>
>>                 <name>group.principal.mapping</name>
>>                 <value>*=hadoop;</value>
>>             </param>
>>         </provider>
>>
>>         <provider>
>>             <role>authorization</role>
>>             <name>AclsAuthz</name>
>>             <enabled>false</enabled>
>>         </provider>
>>     </gateway>
>>
>>     <service>
>>         <role>NAMENODE</role>
>>         <url>hdfs://jcr1.jcfernandez.cediant.es:8020</url>
>>     </service>
>>
>>     <service>
>>         <role>JOBTRACKER</role>
>>         <url>rpc://jcr1.jcfernandez.cediant.es:8050</url>
>>     </service>
>>
>>     <service>
>>         <role>WEBHDFS</role>
>>         <url>http://jcr1.jcfernandez.cediant.es:50070/webhdfs</url>
>>
>>     </service>
>> </topology>
>>
>> */etc/knox/config/krb5JAASLogin.conf*
>> com.sun.security.jgss.initiate {
>>     com.sun.security.auth.module.Krb5LoginModule required
>>     renewTGT=true
>>     doNotPrompt=true
>>     useKeyTab=true
>>     keyTab="/opt/hadoop/security/knox.service.keytab"
>>     principal="knox/knox@JCFERNANDEZ.CEDIANT.ES"
>>     isInitiator=true
>>     storeKey=true
>>     useTicketCache=true
>>     client=true;
>> };
>>
>> Any help?
>> Regards.
>>
>
>

Re: Having problems to start knox

Posted by Vinay Shukla <vi...@gmail.com>.
Juan,


Thanks for the detailed email. Can you verify that the realm names match,
i.e. that the KDC realm in your Kerberos configuration matches the realm in
your Knox service principal?
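
For example (a sketch; the keytab path and principal are the ones from the
JAAS config quoted below, so adjust them to your environment):

```shell
# List the principals (and their realm) stored in the Knox service keytab:
klist -kt /opt/hadoop/security/knox.service.keytab

# Try to obtain a TGT as the Knox service principal; a wrong realm or a
# stale keytab fails here, before Knox is involved at all:
kinit -kt /opt/hadoop/security/knox.service.keytab knox/knox@JCFERNANDEZ.CEDIANT.ES
klist
```

If kinit fails here, the gateway's SPNEGO hop cannot possibly work either.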


On Wed, Feb 12, 2014 at 9:24 PM, Juan Carlos <jc...@redoop.org> wrote:

> I have a secured HDFS cluster running, and now I need to set up the
> perimeter security. I have been following the user guide.
> If I execute
> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl
> --negotiate -i -k -u : -X GET '
> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS'
> the output looks fine
>
> But if I try to do it through Knox... I receive some errors:
> When executing:
> kinit -kt /home/jcfernandez/w/jcfernandez.keytab jcfernandez; curl -i -k
> -u jcfernandez -X GET '
> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS
> '
>  I receive
> HTTP/1.1 500 Server Error
> Set-Cookie:
> JSESSIONID=rxsvzwqdzo1uv5852zeoqjrr;Path=/gateway/hdfscluster;Secure
> Content-Type: text/html;charset=ISO-8859-1
> Cache-Control: must-revalidate,no-cache,no-store
> Content-Length: 21864
> Server: Jetty(8.1.12.v20130726)
>
> <html>
> <head>
> <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
> <title>Error 500 Server Error</title>
> </head>
> <body><h2>HTTP ERROR 500</h2>
> <p>Problem accessing /gateway/hdfscluster/webhdfs/v1/user. Reason:
> <pre>    Server Error</pre></p><h3>Caused
> by:</h3><pre>org.apache.shiro.subject.ExecutionException:
> java.security.PrivilegedActionException: java.io.IOException: Service
> connectivity error.
>         at
> org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:385)
> .......
> </pre>
> <hr /><i><small>Powered by Jetty://</small></i><br/>
>
> </body>
> </html>
>
> And in server side:
> 14/02/12 15:12:43 DEBUG hadoop.gateway: Received request: GET
> /webhdfs/v1/user?op=LISTSTATUS
> 14/02/12 15:12:43 DEBUG hadoop.gateway: Rewrote URL:
> https://jcr1.jcfernandez.cediant.es:8443/gateway/hdfscluster/webhdfs/v1/user?op=LISTSTATUS,
> direction: IN via explicit rule: WEBHDFS/webhdfs/inbound/namenode/file to
> URL:
> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS
> 14/02/12 15:12:43 DEBUG hadoop.gateway: Dispatch request: GET
> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
> 14/02/12 15:12:43 WARN protocol.RequestTargetAuthentication: NEGOTIATE
> authentication error: No valid credentials provided (Mechanism level: No
> valid credentials provided (Mechanism level: Attempt to obtain new INITIATE
> credentials failed! (null)))
> 14/02/12 15:12:43 ERROR hadoop.gateway: Failed Knox->Hadoop SPNegotiation
> authentication for URL:
> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
> 14/02/12 15:12:43 WARN hadoop.gateway: Connection exception dispatching
> request:
> http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?doAs=jcfernandez&op=LISTSTATUS
> java.io.IOException: SPNego authn failed, can not get hadoop.auth cookie
> .....
>
>
> And executing
> java -jar bin/shell.jar ~/ExampleWebHdfsLs.groovy
>
> Caught: org.apache.hadoop.gateway.shell.HadoopException:
> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
> org.apache.hadoop.gateway.shell.HadoopException:
> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>         at
> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
>         at
> org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown Source)
>         at ExampleWebHdfsLs.run(ExampleWebHdfsLs.groovy:28)
>         at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
>         at
> org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
>         at
> org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
>         at org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
>         at
> org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
>         at
> org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
> Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500
> Server Error
>         at
> org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
>         at
> org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
>         at org.apache.hadoop.gateway.shell.hdfs.Ls$Request.access$200(Ls.java:31)
>         at org.apache.hadoop.gateway.shell.hdfs.Ls$Request$1.call(Ls.java:51)
>         at org.apache.hadoop.gateway.shell.hdfs.Ls$Request$1.call(Ls.java:45)
>         at
> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
>         ... 8 more
> ......
>
> Here are my config files:
> *conf/gateway.xml*
> <configuration>
>
>     <property>
>         <name>gateway.port</name>
>         <value>8443</value>
>         <description>The HTTP port for the Gateway.</description>
>     </property>
>
>     <property>
>         <name>gateway.path</name>
>         <value>gateway</value>
>         <description>The default context path for the
> gateway.</description>
>     </property>
>
>     <property>
>         <name>gateway.gateway.conf.dir</name>
>         <value>deployments</value>
>         <description>The directory within GATEWAY_HOME that contains
> gateway topology files and deployments.</description>
>     </property>
>
>     <property>
>         <name>gateway.hadoop.kerberos.secured</name>
>         <value>true</value>
>         <description>Boolean flag indicating whether the Hadoop cluster
> protected by Gateway is secured with Kerberos</description>
>     </property>
>
>     <property>
>         <name>java.security.krb5.conf</name>
>         <value>/etc/krb5.conf</value>
>         <description>Absolute path to krb5.conf file</description>
>     </property>
>
>     <property>
>         <name>java.security.auth.login.config</name>
>         <value>/etc/knox/config/krb5JAASLogin.conf</value>
>         <description>Absolute path to JAAS login config file</description>
>     </property>
>
>     <property>
>         <name>sun.security.krb5.debug</name>
>         <value>false</value>
>         <description>Boolean flag indicating whether to enable debug
> messages for krb5 authentication</description>
>     </property>
>
> </configuration>
>
>
> *conf/hdfscluster.xml:*
> <topology>
>
>     <gateway>
>
>         <provider>
>             <role>authentication</role>
>             <name>ShiroProvider</name>
>             <enabled>true</enabled>
>             <param>
>                 <name>sessionTimeout</name>
>                 <value>30</value>
>             </param>
>             <param>
>                 <name>main.ldapRealm</name>
>                 <value>org.apache.shiro.realm.ldap.JndiLdapRealm</value>
>             </param>
>             <param>
>                 <name>main.ldapRealm.userDnTemplate</name>
>
> <value>cn={0},ou=People,dc=jcfernandez,dc=cediant,dc=es</value>
>             </param>
>             <param>
>                 <name>main.ldapRealm.contextFactory.url</name>
>                 <value>ldap://jcr1.jcfernandez.cediant.es:389</value>
>             </param>
>             <param>
>
> <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
>                 <value>simple</value>
>             </param>
>             <param>
>                 <name>urls./**</name>
>                 <value>authcBasic</value>
>             </param>
>         </provider>
>
> <provider>
>             <role>identity-assertion</role>
>             <name>Pseudo</name>
>             <enabled>true</enabled>
>             <!--enabled>true</enabled-->
>             <param>
>                 <name>group.principal.mapping</name>
>                 <value>*=hadoop;</value>
>             </param>
>         </provider>
>
>         <provider>
>             <role>authorization</role>
>             <name>AclsAuthz</name>
>             <enabled>false</enabled>
>         </provider>
>     </gateway>
>
>     <service>
>         <role>NAMENODE</role>
>         <url>hdfs://jcr1.jcfernandez.cediant.es:8020</url>
>     </service>
>
>     <service>
>         <role>JOBTRACKER</role>
>         <url>rpc://jcr1.jcfernandez.cediant.es:8050</url>
>     </service>
>
>     <service>
>         <role>WEBHDFS</role>
>         <url>http://jcr1.jcfernandez.cediant.es:50070/webhdfs</url>
>
>     </service>
> </topology>
>
> */etc/knox/config/krb5JAASLogin.conf*
> com.sun.security.jgss.initiate {
>     com.sun.security.auth.module.Krb5LoginModule required
>     renewTGT=true
>     doNotPrompt=true
>     useKeyTab=true
>     keyTab="/opt/hadoop/security/knox.service.keytab"
>     principal="knox/knox@JCFERNANDEZ.CEDIANT.ES"
>     isInitiator=true
>     storeKey=true
>     useTicketCache=true
>     client=true;
> };
>
> Any help?
> Regards.
>
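
Judging from the gateway log above, the LDAP login through Shiro succeeds
and it is the Knox->NameNode SPNEGO hop that fails. A sketch for
reproducing that hop by hand from the Knox host (the keytab, principal,
and URL are the ones quoted above; adjust them to your environment):

```shell
# Become the Knox service principal, i.e. the identity the gateway uses
# for its SPNEGO hop to the NameNode:
kdestroy
kinit -kt /opt/hadoop/security/knox.service.keytab knox/knox@JCFERNANDEZ.CEDIANT.ES

# Replay the request the gateway dispatched; if this fails as well, the
# problem is the service principal/keytab, not Knox itself:
curl --negotiate -u : -i \
  'http://jcr1.jcfernandez.cediant.es:50070/webhdfs/v1/user?op=LISTSTATUS&doAs=jcfernandez'
```

Setting sun.security.krb5.debug to true in gateway.xml and restarting the
gateway also makes the JVM log which principal and realm it actually uses
during the SPNEGO attempt.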