Posted to user@knox.apache.org by "Jovanic, Ben" <be...@cgi.com> on 2017/02/21 11:24:38 UTC

Using Kerberos with Knox

Hi,

First time emailing the user mailing list. I work for CGI and am currently working on Knox for one of our projects.

I'm struggling to get Kerberos and Knox set up together on my HDP. Knox works fine on its own with LDAP and Kerberos works with WebHDFS.

The set up:

  *   I'm using HDP-2.4.0.0-169.
  *   I'm using Knox 0.11.0 -- which I've installed at /usr/hdp/0.11.0/knox/conf and run hdp-select set knox-server 0.11.0.
  *   Kerberos has been set up using these instructions (https://docs.hortonworks.com/HDPDocuments/Ambari-2.1.1.0/bk_Ambari_Security_Guide/content/ch_configuring_amb_hdp_for_kerberos.html)
  *   I've validated the Kerberos set up by using the following curl statement after a kinit:

$ curl -i --negotiate -u : "http://sandbox:50070/webhdfs/v1/tmp?op=LISTSTATUS"
HTTP/1.1 401 Authentication required
Cache-Control: must-revalidate,no-cache,no-store
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Content-Type: text/html; charset=iso-8859-1
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; HttpOnly
Content-Length: 1407
Server: Jetty(6.1.26.hwx)

HTTP/1.1 200 OK
Cache-Control: no-cache
Expires: Tue, 21 Feb 2017 10:49:14 GMT
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Expires: Tue, 21 Feb 2017 10:49:14 GMT
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Content-Type: application/json
Set-Cookie: hadoop.auth="u=admin&p=admin/admin@EXAMPLE.COM&t=kerberos&e=1487710154130&s=gt9iw89RJ7XMd0XFA+xm49hUet0="; Path=/; HttpOnly
Transfer-Encoding: chunked
Server: Jetty(6.1.26.hwx)

{"FileStatuses":{"FileStatus":[
{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":16397,"group":"hdfs","length":0,"modificationTime":1456768692570,"owner":"hdfs","pathSuffix":"entity-file-history","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"},
{"accessTime":0,"blockSize":0,"childrenNum":3,"fileId":16434,"group":"hdfs","length":0,"modificationTime":1456785191888,"owner":"ambari-qa","pathSuffix":"hive","permission":"733","replication":0,"storagePolicy":0,"type":"DIRECTORY"}
]}}
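
For reference, the kinit step before that curl call was along these lines (the admin/admin principal is inferred from the hadoop.auth cookie above, so treat this as a sketch rather than a requirement):

$ kinit admin/admin@EXAMPLE.COM
$ klist   # confirm a valid TGT is cached before running the curl above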


What I've tried with Knox


I've gone through these instructions (https://knox.apache.org/books/knox-0-11-0/user-guide.html#Secure+Clusters) to create the knox keytab, update the 2 conf files and update gateway-site.xml.
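
The gateway-site.xml part of that section is not reproduced below, so here is a rough sketch of it (property names as I understand them from the user guide; the file paths are just this install's locations and the debug flag is optional):

<property>
    <name>gateway.hadoop.kerberos.secured</name>
    <value>true</value>
</property>
<property>
    <name>java.security.krb5.conf</name>
    <value>/usr/hdp/current/knox-server/conf/krb5.conf</value>
</property>
<property>
    <name>java.security.auth.login.config</name>
    <value>/usr/hdp/current/knox-server/conf/krb5JAASLogin.conf</value>
</property>
<property>
    <name>sun.security.krb5.debug</name>
    <value>true</value>
</property>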


krb5.conf:

[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 default_realm = EXAMPLE.COM
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true

[realms]
 EXAMPLE.COM = {
  kdc = sandbox.hortonworks.com
  admin_server = sandbox.hortonworks.com
 }


During Kerberos set up I did leave the realm as EXAMPLE.COM.


krb5JAASLogin.conf:
com.sun.security.jgss.initiate {
com.sun.security.auth.module.Krb5LoginModule required
renewTGT=true
doNotPrompt=true
useKeyTab=true
keyTab="/usr/hdp/current/knox-server/conf/knox.service.keytab"
principal="HTTP/sandbox.hortonworks.com@EXAMPLE.COM"
isInitiator=true
storeKey=true
useTicketCache=true
client=true;
};

I have tried different keytabs, like /etc/security/keytabs/spnego.service.keytab and /etc/security/keytabs/knox.service.keytab.
I have tried other principals like knox/knox@EXAMPLE.COM.
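
A quick way to check which principals each keytab actually contains, and whether the configured principal can authenticate with it, is something like this (a sketch using the keytab paths above):

$ klist -kt /etc/security/keytabs/spnego.service.keytab
$ klist -kt /usr/hdp/current/knox-server/conf/knox.service.keytab
$ kinit -kt /usr/hdp/current/knox-server/conf/knox.service.keytab HTTP/sandbox.hortonworks.com@EXAMPLE.COM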

I have copied the templates/hadas.xml file to conf/topologies/sandbox.xml.

sandbox.xml:
<topology>

    <gateway>

        <provider>
            <role>authentication</role>
            <name>HadoopAuth</name>
            <enabled>true</enabled>

            <param>
                <name>config.prefix</name>
                <value>hadoop.auth.config</value>
            </param>
            <param>
                <name>hadoop.auth.config.signature.secret</name>
                <!--<value>78hdkjaka</value>-->
                <value></value>
            </param>
            <param>
                <name>hadoop.auth.config.type</name>
                <value>kerberos</value>
            </param>
            <param>
                <name>hadoop.auth.config.simple.anonymous.allowed</name>
                <value>false</value> <!-- default: false -->
            </param>
            <param>
                <name>hadoop.auth.config.token.validity</name>
                <value>1800</value>
            </param>
            <param>
                <name>hadoop.auth.config.cookie.domain</name>
                <value>sandbox.hortonworks.com</value>
            </param>
            <param>
                <name>hadoop.auth.config.cookie.path</name>
                <!--<value>gateway/hada</value>-->
                <value>/</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.principal</name>
                <value>HTTP/sandbox.hortonworks.com@EXAMPLE.COM</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.keytab</name>
                <value>/usr/hdp/current/knox-server/conf/knox.service.keytab</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.name.rules</name>
                <value>DEFAULT</value>
            </param>

        </provider>

        <provider>
            <role>identity-assertion</role>
            <name>Default</name>
            <enabled>true</enabled>
         <!-- param>
                <name>principal.mapping</name>
                <value>sam=god;</value>
         </param -->

        </provider>

        <provider>
            <role>hostmap</role>
            <name>static</name>
            <enabled>false</enabled>
            <param><name>sandbox.hortonworks.com</name><value>sandbox,sandbox.hortonworks.com</value></param>
        </provider>

    </gateway>

  <service>...</service>
  ...

</topology>

Again, I have tried different principal and keytab values with no success, and every attempt to use kinit with different principals and keytabs results in the following message:

$ curl -ki --negotiate -u : "https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"
HTTP/1.1 401
Date: Tue, 21 Feb 2017 11:18:32 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path=/;Domain=sandbox.hortonworks.com;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-store
Content-Length: 317
Server: Jetty(9.2.15.v20160210)

<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /gateway/sandbox/webhdfs/v1/. Reason:
<pre>    Unauthorized</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>

</body>
</html>
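
A debugging sketch that may be useful here: -v shows whether curl actually sends an Authorization: Negotiate header on the follow-up request, and the hostname below matches the cookie domain configured above rather than localhost:

$ curl -kiv --negotiate -u : "https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"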

Thanks in advance for any help I receive.

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.

RE: Using Kerberos with Knox

Posted by "Jovanic, Ben" <be...@cgi.com>.
Thanks for the help guys.

I started with a fresh HDP server and now everything works as I want it to.

I guess my old HDP server got into a weird state after many many configurations.

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk

________________________________
From: Mohammad Islam [mislam77@yahoo.com]
Sent: 03 March 2017 09:08
To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

>I updated both files as you have suggested with the value * for the groups and hosts properties.
>I then logged into Ambari and restarted HBase and HDFS, however this causes both core-site.xml files to revert to hadoop.proxyuser.knox.groups=users and
> hadoop.proxyuser.knox.hosts=sandbox.hortonworks.com

I'm not familiar with Ambari.
You may need to change those values through the Ambari UI rather than updating the files directly.
Please try with "*" first.
I have a feeling that either sandbox.hortonworks.com or the user is not matching during verification. The "*" change can help to some extent to start with.
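
For example, the wide-open setting being suggested (testing only) would look roughly like this in core-site.xml:

<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>*</value>
</property>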






On Thursday, March 2, 2017 4:49 AM, larry mccay <lm...@apache.org> wrote:


Hi Ben -

It seems like you are really close to getting everything in line.

When you are managing the cluster and Knox instance/s with Ambari, you have to make such changes in Ambari and then restart.
Saving the changes in Ambari saves them to the database, then restarting pushes out the new config and restarts the components.
If you change the files locally and restart with Ambari then the local changes will be overwritten with what is in the Ambari database.

In addition, it seems that you are trying to impersonate the knox user, which has a couple of implications.
1. the knox user will not likely ever be in the "users" group - therefore the check for whether the impersonated user is in an approved group to impersonate will fail based on:
<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
2. you would have to log in as knox to LDAP, or via a kinit as you were doing previously, and this is not recommended

You should use the credentials of an end user that happens to be in the "users" group (or whatever is appropriate based on your deployment), not a service account. We generally use one of the users in our DEMO LDAP server such as guest:guest-password for testing.

You will also need to ensure that the end user has an OS account on your cluster machines and is a member of the appropriate group at the OS level.
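
As a sketch of how to check that last point (assuming the guest user from the demo LDAP is the end user being tested):

$ id guest           # does the OS account exist on the cluster node?
$ hdfs groups guest  # which groups does Hadoop resolve for that user?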

HTH,

--larry


On Thu, Mar 2, 2017 at 5:04 AM, Jovanic, Ben <be...@cgi.com>> wrote:
Hi Mohammad,

The values of the properties were:
<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>sandbox.hortonworks.com</value>
</property>

For both of the files:
/etc/hbase/conf/core-site.xml
/etc/hadoop/conf/core-site.xml

FYI
# hostname -f
sandbox.hortonworks.com



I updated both files as you have suggested with the value * for the groups and hosts properties.
I then logged into Ambari and restarted HBase and HDFS, however this causes both core-site.xml files to revert to hadoop.proxyuser.knox.groups=users and hadoop.proxyuser.knox.hosts=sandbox.hortonworks.com



As an aside, I have realised I do not need to use HadoopAuth to secure Knox with Kerberos; LDAP is what I want to use to secure Knox.
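
For reference, the kind of ShiroProvider block I mean is roughly the one from the Knox user guide's sandbox topology (a sketch; the LDAP URL and DN template below assume the demo LDAP server that ships with Knox):

<provider>
    <role>authentication</role>
    <name>ShiroProvider</name>
    <enabled>true</enabled>
    <param>
        <name>main.ldapRealm</name>
        <value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
    </param>
    <param>
        <name>main.ldapRealm.userDnTemplate</name>
        <value>uid={0},ou=people,dc=hadoop,dc=apache,dc=org</value>
    </param>
    <param>
        <name>main.ldapRealm.contextFactory.url</name>
        <value>ldap://localhost:33389</value>
    </param>
    <param>
        <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
        <value>simple</value>
    </param>
    <param>
        <name>urls./**</name>
        <value>authcBasic</value>
    </param>
</provider>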
I'm still hit with the same problem when Knox tries to access webhdfs:

# curl -kiu guest:guest-password https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS
HTTP/1.1 401 Unauthorized
Date: Thu, 02 Mar 2017 10:00:55 GMT
Set-Cookie: JSESSIONID=1w45jwocxv7fh6gc9k12gl6jy; Path=/gateway/sandbox;Secure;HttpOnly
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Set-Cookie: rememberMe=deleteMe; Path=/gateway/sandbox; Max-Age=0; Expires=Wed, 01-Mar-2017 10:00:55 GMT
Cache-Control: must-revalidate,no-cache,no-store
Date: Thu, 02 Mar 2017 10:00:55 GMT
Pragma: no-cache
Date: Thu, 02 Mar 2017 10:00:55 GMT
Pragma: no-cache
Content-Type: text/html; charset=ISO-8859-1
Server: Jetty(6.1.26.hwx)
Content-Length: 1403

<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
<title>Error 401 Authentication required</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /webhdfs/v1/. Reason:
<pre>    Authentication required</pre></p><hr/><i><small>Powered by Jetty://</small></i><br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>
<br/>

</body>
</html>



Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk

________________________________
From: Mohammad Islam [mislam77@yahoo.com]
Sent: 02 March 2017 05:15

To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

Hi Ben,
What is the value you put in hadoop.proxyuser.knox.hosts?
Can you please set "*" for both the hadoop.proxyuser.knox.hosts and hadoop.proxyuser.knox.groups properties in core-site.xml? You may need to restart the services. BTW this is ONLY for testing purposes.

Regards,
Mohammad



On Wednesday, March 1, 2017 1:02 AM, "Jovanic, Ben" <be...@cgi.com>> wrote:


Hi Sandeep,

If you meant ones of these files:

/etc/hbase/conf/core-site.xml
/etc/hadoop/conf/core-site.xml

Then they both already have the properties you mentioned.

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk

________________________________
From: Sandeep More [moresandeep@gmail.com]
Sent: 27 February 2017 15:40
To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

Hello Ben,

I misspoke a bit in my previous reply: you should *not* add the user 'knox' to the 'users' group. Instead, just add 'users' to the 'hadoop.proxyuser.knox.groups' property and add the FQDN of Knox to the 'hadoop.proxyuser.knox.hosts' property in core-site.xml, and you should be good.

i.e. the following should be sufficient.


<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>FQDN_OF_KNOX_HOST</value>
</property>

See this for more info:
http://knox.apache.org/books/knox-0-11-0/user-guide.html#Related+Cluster+Configuration
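
A related sketch (not from the guide above): if the core-site.xml files are edited by hand rather than through Ambari, the proxyuser settings can usually be reloaded without a full service restart:

hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration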

Best,
Sandeep


On Mon, Feb 27, 2017 at 9:13 AM, Sandeep More <mo...@gmail.com>> wrote:
Good to know about the progress, Ben!
About the error "knox is not allowed to impersonate knox", this might be because the user 'knox' does not have sufficient group privileges to perform the operation.

If you are using Ambari, you can check the
'hadoop.proxyuser.knox.groups' parameter; in my case it is 'users'. Then add the user 'knox' to the 'users' group (or whichever group you have in hadoop.proxyuser.knox.groups).

Let me know how it goes !

Best,
Sandeep

On Mon, Feb 27, 2017 at 4:20 AM, Jovanic, Ben <be...@cgi.com>> wrote:
Hi Sandeep,

Your suggestion of using sandbox.hortonworks.com as the domain has gotten me a step farther. Thank you!

Now I'm getting an authorisation error which I'll dig into (unless anyone can offer a solution :))


# curl -ki --negotiate -u : https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS
HTTP/1.1 401
Date: Mon, 27 Feb 2017 09:16:12 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path= /;Domain=sandbox.hortonworks.c om<http://sandbox.hortonworks.com/>;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-store
Content-Length: 320
Server: Jetty(9.2.15.v20160210)

HTTP/1.1 403 Forbidden
Date: Mon, 27 Feb 2017 09:16:12 GMT
Set-Cookie: hadoop.auth=u=knox&p=knox/knox@EXAMPLE.COM&t=kerberos&e=1488188772724&s=5A//jMYbfdVTp1ggiNE3jsLZ1bE=;Path=/;Domain=sandbox.hortonworks.com
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Content-Type: application/json; charset=UTF-8
Server: Jetty(6.1.26.hwx)
Content-Length: 259

{"RemoteException":{"exception ":"SecurityException"," javaClassName":"java.lang.Secu rityException","message":"Fail ed to obtain user group information: org.apache.hadoop.security.aut horize.AuthorizationException: User: knox is not allowed to impersonate knox"}}



Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk

________________________________
From: Sandeep More [more@apache.org]
Sent: 24 February 2017 18:41
To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

Hello Ben,

Just following up on this issue: I did try testing the HadoopAuth provider with Knox and it seems to work in my case. I have tried to document the process in the blog post [1].

I noticed that you are using the cookie domain 'sandbox.hortonworks.com', but in your curl request you are using 'localhost'. IMO this would fail; can you try using 'sandbox.hortonworks.com' and see if that helps?

i.e.
curl -ki --negotiate -u : "https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"


[1] https://cwiki.apache.org/confluence/display/KNOX/2017/02/24/Hadoop+Auth+%28SPNEGO+and+delegation+token+based+authentication%29+with+Apache+Knox

Best,
Sandeep

On Tue, Feb 21, 2017 at 3:23 PM, Sandeep More <mo...@gmail.com>> wrote:
Hello Ben,

Welcome to the list !

At first glance your Knox configs look OK to me; it could be related to a setup issue.

In the setup procedure mentioned, you followed
1. the Ambari instructions to set up Kerberos (item #3), and
2. the Knox instructions for setting up Kerberos

Ambari setup already takes care of the Knox Kerberos setup, so you just have to go with the Ambari instructions (assuming you set up Knox from Ambari initially).

I am assuming you installed Apache Knox from a zip or a tgz file (since 0.11 does not ship with HDP-2.4, IIRC). When you ran the 'hdp-select' command, did you take into account the directory structure for zip installs (since the directory structure from the RPMs is different from the zip/tgz ones)?


Best,
sandeep


Re: Using Kerberos with Knox

Posted by Mohammad Islam <mi...@yahoo.com>.
>I updated both files as you have suggested with the value * for the groups and hosts properties.
>I then logged into Ambari and restarted HBase and HDFS, however this causes both core-site.xml files to revert to hadoop.proxyuser.knox.groups=users and hadoop.proxyuser.knox.hosts=sandbox.hortonworks.com

I'm not familiar with Ambari. You may need to change those values through the Ambari UI rather than updating the files directly. Please try with "*" first. I have a feeling that either sandbox.hortonworks.com or the user is not matching during verification. The "*" change can help to some extent to start with.



 


Re: Using Kerberos with Knox

Posted by larry mccay <lm...@apache.org>.
Hi Ben -

It seems like you are really close to getting everything in line.

When you are managing the cluster and Knox instance/s with Ambari, you have
to make such changes in Ambari and then restart.
Saving the changes in Ambari saves them to the database then restarting
pushing out the new config and restarts the components.
If you change the files locally and restart with Ambari then the local
changes will be overwritten with what is in the Ambari database.

In addition, it seems that you are trying to impersonate the knox user
which has a couple of implications.
1. the knox user will not likely ever be in the "users" group - therefore
the check for whether the impersonated user is in an approved group to
impersonate will fail based on:
<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
2. you would have to log in as knox to LDAP, or via a kinit as you were doing previously,
and this is not recommended

You should use the credentials of an end user that happens to be in the "users"
group (or whatever is appropriate based on your deployment), not a service
account. We generally use one of the users in our DEMO LDAP server such
as guest:guest-password for testing.

You will also need to ensure that the end user has an OS account on your
cluster machines and is a member of the appropriate group at the OS level.

HTH,

--larry


On Thu, Mar 2, 2017 at 5:04 AM, Jovanic, Ben <be...@cgi.com> wrote:

> Hi Mohammad,
>
> The values of the properties was:
> <property>
>     <name>hadoop.proxyuser.knox.groups</name>
>     <value>users</value>
> </property>
> <property>
>     <name>hadoop.proxyuser.knox.hosts</name>
>     <value>sandbox.hortonworks.com</value>
> </property>
>
> For both of the files:
> /etc/hbase/conf/core-site.xml
> /etc/hadoop/conf/core-site.xml
>
> FYI
> # hostname -f
> sandbox.hortonworks.com
>
>
>
> I updated both files as you have suggested with the value * for the
> groups and hosts properties.
> I then logged into Ambari and restarted HBase and HDFS, however this
> causes both core-site.xml files to revert to hadoop.proxyuser.knox.groups=
> users and hadoop.proxyuser.knox.hosts=sandbox.hortonworks.com
>
>
>
> As a side I have realised I do not need to use HadoopAuth on Knox to
> secure it with Kerberos. LDAP is what I want to secure Knox.
> I'm still hit with the same problem when Knox tries to access webhdfs:
>
> # curl -kiu guest:guest-password https://sandbox.hortonworks.
> com:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS
> HTTP/1.1 401 Unauthorized
> Date: Thu, 02 Mar 2017 10:00:55 GMT
> Set-Cookie: JSESSIONID=1w45jwocxv7fh6gc9k12gl6jy;
> Path=/gateway/sandbox;Secure;HttpOnly
> Expires: Thu, 01 Jan 1970 00:00:00 GMT
> Set-Cookie: rememberMe=deleteMe; Path=/gateway/sandbox; Max-Age=0;
> Expires=Wed, 01-Mar-2017 10:00:55 GMT
> Cache-Control: must-revalidate,no-cache,no-store
> Date: Thu, 02 Mar 2017 10:00:55 GMT
> Pragma: no-cache
> Date: Thu, 02 Mar 2017 10:00:55 GMT
> Pragma: no-cache
> Content-Type: text/html; charset=ISO-8859-1
> Server: Jetty(6.1.26.hwx)
> Content-Length: 1403
>
> <html>
> <head>
> <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
> <title>Error 401 Authentication required</title>
> </head>
> <body><h2>HTTP ERROR 401</h2>
> <p>Problem accessing /webhdfs/v1/. Reason:
> <pre>    Authentication required</pre></p><hr/><i><small>Powered by
> Jetty://</small></i><br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
> <br/>
>
> </body>
> </html>
>
>
>
> Kind regards,
> Ben
>
> *Ben Jovanic* | Software Engineer
> Energy, Utilities & Telco | CGI
> 2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
> M: +44 7917 505 645 <+44%207917%20505645>
> ben.jovanic@cgi.com | cgi-group.co.uk <http://www.cgi-group.co.uk>
>
> CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to
> CGI Group Inc. and its affiliates may be contained in this message. If you
> are not a recipient indicated or intended in this message (or responsible
> for delivery of this message to such person), or you think for any reason
> that this message may have been addressed to you in error, you may not use
> or copy or deliver this message to anyone else. In such case, you should
> destroy this message and are asked to notify the sender by reply e-mail.
> ------------------------------
> *From:* Mohammad Islam [mislam77@yahoo.com]
> *Sent:* 02 March 2017 05:15
>
> *To:* user@knox.apache.org
> *Subject:* Re: Using Kerberos with Knox
>
> Hi Ben,
> What is the value you put in hadoop.proxyuser.knox.hosts ?
> Can you please set  "*" for  both hadoop.proxyuser.knox.hosts and hadoop.proxyuser.knox.groups
> properties in core-site.xml? You may need to restart the services. BTW This
> is ONLY for testing purpose.
>
> Regards,
> Mohammad
>
>
>
> On Wednesday, March 1, 2017 1:02 AM, "Jovanic, Ben" <be...@cgi.com>
> wrote:
>
>
> Hi Sandeep,
>
> If you meant ones of these files:
>
> /etc/hbase/conf/core-site.xml
> /etc/hadoop/conf/core-site.xml
>
> Then they both are already had the properties you mentioned.
>
> Kind regards,
> Ben
>
> *Ben Jovanic* | Software Engineer
> Energy, Utilities & Telco | CGI
> 2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
> M: +44 7917 505 645 <+44%207917%20505645>
> ben.jovanic@cgi.com | cgi-group.co.uk <http://www.cgi-group.co.uk/>
>
> CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to
> CGI Group Inc. and its affiliates may be contained in this message. If you
> are not a recipient indicated or intended in this message (or responsible
> for delivery of this message to such person), or you think for any reason
> that this message may have been addressed to you in error, you may not use
> or copy or deliver this message to anyone else. In such case, you should
> destroy this message and are asked to notify the sender by reply e-mail.
> ------------------------------
> *From:* Sandeep More [moresandeep@gmail.com]
> *Sent:* 27 February 2017 15:40
> *To:* user@knox.apache.org
> *Subject:* Re: Using Kerberos with Knox
>
> Hello Ben,
>
> I misspoke a bit on my previous reply, you should *not* add user 'knox' to
> the 'users' group, instead  just add 'users' to  'hadoop.proxyuser.knox.groups'
> property and add the FQDN of Knox to 'hadoop.proxyuser.knox.hosts'
> property in core-site.xml and you should be good.
>
> i.e. the following should be sufficient.
>
> <property>
>     <name>hadoop.proxyuser.knox.groups</name>
>     <value>users</value>
> </property>
> <property>
>     <name>hadoop.proxyuser.knox.hosts</name>
>     <value>FQDN_OF_KNOX_HOST</value>
> </property>
>
> See this for more info:
> http://knox.apache.org/books/knox-0-11-0/user-guide.html#
> Related+Cluster+Configuration
>
> Best,
> Sandeep
>
>
> On Mon, Feb 27, 2017 at 9:13 AM, Sandeep More <mo...@gmail.com>
> wrote:
>
> Good to know about the progress, Ben!
> About the error "knox is not allowed to impersonate knox", this might be
> because the user 'knox' does not have sufficient group privileges to
> perform the operation.
>
> If you are using Ambari, you can check the
> 'hadoop.proxyuser.knox.groups' parameter, in my case it is 'users',  then
> add the user 'knox' to 'users' group (or which ever group you have in
> hadoop.proxyuser.knox.groups).
>
> Let me know how it goes !
>
> Best,
> Sandeep
>
> On Mon, Feb 27, 2017 at 4:20 AM, Jovanic, Ben <be...@cgi.com> wrote:
>
> Hi Sandeep,
>
> Your suggestion of using sandbox.hortonworks.com as the domain has gotten
> me a step farther. Thank you!
>
> Now I'm getting an authorisation error which I'll dig into (unless anyone
> can offer a solution :))
>
>
> # curl -ki --negotiate -u : https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS
> HTTP/1.1 401
> Date: Mon, 27 Feb 2017 09:16:12 GMT
> WWW-Authenticate: Negotiate
> Set-Cookie: hadoop.auth="";Version=1;Path= /;Domain=sandbox.hortonworks.c
> om <http://sandbox.hortonworks.com/>;Expires=Thu, 01-Jan-1970 00:00:00
> GMT;Max-Age=0
> Content-Type: text/html; charset=ISO-8859-1
> Cache-Control: must-revalidate,no-cache,no-st ore
> Content-Length: 320
> Server: Jetty(9.2.15.v20160210)
>
> HTTP/1.1 403 Forbidden
> Date: Mon, 27 Feb 2017 09:16:12 GMT
> Set-Cookie: hadoop.auth=u=knox&p=knox/knox@EXAMPLE.COM&t=kerberos&e=1488188772724&s=5A//jMYbfdVTp1ggiNE3jsLZ1bE=;Path=/;Domain=sandbox.hortonworks.com
> Expires: Thu, 01 Jan 1970 00:00:00 GMT
> Cache-Control: no-cache
> Expires: Mon, 27 Feb 2017 09:16:12 GMT
> Date: Mon, 27 Feb 2017 09:16:12 GMT
> Pragma: no-cache
> Expires: Mon, 27 Feb 2017 09:16:12 GMT
> Date: Mon, 27 Feb 2017 09:16:12 GMT
> Pragma: no-cache
> Content-Type: application/json; charset=UTF-8
> Server: Jetty(6.1.26.hwx)
> Content-Length: 259
>
> {"RemoteException":{"exception ":"SecurityException","
> javaClassName":"java.lang.Secu rityException","message":"Fail ed to obtain
> user group information: org.apache.hadoop.security.aut
> horize.AuthorizationException: User: knox is not allowed to impersonate
> knox"}}
>
>
>
> Kind regards,
> Ben
>
> *Ben Jovanic* | Software Engineer
> Energy, Utilities & Telco | CGI
> 2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
> M: +44 7917 505 645
> ben.jovanic@cgi.com | cgi-group.co.uk <http://www.cgi-group.co.uk/>
>
> CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to
> CGI Group Inc. and its affiliates may be contained in this message. If you
> are not a recipient indicated or intended in this message (or responsible
> for delivery of this message to such person), or you think for any reason
> that this message may have been addressed to you in error, you may not use
> or copy or deliver this message to anyone else. In such case, you should
> destroy this message and are asked to notify the sender by reply e-mail.
> ------------------------------
> *From:* Sandeep More [more@apache.org]
> *Sent:* 24 February 2017 18:41
> *To:* user@knox.apache.org
> *Subject:* Re: Using Kerberos with Knox
>
> Hello Ben,
>
> Just following up on this issue, I did try testing HadoopAuth provider
> with Knox and it seems to work in my case, I tried to document the process
> in the blog post [1].
>
> I noticed that you are using cookie domain 'sandbox.hortonworks.com' but
> in your curl request you are using 'localhost', IMO this would fail, can
> you try using 'sandbox.hortonworks.com' and see if that helps ?
>
> i.e.
> curl -ki --negotiate -u : "https://
> <https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS>sandbox.hortonworks.c
> om <http://sandbox.hortonworks.com/>:8443/gateway/sandbox/webhdf
> s/v1/?op=LISTSTATUS
> <https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS>"
>
>
> [1] https://cwiki.apache.org/confluence/display/KNOX/2017/02/24/Hadoop+Auth+%28SPNEGO+and+delegation+token+based+authentication%29+with+Apache+Knox
>
> Best,
> Sandeep
>
> On Tue, Feb 21, 2017 at 3:23 PM, Sandeep More <mo...@gmail.com>
> wrote:
>
> Hello Ben,
>
> Welcome to the list !
>
> At first glance your knox configs look ok to me, it could be related to a
> setup issue.
>
> In the setup procedure mentioned, you followed
> 1. Ambari instruction to setup Kerberos (item #3) and
> 2. Knox instructions for setting up Kerberos
>
> Ambari setup already takes care of Knox Kerberos setup, so you just have
> to go with Ambari instructions (assuming you setup Knox from Ambari
> initially)
>
> I am assuming you installed Apache Knox from a Zip or a tgz file (since
> 0.11 does not ship with HDP-2.4, IIRC), when you ran 'hdp-select' command
> did you take into account the directory structure for zip installs (since
> the directory structure from the rpms are different than the zip/tgz ones).
>
>
> Best,
> sandeep
>
> On Tue, Feb 21, 2017 at 6:24 AM, Jovanic, Ben <be...@cgi.com> wrote:
>
> Hi,
>
> First time emailing the user mailing list. I work for CGI and am currently
> working on Knox for one of our projects.
>
> I'm struggling to get Kerberos and Knox set up together on my HDP. Knox
> works fine on its own with LDAP and Kerberos works with WebHDFS.
>
> *The set up:*
>
>    - I'm using HDP-2.4.0.0-169.
>    - I'm using Knox 0.11.0 -- which I've installed at
>    /usr/hdp/0.11.0/knox/conf and run hdp-select set knox-server 0.11.0.
>    - Kerberos has been set up using these instructions (https://docs.hortonworks.com/HDPDocuments/Ambari-2.1.1.0/bk_Ambari_Security_Guide/content/ch_configuring_amb_hdp_for_kerberos.html)
>    - I've validated the Kerberos set up by using the following curl
>    statement after a kinit:
>
> $ curl -i --negotiate -u : "http://sandbox:50070/webhdfs/v1/tmp?op=LISTSTATUS"
> HTTP/1.1 401 Authentication required
> Cache-Control: must-revalidate,no-cache,no-store
> Date: Tue, 21 Feb 2017 10:49:14 GMT
> Pragma: no-cache
> Date: Tue, 21 Feb 2017 10:49:14 GMT
> Pragma: no-cache
> Content-Type: text/html; charset=iso-8859-1
> WWW-Authenticate: Negotiate
> Set-Cookie: hadoop.auth=; Path=/; HttpOnly
> Content-Length: 1407
> Server: Jetty(6.1.26.hwx)
>
> HTTP/1.1 200 OK
> Cache-Control: no-cache
> Expires: Tue, 21 Feb 2017 10:49:14 GMT
> Date: Tue, 21 Feb 2017 10:49:14 GMT
> Pragma: no-cache
> Expires: Tue, 21 Feb 2017 10:49:14 GMT
> Date: Tue, 21 Feb 2017 10:49:14 GMT
> Pragma: no-cache
> Content-Type: application/json
> Set-Cookie: hadoop.auth="u=admin&p=admin/a dmin@EXAMPLE.COM
> <ad...@EXAMPLE.COM>&t=kerberos&e= 1487710154130&s=gt9iw89RJ7XMd0
> XFA+xm49hUet0="; Path=/; HttpOnly
> Transfer-Encoding: chunked
> Server: Jetty(6.1.26.hwx)
>
> {"FileStatuses":{"FileStatus": [
> {"accessTime":0,"blockSize":0, "childrenNum":1,"fileId":16397
> ,"group":"hdfs","length":0,"mo dificationTime":1456768692570,
> "owner":"hdfs","pathSuffix":"e ntity-file-history","permissio
> n":"755","replication":0,"stor agePolicy":0,"type":"DIRECTORY "},
> {"accessTime":0,"blockSize":0, "childrenNum":3,"fileId":16434
> ,"group":"hdfs","length":0,"mo dificationTime":1456785191888,
> "owner":"ambari-qa","pathSuffi x":"hive","permission":"733","
> replication":0,"storagePolicy" :0,"type":"DIRECTORY"}
> ]}}
>
> *What I've tried with Knox*
>
> I've gone through these instructions (https://knox.apache.org/books/knox-0-11-0/user-guide.html#Secure+Clusters)
> to create the knox keytab, update the 2 conf files and update
> gateway-site.xml.
>
> krb5.conf:
> [logging]
>  default = FILE:/var/log/krb5libs.log
>  kdc = FILE:/var/log/krb5kdc.log
>  admin_server = FILE:/var/log/kadmind.log
>
> [libdefaults]
>  default_realm = EXAMPLE.COM
>  dns_lookup_realm = false
>  dns_lookup_kdc = false
>  ticket_lifetime = 24h
>  renew_lifetime = 7d
>  forwardable = true
>
> [realms]
>  EXAMPLE.COM = {
>   kdc = sandbox.hortonworks.com
>   admin_server = sandbox.hortonworks.com
>  }
>
> During Kerberos set up I did leave the realm as EXAMPLE.COM.
>
> krb5JAASLogin.conf:
> com.sun.security.jgss.initiate {
> com.sun.security.auth.module.Krb5LoginModule required
> renewTGT=true
> doNotPrompt=true
> useKeyTab=true
> keyTab="/usr/hdp/current/knox- server/conf/knox.service.keyta b"
> principal="HTTP/sandbox.horton works.com@EXAMPLE.COM
> <sa...@EXAMPLE.COM>"
> isInitiator=true
> storeKey=true
> useTicketCache=true
> client=true;
> };
>
> I have tried different keytabs, like /etc/security/keytabs/spnego.service.keytab and /etc/security/keytabs/knox.service.keytab.
> I have tried other principals like knox/knox@EXAMPLE.COM.
>
> I have copied the templates/hadas.xml file to conf/topologies/sandbox.xml.
>
> sandbox.xml:
> <topology>
>
>     <gateway>
>
>         <provider>
>             <role>authentication</role>
>             <name>HadoopAuth</name>
>             <enabled>true</enabled>
>
>             <param>
>                 <name>config.prefix</name>
>                 <value>hadoop.auth.config</value>
>             </param>
>             <param>
>                 <name>hadoop.auth.config.signature.secret</name>
>                 <!--<value>78hdkjaka</value>-->
>                 <value></value>
>             </param>
>             <param>
>                 <name>hadoop.auth.config.type</name>
>                 <value>kerberos</value>
>             </param>
>             <param>
>                 <name>hadoop.auth.config.simple.anonymous.allowed</name>
>                 <value>false</value> <!-- default: false -->
>             </param>
>             <param>
>                 <name>hadoop.auth.config.token.validity</name>
>                 <value>1800</value>
>             </param>
>             <param>
>                 <name>hadoop.auth.config.cookie.domain</name>
>                 <value>sandbox.hortonworks.com</value>
>             </param>
>             <param>
>                 <name>hadoop.auth.config.cookie.path</name>
>                 <!--<value>gateway/hada</value>-->
>                 <value>/</value>
>             </param>
>             <param>
>                 <name>hadoop.auth.config.kerberos.principal</name>
>                 <value>HTTP/sandbox.hortonworks.com@EXAMPLE.COM</value>
>             </param>
>             <param>
>                 <name>hadoop.auth.config.kerberos.keytab</name>
>                 <value>/usr/hdp/current/knox-server/conf/knox.service.keytab</value>
>             </param>
>             <param>
>                 <name>hadoop.auth.config.kerberos.name.rules</name>
>                 <value>DEFAULT</value>
>             </param>
>
>         </provider>
>
>         <provider>
>             <role>identity-assertion</role>
>             <name>Default</name>
>             <enabled>true</enabled>
>          <!-- param>
>                 <name>principal.mapping</name>
>                 <value>sam=god;</value>
>          </param -->
>
>         </provider>
>
>         <provider>
>             <role>hostmap</role>
>             <name>static</name>
>             <enabled>false</enabled>
>             <param><name>sandbox.hortonworks.com</name><value>sandbox,sandbox.hortonworks.com</value></param>
>         </provider>
>
>     </gateway>
>
>   <service>...</service>
>   ...
>
> </topology>
>
> Again, tried different principle and keytab values with no success. And
> every attempt to use kinit with different principles and keytabs results in
> the following message:
>
> $ curl -ki --negotiate -u : "https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"
> HTTP/1.1 401
> Date: Tue, 21 Feb 2017 11:18:32 GMT
> WWW-Authenticate: Negotiate
> Set-Cookie: hadoop.auth="";Version=1;Path= /;Domain=sandbox.hortonworks.c
> om <http://sandbox.hortonworks.com/>;Expires=Thu, 01-Jan-1970 00:00:00
> GMT;Max-Age=0
> Content-Type: text/html; charset=ISO-8859-1
> Cache-Control: must-revalidate,no-cache,no-st ore
> Content-Length: 317
> Server: Jetty(9.2.15.v20160210)
>
> <html>
> <head>
> <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
> <title>Error 401 Unauthorized</title>
> </head>
> <body><h2>HTTP ERROR 401</h2>
> <p>Problem accessing /gateway/sandbox/webhdfs/v1/. Reason:
> <pre>    Unauthorized</pre></p><hr><i><small>Powered by
> Jetty://</small></i><hr/>
>
> </body>
> </html>
>
> Thanks in advance for any help I receive.
>
> Kind regards,
> Ben
>
> *Ben Jovanic* | Software Engineer
> Energy, Utilities & Telco | CGI
> 2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
> M: +44 7917 505 645
> ben.jovanic@cgi.com | cgi-group.co.uk <http://www.cgi-group.co.uk/>
>
> CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to
> CGI Group Inc. and its affiliates may be contained in this message. If you
> are not a recipient indicated or intended in this message (or responsible
> for delivery of this message to such person), or you think for any reason
> that this message may have been addressed to you in error, you may not use
> or copy or deliver this message to anyone else. In such case, you should
> destroy this message and are asked to notify the sender by reply e-mail.
>
>
>
>
>
>
>
>

RE: Using Kerberos with Knox

Posted by "Jovanic, Ben" <be...@cgi.com>.
Hi Mohammad,

The values of the properties were:
<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>sandbox.hortonworks.com</value>
</property>

For both of the files:
/etc/hbase/conf/core-site.xml
/etc/hadoop/conf/core-site.xml

FYI
# hostname -f
sandbox.hortonworks.com



I updated both files as you have suggested with the value * for the groups and hosts properties.
I then logged into Ambari and restarted HBase and HDFS; however, this causes both core-site.xml files to revert to hadoop.proxyuser.knox.groups=users and hadoop.proxyuser.knox.hosts=sandbox.hortonworks.com
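Since Ambari owns core-site.xml and rewrites it on every restart, the wildcard values presumably have to be set through Ambari itself rather than in the files on disk. A rough sketch of doing that from the command line, assuming the configs.sh helper that ships with Ambari 2.x, the default Sandbox cluster name and admin/admin credentials (all of these are assumptions -- adjust for your environment):

# untested sketch -- Ambari host, cluster name and credentials are assumptions
cd /var/lib/ambari-server/resources/scripts
./configs.sh -u admin -p admin set sandbox.hortonworks.com Sandbox core-site hadoop.proxyuser.knox.groups "*"
./configs.sh -u admin -p admin set sandbox.hortonworks.com Sandbox core-site hadoop.proxyuser.knox.hosts "*"
# restart HDFS (and HBase) from Ambari afterwards so the new values are pushed out to /etc/hadoop/conf

Editing the same two properties under the HDFS configs (Custom core-site) in the Ambari UI should have the same effect and will survive service restarts.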



As an aside, I have realised I do not need to use HadoopAuth on Knox to secure it with Kerberos; LDAP is what I want to use to secure Knox.
I'm still hitting the same problem when Knox tries to access WebHDFS:

# curl -kiu guest:guest-password https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS
HTTP/1.1 401 Unauthorized
Date: Thu, 02 Mar 2017 10:00:55 GMT
Set-Cookie: JSESSIONID=1w45jwocxv7fh6gc9k12gl6jy;Path=/gateway/sandbox;Secure;HttpOnly
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Set-Cookie: rememberMe=deleteMe; Path=/gateway/sandbox; Max-Age=0; Expires=Wed, 01-Mar-2017 10:00:55 GMT
Cache-Control: must-revalidate,no-cache,no-store
Date: Thu, 02 Mar 2017 10:00:55 GMT
Pragma: no-cache
Date: Thu, 02 Mar 2017 10:00:55 GMT
Pragma: no-cache
Content-Type: text/html; charset=ISO-8859-1
Server: Jetty(6.1.26.hwx)
Content-Length: 1403

<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
<title>Error 401 Authentication required</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /webhdfs/v1/. Reason:
<pre>    Authentication required</pre></p><hr/><i><small>Powered by Jetty://</small></i><br/>

</body>
</html>
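For what it's worth, the Server: Jetty(6.1.26.hwx) header and the /webhdfs/v1/ path in the error above suggest this 401 is coming back from WebHDFS itself rather than from Knox, so the Secure Clusters settings in gateway-site.xml (and the Knox keytab) still matter even when end users authenticate with LDAP. For reference, a minimal LDAP authentication provider for the topology, sketched from the Knox 0.11 demo-LDAP defaults -- the realm class, LDAP URL, port and userDnTemplate below are the demo values and would need to match your own directory:

<provider>
    <role>authentication</role>
    <name>ShiroProvider</name>
    <enabled>true</enabled>
    <param>
        <name>sessionTimeout</name>
        <value>30</value>
    </param>
    <param>
        <name>main.ldapRealm</name>
        <value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
    </param>
    <param>
        <name>main.ldapRealm.userDnTemplate</name>
        <value>uid={0},ou=people,dc=hadoop,dc=apache,dc=org</value>
    </param>
    <param>
        <name>main.ldapRealm.contextFactory.url</name>
        <value>ldap://localhost:33389</value>
    </param>
    <param>
        <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
        <value>simple</value>
    </param>
    <param>
        <name>urls./**</name>
        <value>authcBasic</value>
    </param>
</provider>

With a provider like that in place, the curl above would authenticate guest/guest-password against the LDAP server, and any remaining 401/403 responses would likely point at the Knox-to-cluster Kerberos hop rather than at end-user authentication.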



Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com<ma...@cgi.com> | cgi-group.co.uk<http://www.cgi-group.co.uk>

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.
________________________________
From: Mohammad Islam [mislam77@yahoo.com]
Sent: 02 March 2017 05:15
To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

Hi Ben,
What is the value you put in hadoop.proxyuser.knox.hosts ?
Can you please set  "*" for  both hadoop.proxyuser.knox.hosts and hadoop.proxyuser.knox.groups properties in core-site.xml? You may need to restart the services. BTW This is ONLY for testing purpose.

Regards,
Mohammad



On Wednesday, March 1, 2017 1:02 AM, "Jovanic, Ben" <be...@cgi.com> wrote:


Hi Sandeep,

If you meant one of these files:

/etc/hbase/conf/core-site.xml
/etc/hadoop/conf/core-site.xml

Then they both already have the properties you mentioned.

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com<ma...@cgi.com> | cgi-group.co.uk<http://www.cgi-group.co.uk/>

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.
________________________________
From: Sandeep More [moresandeep@gmail.com]
Sent: 27 February 2017 15:40
To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

Hello Ben,

I misspoke a bit on my previous reply, you should *not* add user 'knox' to the 'users' group, instead  just add 'users' to  'hadoop.proxyuser.knox.groups' property and add the FQDN of Knox to 'hadoop.proxyuser.knox.hosts' property in core-site.xml and you should be good.

i.e. the following should be sufficient.


<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>FQDN_OF_KNOX_HOST</value>
</property>

See this for more info:
http://knox.apache.org/books/knox-0-11-0/user-guide.html#Related+Cluster+Configuration

Best,
Sandeep


On Mon, Feb 27, 2017 at 9:13 AM, Sandeep More <mo...@gmail.com>> wrote:
Good to know about the progress, Ben!
About the error "knox is not allowed to impersonate knox", this might be because the user 'knox' does not have sufficient group privileges to perform the operation.

If you are using Ambari, you can check the
'hadoop.proxyuser.knox.groups' parameter, in my case it is 'users',  then add the user 'knox' to 'users' group (or which ever group you have in hadoop.proxyuser.knox.groups).

Let me know how it goes !

Best,
Sandeep

On Mon, Feb 27, 2017 at 4:20 AM, Jovanic, Ben <be...@cgi.com>> wrote:
Hi Sandeep,

Your suggestion of using sandbox.hortonworks.com as the domain has gotten me a step farther. Thank you!

Now I'm getting an authorisation error which I'll dig into (unless anyone can offer a solution :))


# curl -ki --negotiate -u : https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS
HTTP/1.1 401
Date: Mon, 27 Feb 2017 09:16:12 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path= /;Domain=sandbox.hortonworks.c om<http://sandbox.hortonworks.com/>;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-st ore
Content-Length: 320
Server: Jetty(9.2.15.v20160210)

HTTP/1.1 403 Forbidden
Date: Mon, 27 Feb 2017 09:16:12 GMT
Set-Cookie: hadoop.auth=u=knox&p=knox/knox@EXAMPLE.COM&t=kerberos&e=1488188772724&s=5A//jMYbfdVTp1ggiNE3jsLZ1bE=;Path=/;Domain=sandbox.hortonworks.com
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Content-Type: application/json; charset=UTF-8
Server: Jetty(6.1.26.hwx)
Content-Length: 259

{"RemoteException":{"exception ":"SecurityException"," javaClassName":"java.lang.Secu rityException","message":"Fail ed to obtain user group information: org.apache.hadoop.security.aut horize.AuthorizationException: User: knox is not allowed to impersonate knox"}}



Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com<ma...@cgi.com> | cgi-group.co.uk<http://www.cgi-group.co.uk/>

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.
________________________________
From: Sandeep More [more@apache.org<ma...@apache.org>]
Sent: 24 February 2017 18:41
To: user@knox.apache.org<ma...@knox.apache.org>
Subject: Re: Using Kerberos with Knox

Hello Ben,

Just following up on this issue, I did try testing HadoopAuth provider with Knox and it seems to work in my case, I tried to document the process in the blog post [1].

I noticed that you are using cookie domain 'sandbox.hortonworks.com<http://sandbox.hortonworks.com/>' but in your curl request you are using 'localhost', IMO this would fail, can you try using 'sandbox.hortonworks.com<http://sandbox.hortonworks.com/>' and see if that helps ?

i.e.
curl -ki --negotiate -u : "https://<https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS>sandbox.hortonworks.c om<http://sandbox.hortonworks.com/>:8443/gateway/sandbox/webhdf s/v1/?op=LISTSTATUS<https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS>"


[1] https://cwiki.apache.org/confluence/display/KNOX/2017/02/24/Hadoop+Auth+%28SPNEGO+and+delegation+token+based+authentication%29+with+Apache+Knox

Best,
Sandeep

On Tue, Feb 21, 2017 at 3:23 PM, Sandeep More <mo...@gmail.com>> wrote:
Hello Ben,

Welcome to the list !

At first glance your knox configs look ok to me, it could be related to a setup issue.

In the setup procedure mentioned, you followed
1. Ambari instruction to setup Kerberos (item #3) and
2. Knox instructions for setting up Kerberos

Ambari setup already takes care of Knox Kerberos setup, so you just have to go with Ambari instructions (assuming you setup Knox from Ambari initially)

I am assuming you installed Apache Knox from a Zip or a tgz file (since 0.11 does not ship with HDP-2.4, IIRC), when you ran 'hdp-select' command did you take into account the directory structure for zip installs (since the directory structure from the rpms are different than the zip/tgz ones).


Best,
sandeep

On Tue, Feb 21, 2017 at 6:24 AM, Jovanic, Ben <be...@cgi.com>> wrote:
Hi,

First time emailing the user mailing list. I work for CGI and am currently working on Knox for one of our projects.

I'm struggling to get Kerberos and Knox set up together on my HDP. Knox works fine on its own with LDAP and Kerberos works with WebHDFS.

The set up:

  *   I'm using HDP-2.4.0.0-169.
  *   I'm using Knox 0.11.0 -- which I've installed at /usr/hdp/0.11.0/knox/conf and run hdp-select set knox-server 0.11.0.
  *   Kerberos has been set up using these instructions (https://docs.hortonworks.com/HDPDocuments/Ambari-2.1.1.0/bk_Ambari_Security_Guide/content/ch_configuring_amb_hdp_for_kerberos.html)
  *   I've validated the Kerberos set up by using the following curl statement after a kinit:

$ curl -i --negotiate -u : "http://sandbox:50070/webhdfs/v1/tmp?op=LISTSTATUS"
HTTP/1.1 401 Authentication required
Cache-Control: must-revalidate,no-cache,no-store
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Content-Type: text/html; charset=iso-8859-1
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; HttpOnly
Content-Length: 1407
Server: Jetty(6.1.26.hwx)

HTTP/1.1 200 OK
Cache-Control: no-cache
Expires: Tue, 21 Feb 2017 10:49:14 GMT
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Expires: Tue, 21 Feb 2017 10:49:14 GMT
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Content-Type: application/json
Set-Cookie: hadoop.auth="u=admin&p=admin/a dmin@EXAMPLE.COM<ma...@EXAMPLE.COM>&t=kerberos&e= 1487710154130&s=gt9iw89RJ7XMd0 XFA+xm49hUet0="; Path=/; HttpOnly
Transfer-Encoding: chunked
Server: Jetty(6.1.26.hwx)

{"FileStatuses":{"FileStatus": [
{"accessTime":0,"blockSize":0, "childrenNum":1,"fileId":16397 ,"group":"hdfs","length":0,"mo dificationTime":1456768692570, "owner":"hdfs","pathSuffix":"e ntity-file-history","permissio n":"755","replication":0,"stor agePolicy":0,"type":"DIRECTORY "},
{"accessTime":0,"blockSize":0, "childrenNum":3,"fileId":16434 ,"group":"hdfs","length":0,"mo dificationTime":1456785191888, "owner":"ambari-qa","pathSuffi x":"hive","permission":"733"," replication":0,"storagePolicy" :0,"type":"DIRECTORY"}
]}}

What I've tried with Knox

I've gone through these instructions (https://knox.apache.org/books/knox-0-11-0/user-guide.html#Secure+Clusters) to create the knox keytab, update the 2 conf files and update gateway-site.xml.

krb5.conf:
[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 default_realm = EXAMPLE.COM
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true

[realms]
 EXAMPLE.COM = {
  kdc = sandbox.hortonworks.com
  admin_server = sandbox.hortonworks.com
 }

During Kerberos set up I did leave the realm as EXAMPLE.COM.

krb5JAASLogin.conf:
com.sun.security.jgss.initiate {
com.sun.security.auth.module.Krb5LoginModule required
renewTGT=true
doNotPrompt=true
useKeyTab=true
keyTab="/usr/hdp/current/knox- server/conf/knox.service.keyta b"
principal="HTTP/sandbox.horton works.com@EXAMPLE.COM<ma...@EXAMPLE.COM>"
isInitiator=true
storeKey=true
useTicketCache=true
client=true;
};

I have tried different keytabs, like /etc/security/keytabs/spnego.service.keytab and /etc/security/keytabs/knox.service.keytab.
I have tried other principals like knox/knox@EXAMPLE.COM.

I have copied the templates/hadas.xml file to conf/topologies/sandbox.xml.

sandbox.xml:
<topology>

    <gateway>

        <provider>
            <role>authentication</role>
            <name>HadoopAuth</name>
            <enabled>true</enabled>

            <param>
                <name>config.prefix</name>
                <value>hadoop.auth.config</value>
            </param>
            <param>
                <name>hadoop.auth.config.signature.secret</name>
                <!--<value>78hdkjaka</value>-->
                <value></value>
            </param>
            <param>
                <name>hadoop.auth.config.type</name>
                <value>kerberos</value>
            </param>
            <param>
                <name>hadoop.auth.config.simple.anonymous.allowed</name>
                <value>false</value> <!-- default: false -->
            </param>
            <param>
                <name>hadoop.auth.config.token.validity</name>
                <value>1800</value>
            </param>
            <param>
                <name>hadoop.auth.config.cookie.domain</name>
                <value>sandbox.hortonworks.com</value>
            </param>
            <param>
                <name>hadoop.auth.config.cookie.path</name>
                <!--<value>gateway/hada</value>-->
                <value>/</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.principal</name>
                <value>HTTP/sandbox.hortonworks.com@EXAMPLE.COM</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.keytab</name>
                <value>/usr/hdp/current/knox-server/conf/knox.service.keytab</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.name.rules</name>
                <value>DEFAULT</value>
            </param>

        </provider>

        <provider>
            <role>identity-assertion</role>
            <name>Default</name>
            <enabled>true</enabled>
         <!-- param>
                <name>principal.mapping</name>
                <value>sam=god;</value>
         </param -->

        </provider>

        <provider>
            <role>hostmap</role>
            <name>static</name>
            <enabled>false</enabled>
            <param><name>sandbox.hortonworks.com</name><value>sandbox,sandbox.hortonworks.com</value></param>
        </provider>

    </gateway>

  <service>...</service>
  ...

</topology>

Again, tried different principle and keytab values with no success. And every attempt to use kinit with different principles and keytabs results in the following message:

$ curl -ki --negotiate -u : "https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"
HTTP/1.1 401
Date: Tue, 21 Feb 2017 11:18:32 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path= /;Domain=sandbox.hortonworks.c om<http://sandbox.hortonworks.com/>;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-st ore
Content-Length: 317
Server: Jetty(9.2.15.v20160210)

<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /gateway/sandbox/webhdfs/v1/. Reason:
<pre>    Unauthorized</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>

</body>
</html>

Thanks in advance for any help I receive.

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com<ma...@cgi.com> | cgi-group.co.uk<http://www.cgi-group.co.uk/>

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.







Re: Using Kerberos with Knox

Posted by Mohammad Islam <mi...@yahoo.com>.
Hi Ben,
What is the value you put in hadoop.proxyuser.knox.hosts?
Can you please set "*" for both hadoop.proxyuser.knox.hosts and hadoop.proxyuser.knox.groups properties in core-site.xml? You may need to restart the services. BTW this is ONLY for testing purposes.

Regards,
Mohammad
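i.e., something along these lines in core-site.xml (a sketch of the wildcard values suggested above, for testing only):

<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>*</value>
</property>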
 

    On Wednesday, March 1, 2017 1:02 AM, "Jovanic, Ben" <be...@cgi.com> wrote:
 

Hi Sandeep,

If you meant one of these files:

/etc/hbase/conf/core-site.xml
/etc/hadoop/conf/core-site.xml

Then they both already have the properties you mentioned.

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.
From: Sandeep More [moresandeep@gmail.com]
Sent: 27 February 2017 15:40
To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

Hello Ben,
I misspoke a bit on my previous reply, you should *not* add user 'knox' to the 'users' group, instead  just add 'users' to  'hadoop.proxyuser.knox.groups' property and add the FQDN of Knox to 'hadoop.proxyuser.knox.hosts' property in core-site.xml and you should be good.
i.e. the following should be sufficient.
<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>FQDN_OF_KNOX_HOST</value>
</property>

See this for more info:
http://knox.apache.org/books/knox-0-11-0/user-guide.html#Related+Cluster+Configuration
Best,
Sandeep

On Mon, Feb 27, 2017 at 9:13 AM, Sandeep More <mo...@gmail.com> wrote:

Good to know about the progress, Ben! About the error "knox is not allowed to impersonate knox", this might be because the user 'knox' does not have sufficient group privileges to perform the operation.
If you are using Ambari, you can check the 'hadoop.proxyuser.knox.groups' parameter, in my case it is 'users', then add the user 'knox' to 'users' group (or whichever group you have in hadoop.proxyuser.knox.groups).
Let me know how it goes !
Best,
Sandeep
On Mon, Feb 27, 2017 at 4:20 AM, Jovanic, Ben <be...@cgi.com> wrote:

Hi Sandeep,

Your suggestion of using sandbox.hortonworks.com as the domain has gotten me a step farther. Thank you!

Now I'm getting an authorisation error which I'll dig into (unless anyone can offer a solution :))


# curl -ki --negotiate -u : https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS
HTTP/1.1 401 
Date: Mon, 27 Feb 2017 09:16:12 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path= /;Domain=sandbox.hortonworks.c om;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-st ore
Content-Length: 320
Server: Jetty(9.2.15.v20160210)

HTTP/1.1 403 Forbidden
Date: Mon, 27 Feb 2017 09:16:12 GMT
Set-Cookie: hadoop.auth=u=knox&p=knox/knox@EXAMPLE.COM&t=kerberos&e=1488188772724&s=5A//jMYbfdVTp1ggiNE3jsLZ1bE=;Path=/;Domain=sandbox.hortonworks.com
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Content-Type: application/json; charset=UTF-8
Server: Jetty(6.1.26.hwx)
Content-Length: 259

{"RemoteException":{"exception ":"SecurityException"," javaClassName":"java.lang.Secu rityException","message":"Fail ed to obtain user group information: org.apache.hadoop.security.aut horize.AuthorizationException: User: knox is not allowed to impersonate knox"}}



Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.
From: Sandeep More [more@apache.org]
Sent: 24 February 2017 18:41
To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

Hello Ben,
Just following up on this issue, I did try testing HadoopAuth provider with Knox and it seems to work in my case, I tried to document the process in the blog post [1]. 
I noticed that you are using cookie domain 'sandbox.hortonworks.com' but in your curl request you are using 'localhost', IMO this would fail, can you try using 'sandbox.hortonworks.com' and see if that helps ?
i.e. curl -ki --negotiate -u : "https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"

[1] https://cwiki.apache.org/confluence/display/KNOX/2017/02/24/Hadoop+Auth+%28SPNEGO+and+delegation+token+based+authentication%29+with+Apache+Knox
Best,
Sandeep
On Tue, Feb 21, 2017 at 3:23 PM, Sandeep More <mo...@gmail.com> wrote:

Hello Ben,
Welcome to the list !
At first glance your knox configs look ok to me, it could be related to a setup issue.
In the setup procedure mentioned, you followed
1. Ambari instruction to setup Kerberos (item #3) and
2. Knox instructions for setting up Kerberos
Ambari setup already takes care of Knox Kerberos setup, so you just have to go with Ambari instructions (assuming you setup Knox from Ambari initially)
I am assuming you installed Apache Knox from a Zip or a tgz file (since 0.11 does not ship with HDP-2.4, IIRC), when you ran 'hdp-select' command did you take into account the directory structure for zip installs (since the directory structure from the rpms are different than the zip/tgz ones).

Best,
sandeep
On Tue, Feb 21, 2017 at 6:24 AM, Jovanic, Ben <be...@cgi.com> wrote:

Hi,

First time emailing the user mailing list. I work for CGI and am currently working on Knox for one of our projects.

I'm struggling to get Kerberos and Knox set up together on my HDP. Knox works fine on its own with LDAP and Kerberos works with WebHDFS.

The set up:
   
   - I'm using HDP-2.4.0.0-169.
   - I'm using Knox 0.11.0 -- which I've installed at /usr/hdp/0.11.0/knox/conf and run hdp-select set knox-server 0.11.0.
   - Kerberos has been set up using these instructions (https://docs.hortonworks.com/HDPDocuments/Ambari-2.1.1.0/bk_Ambari_Security_Guide/content/ch_configuring_amb_hdp_for_kerberos.html)
   - I've validated the Kerberos set up by using the following curl statement after a kinit:


$ curl -i --negotiate -u : "http://sandbox:50070/webhdfs/v1/tmp?op=LISTSTATUS"
HTTP/1.1 401 Authentication required
Cache-Control: must-revalidate,no-cache,no-store
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Content-Type: text/html; charset=iso-8859-1
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; HttpOnly
Content-Length: 1407
Server: Jetty(6.1.26.hwx)

HTTP/1.1 200 OK
Cache-Control: no-cache
Expires: Tue, 21 Feb 2017 10:49:14 GMT
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Expires: Tue, 21 Feb 2017 10:49:14 GMT
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Content-Type: application/json
Set-Cookie: hadoop.auth="u=admin&p=admin/a dmin@EXAMPLE.COM&t=kerberos&e= 1487710154130&s=gt9iw89RJ7XMd0 XFA+xm49hUet0="; Path=/; HttpOnly
Transfer-Encoding: chunked
Server: Jetty(6.1.26.hwx)

{"FileStatuses":{"FileStatus": [
{"accessTime":0,"blockSize":0, "childrenNum":1,"fileId":16397 ,"group":"hdfs","length":0,"mo dificationTime":1456768692570, "owner":"hdfs","pathSuffix":"e ntity-file-history","permissio n":"755","replication":0,"stor agePolicy":0,"type":"DIRECTORY "},
{"accessTime":0,"blockSize":0, "childrenNum":3,"fileId":16434 ,"group":"hdfs","length":0,"mo dificationTime":1456785191888, "owner":"ambari-qa","pathSuffi x":"hive","permission":"733"," replication":0,"storagePolicy" :0,"type":"DIRECTORY"}
]}}



What I've tried with Knox
I've gone through these instructions (https://knox.apache.org/books/knox-0-11-0/user-guide.html#Secure+Clusters) to create the knox keytab, update the 2 conf files and update gateway-site.xml.

krb5.conf:
[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 default_realm = EXAMPLE.COM
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true

[realms]
 EXAMPLE.COM = {
  kdc = sandbox.hortonworks.com
  admin_server = sandbox.hortonworks.com
 }

During Kerberos set up I did leave the realm as EXAMPLE.COM.

krb5JAASLogin.conf:
com.sun.security.jgss.initiate {
com.sun.security.auth.module.Krb5LoginModule required
renewTGT=true
doNotPrompt=true
useKeyTab=true
keyTab="/usr/hdp/current/knox- server/conf/knox.service.keyta b"
principal="HTTP/sandbox.horton works.com@EXAMPLE.COM"
isInitiator=true
storeKey=true
useTicketCache=true
client=true;
};

I have tried different keytabs, like /etc/security/keytabs/spnego.service.keytab and /etc/security/keytabs/knox.service.keytab.
I have tried other principals like knox/knox@EXAMPLE.COM.

I have copied the templates/hadas.xml file to conf/topologies/sandbox.xml.

sandbox.xml:
<topology>

    <gateway>

        <provider>
            <role>authentication</role>
            <name>HadoopAuth</name>
            <enabled>true</enabled>

            <param>
                <name>config.prefix</name>
                <value>hadoop.auth.config</value>
            </param>
            <param>
                <name>hadoop.auth.config.signature.secret</name>
                <!--<value>78hdkjaka</value>-->
                <value></value>
            </param>
            <param>
                <name>hadoop.auth.config.type</name>
                <value>kerberos</value>
            </param>
            <param>
                <name>hadoop.auth.config.simple.anonymous.allowed</name>
                <value>false</value> <!-- default: false -->
            </param>
            <param>
                <name>hadoop.auth.config.token.validity</name>
                <value>1800</value>
            </param>
            <param>
                <name>hadoop.auth.config.cookie.domain</name>
                <value>sandbox.hortonworks.com</value>
            </param>
            <param>
                <name>hadoop.auth.config.cookie.path</name>
                <!--<value>gateway/hada</value>-->
                <value>/</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.principal</name>
                <value>HTTP/sandbox.hortonworks.com@EXAMPLE.COM</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.keytab</name>
                <value>/usr/hdp/current/knox-server/conf/knox.service.keytab</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.name.rules</name>
                <value>DEFAULT</value>
            </param>

        </provider>

        <provider>
            <role>identity-assertion</role>
            <name>Default</name>
            <enabled>true</enabled>
         <!-- param>
                <name>principal.mapping</name>
                <value>sam=god;</value>
         </param -->

        </provider>

        <provider>
            <role>hostmap</role>
            <name>static</name>
            <enabled>false</enabled>
            <param><name>sandbox.hortonworks.com</name><value>sandbox,sandbox.hortonworks.com</value></param>
        </provider>

    </gateway>

  <service>...</service>
  ...

</topology>

Again, tried different principle and keytab values with no success. And every attempt to use kinit with different principles and keytabs results in the following message:

$ curl -ki --negotiate -u : "https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"
HTTP/1.1 401 
Date: Tue, 21 Feb 2017 11:18:32 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path= /;Domain=sandbox.hortonworks.c om;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-st ore
Content-Length: 317
Server: Jetty(9.2.15.v20160210)

<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /gateway/sandbox/webhdfs/v1/. Reason:
<pre>    Unauthorized</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>

</body>
</html>

Thanks in advance for any help I receive. 

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.










   

RE: Using Kerberos with Knox

Posted by "Jovanic, Ben" <be...@cgi.com>.
Hi Sandeep,

If you meant one of these files:

/etc/hbase/conf/core-site.xml
/etc/hadoop/conf/core-site.xml

Then they both already have the properties you mentioned.
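As a quick sanity check of what the Hadoop client configuration on a node actually contains after a restart, the two keys can be read back with getconf (a sketch -- it reads the local core-site.xml, so run it on the relevant host):

hdfs getconf -confKey hadoop.proxyuser.knox.groups
hdfs getconf -confKey hadoop.proxyuser.knox.hosts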

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com<ma...@cgi.com> | cgi-group.co.uk<http://www.cgi-group.co.uk>

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.
________________________________
From: Sandeep More [moresandeep@gmail.com]
Sent: 27 February 2017 15:40
To: user@knox.apache.org
Subject: Re: Using Kerberos with Knox

Hello Ben,

I misspoke a bit on my previous reply, you should *not* add user 'knox' to the 'users' group, instead  just add 'users' to  'hadoop.proxyuser.knox.groups' property and add the FQDN of Knox to 'hadoop.proxyuser.knox.hosts' property in core-site.xml and you should be good.

i.e. the following should be sufficient.


<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>FQDN_OF_KNOX_HOST</value>
</property>

See this for more info:
http://knox.apache.org/books/knox-0-11-0/user-guide.html#Related+Cluster+Configuration

Best,
Sandeep


On Mon, Feb 27, 2017 at 9:13 AM, Sandeep More <mo...@gmail.com>> wrote:
Good to know about the progress, Ben!
About the error "knox is not allowed to impersonate knox", this might be because the user 'knox' does not have sufficient group privileges to perform the operation.

If you are using Ambari, you can check the
'hadoop.proxyuser.knox.groups' parameter, in my case it is 'users',  then add the user 'knox' to 'users' group (or which ever group you have in hadoop.proxyuser.knox.groups).

Let me know how it goes !

Best,
Sandeep

On Mon, Feb 27, 2017 at 4:20 AM, Jovanic, Ben <be...@cgi.com>> wrote:
Hi Sandeep,

Your suggestion of using sandbox.hortonworks.com as the domain has gotten me a step farther. Thank you!

Now I'm getting an authorisation error which I'll dig into (unless anyone can offer a solution :))


# curl -ki --negotiate -u : https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS
HTTP/1.1 401
Date: Mon, 27 Feb 2017 09:16:12 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path=/;Domain=sandbox.hortonworks.com<http://sandbox.hortonworks.com>;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-store
Content-Length: 320
Server: Jetty(9.2.15.v20160210)

HTTP/1.1 403 Forbidden
Date: Mon, 27 Feb 2017 09:16:12 GMT
Set-Cookie: hadoop.auth=u=knox&p=knox/knox@EXAMPLE.COM<ma...@EXAMPLE.COM>&t=kerberos&e=1488188772724&s=5A//jMYbfdVTp1ggiNE3jsLZ1bE=;Path=/;Domain=sandbox.hortonworks.com<http://sandbox.hortonworks.com>
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Content-Type: application/json; charset=UTF-8
Server: Jetty(6.1.26.hwx)
Content-Length: 259

{"RemoteException":{"exception":"SecurityException","javaClassName":"java.lang.SecurityException","message":"Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: User: knox is not allowed to impersonate knox"}}



Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645<tel:+44%207917%20505645>
ben.jovanic@cgi.com<ma...@cgi.com> | cgi-group.co.uk<http://www.cgi-group.co.uk>

CONFIDENTIALITY NOTICE: Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply e-mail.
________________________________
From: Sandeep More [more@apache.org<ma...@apache.org>]
Sent: 24 February 2017 18:41
To: user@knox.apache.org<ma...@knox.apache.org>
Subject: Re: Using Kerberos with Knox

Hello Ben,

Just following up on this issue, I did try testing HadoopAuth provider with Knox and it seems to work in my case, I tried to document the process in the blog post [1].

I noticed that you are using cookie domain 'sandbox.hortonworks.com<http://sandbox.hortonworks.com/>' but in your curl request you are using 'localhost', IMO this would fail, can you try using 'sandbox.hortonworks.com<http://sandbox.hortonworks.com/>' and see if that helps ?

i.e.
curl -ki --negotiate -u : "https://<https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS>sandbox.hortonworks.com<http://sandbox.hortonworks.com/>:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS<https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS>"


[1] https://cwiki.apache.org/confluence/display/KNOX/2017/02/24/Hadoop+Auth+%28SPNEGO+and+delegation+token+based+authentication%29+with+Apache+Knox

Best,
Sandeep

On Tue, Feb 21, 2017 at 3:23 PM, Sandeep More <mo...@gmail.com>> wrote:
Hello Ben,

Welcome to the list !

At first glance your knox configs look ok to me, it could be related to a setup issue.

In the setup procedure mentioned, you followed
1. Ambari instruction to setup Kerberos (item #3) and
2. Knox instructions for setting up Kerberos

Ambari setup already takes care of Knox Kerberos setup, so you just have to go with Ambari instructions (assuming you setup Knox from Ambari initially)

I am assuming you installed Apache Knox from a Zip or a tgz file (since 0.11 does not ship with HDP-2.4, IIRC), when you ran 'hdp-select' command did you take into account the directory structure for zip installs (since the directory structure from the rpms are different than the zip/tgz ones).


Best,
sandeep

On Tue, Feb 21, 2017 at 6:24 AM, Jovanic, Ben <be...@cgi.com>> wrote:
Hi,

First time emailing the user mailing list. I work for CGI and am currently working on Knox for one of our projects.

I'm struggling to get Kerberos and Knox set up together on my HDP. Knox works fine on its own with LDAP and Kerberos works with WebHDFS.

The set up:

  *   I'm using HDP-2.4.0.0-169.
  *   I'm using Knox 0.11.0 -- which I've installed at /usr/hdp/0.11.0/knox/conf and run hdp-select set knox-server 0.11.0.
  *   Kerberos has been set up using these instructions (https://docs.hortonworks.com/HDPDocuments/Ambari-2.1.1.0/bk_Ambari_Security_Guide/content/ch_configuring_amb_hdp_for_kerberos.html)
  *   I've validated the Kerberos set up by using the following curl statement after a kinit:

$ curl -i --negotiate -u : "http://sandbox:50070/webhdfs/v1/tmp?op=LISTSTATUS"
HTTP/1.1 401 Authentication required
Cache-Control: must-revalidate,no-cache,no-store
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Content-Type: text/html; charset=iso-8859-1
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; HttpOnly
Content-Length: 1407
Server: Jetty(6.1.26.hwx)

HTTP/1.1 200 OK
Cache-Control: no-cache
Expires: Tue, 21 Feb 2017 10:49:14 GMT
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Expires: Tue, 21 Feb 2017 10:49:14 GMT
Date: Tue, 21 Feb 2017 10:49:14 GMT
Pragma: no-cache
Content-Type: application/json
Set-Cookie: hadoop.auth="u=admin&p=admin/admin@EXAMPLE.COM<ma...@EXAMPLE.COM>&t=kerberos&e=1487710154130&s=gt9iw89RJ7XMd0XFA+xm49hUet0="; Path=/; HttpOnly
Transfer-Encoding: chunked
Server: Jetty(6.1.26.hwx)

{"FileStatuses":{"FileStatus":[
{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":16397,"group":"hdfs","length":0,"modificationTime":1456768692570,"owner":"hdfs","pathSuffix":"entity-file-history","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"},
{"accessTime":0,"blockSize":0,"childrenNum":3,"fileId":16434,"group":"hdfs","length":0,"modificationTime":1456785191888,"owner":"ambari-qa","pathSuffix":"hive","permission":"733","replication":0,"storagePolicy":0,"type":"DIRECTORY"}
]}}


What I've tried with Knox


I've gone through these instructions (https://knox.apache.org/books/knox-0-11-0/user-guide.html#Secure+Clusters) to create the knox keytab, update the 2 conf files and update gateway-site.xml.


krb5.conf:

[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 default_realm = EXAMPLE.COM<http://EXAMPLE.COM>
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true

[realms]
 EXAMPLE.COM<http://EXAMPLE.COM> = {
  kdc = sandbox.hortonworks.com<http://sandbox.hortonworks.com>
  admin_server = sandbox.hortonworks.com<http://sandbox.hortonworks.com>
 }


During Kerberos set up I did leave the realm as EXAMPLE.COM.


krb5JAASLogin.conf:
com.sun.security.jgss.initiate {
com.sun.security.auth.module.Krb5LoginModule required
renewTGT=true
doNotPrompt=true
useKeyTab=true
keyTab="/usr/hdp/current/knox-server/conf/knox.service.keytab"
principal="HTTP/sandbox.hortonworks.com@EXAMPLE.COM<ma...@EXAMPLE.COM>"
isInitiator=true
storeKey=true
useTicketCache=true
client=true;
};

I have tried different keytabs, like /etc/security/keytabs/spnego.service.keytab and /etc/security/keytabs/knox.service.keytab.
I have tried other principals like knox/knox@EXAMPLE.COM.

I have copied the templates/hadas.xml file to conf/topologies/sandbox.xml.

sandbox.xml:
<topology>

    <gateway>

        <provider>
            <role>authentication</role>
            <name>HadoopAuth</name>
            <enabled>true</enabled>

            <param>
                <name>config.prefix</name>
                <value>hadoop.auth.config</value>
            </param>
            <param>
                <name>hadoop.auth.config.signature.secret</name>
                <!--<value>78hdkjaka</value>-->
                <value></value>
            </param>
            <param>
                <name>hadoop.auth.config.type</name>
                <value>kerberos</value>
            </param>
            <param>
                <name>hadoop.auth.config.simple.anonymous.allowed</name>
                <value>false</value> <!-- default: false -->
            </param>
            <param>
                <name>hadoop.auth.config.token.validity</name>
                <value>1800</value>
            </param>
            <param>
                <name>hadoop.auth.config.cookie.domain</name>
                <value>sandbox.hortonworks.com</value>
            </param>
            <param>
                <name>hadoop.auth.config.cookie.path</name>
                <!--<value>gateway/hada</value>-->
                <value>/</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.principal</name>
                <value>HTTP/sandbox.hortonworks.com@EXAMPLE.COM</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.keytab</name>
                <value>/usr/hdp/current/knox-server/conf/knox.service.keytab</value>
            </param>
            <param>
                <name>hadoop.auth.config.kerberos.name.rules</name>
                <value>DEFAULT</value>
            </param>

        </provider>

        <provider>
            <role>identity-assertion</role>
            <name>Default</name>
            <enabled>true</enabled>
         <!-- param>
                <name>principal.mapping</name>
                <value>sam=god;</value>
         </param -->

        </provider>

        <provider>
            <role>hostmap</role>
            <name>static</name>
            <enabled>false</enabled>
            <param><name>sandbox.hortonworks.com</name><value>sandbox,sandbox.hortonworks.com</value></param>
        </provider>

    </gateway>

  <service>...</service>
  ...

</topology>

Again, I have tried different principal and keytab values with no success, and every attempt to use kinit with different principals and keytabs results in the following message:

$ curl -ki --negotiate -u : "https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"
HTTP/1.1 401
Date: Tue, 21 Feb 2017 11:18:32 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path=/;Domain=sandbox.hortonworks.com<http://sandbox.hortonworks.com>;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-store
Content-Length: 317
Server: Jetty(9.2.15.v20160210)

<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /gateway/sandbox/webhdfs/v1/. Reason:
<pre>    Unauthorized</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>

</body>
</html>

Thanks in advance for any help I receive.

Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk






Re: Using Kerberos with Knox

Posted by Sandeep More <mo...@gmail.com>.
Hello Ben,

I misspoke a bit in my previous reply: you should *not* add the user 'knox' to
the 'users' group. Instead, just add 'users' to the 'hadoop.proxyuser.knox.groups'
property and add the FQDN of the Knox host to the 'hadoop.proxyuser.knox.hosts'
property in core-site.xml, and you should be good.

i.e. the following should be sufficient.

<property>
    <name>hadoop.proxyuser.knox.groups</name>
    <value>users</value>
</property>
<property>
    <name>hadoop.proxyuser.knox.hosts</name>
    <value>FQDN_OF_KNOX_HOST</value>
</property>

See this for more info:
http://knox.apache.org/books/knox-0-11-0/user-guide.html#Related+Cluster+Configuration
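One extra note (a sketch, assuming you have a ticket for an HDFS admin/superuser principal): after editing core-site.xml the NameNode can usually pick up proxyuser changes without a full restart:

$ kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-sandbox@EXAMPLE.COM   # hypothetical keytab/principal; use whatever your HDFS superuser is
$ hdfs dfsadmin -refreshSuperUserGroupsConfiguration                              # reloads the hadoop.proxyuser.* settings on the NameNode

Restarting HDFS from Ambari works too, it is just slower.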

Best,
Sandeep



Re: Using Kerberos with Knox

Posted by Sandeep More <mo...@gmail.com>.
Good to know about the progress, Ben!
About the error "knox is not allowed to impersonate knox": this might be
because the user 'knox' does not have sufficient group privileges to
perform the operation.

If you are using Ambari, you can check the
'hadoop.proxyuser.knox.groups' parameter (in my case it is 'users'), then
add the user 'knox' to the 'users' group (or whichever group you have in
hadoop.proxyuser.knox.groups).
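A quick way to see what actually gets resolved (a sketch; run it on the NameNode host, or anywhere the Hadoop client config points at the cluster):

$ hdfs groups knox   # groups the NameNode resolves for user 'knox', which is what the proxyuser check uses
$ id knox            # local Unix groups, which the default group mapping falls back to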

Let me know how it goes !

Best,
Sandeep


RE: Using Kerberos with Knox

Posted by "Jovanic, Ben" <be...@cgi.com>.
Hi Sandeep,

Your suggestion of using sandbox.hortonworks.com as the domain has gotten me a step further. Thank you!

Now I'm getting an authorisation error which I'll dig into (unless anyone can offer a solution :))


# curl -ki --negotiate -u : https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS
HTTP/1.1 401
Date: Mon, 27 Feb 2017 09:16:12 GMT
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth="";Version=1;Path=/;Domain=sandbox.hortonworks.com;Expires=Thu, 01-Jan-1970 00:00:00 GMT;Max-Age=0
Content-Type: text/html; charset=ISO-8859-1
Cache-Control: must-revalidate,no-cache,no-store
Content-Length: 320
Server: Jetty(9.2.15.v20160210)

HTTP/1.1 403 Forbidden
Date: Mon, 27 Feb 2017 09:16:12 GMT
Set-Cookie: hadoop.auth=u=knox&p=knox/knox@EXAMPLE.COM&t=kerberos&e=1488188772724&s=5A//jMYbfdVTp1ggiNE3jsLZ1bE=;Path=/;Domain=sandbox.hortonworks.com
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Expires: Mon, 27 Feb 2017 09:16:12 GMT
Date: Mon, 27 Feb 2017 09:16:12 GMT
Pragma: no-cache
Content-Type: application/json; charset=UTF-8
Server: Jetty(6.1.26.hwx)
Content-Length: 259

{"RemoteException":{"exception":"SecurityException","javaClassName":"java.lang.SecurityException","message":"Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: User: knox is not allowed to impersonate knox"}}



Kind regards,
Ben

Ben Jovanic | Software Engineer
Energy, Utilities & Telco | CGI
2nd floor, Inovo Building 121 George Street, Glasgow, G1 1RD, UK
M: +44 7917 505 645
ben.jovanic@cgi.com | cgi-group.co.uk




Re: Using Kerberos with Knox

Posted by Sandeep More <mo...@apache.org>.
Hello Ben,

Just following up on this issue: I did try testing the HadoopAuth provider
with Knox and it seems to work in my case; I tried to document the process in
the blog post [1].

I noticed that you are using the cookie domain 'sandbox.hortonworks.com' but
in your curl request you are using 'localhost'. IMO this would fail; can you
try using 'sandbox.hortonworks.com' and see if that helps?

i.e.
curl -ki --negotiate -u : "https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"
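If 'sandbox.hortonworks.com' does not resolve from the machine you run curl on (an assumption about your environment), you can either add it to /etc/hosts or pin it on the curl command line, e.g.:

$ curl -ki --negotiate -u : --resolve sandbox.hortonworks.com:8443:127.0.0.1 "https://sandbox.hortonworks.com:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS"
  # --resolve forces the FQDN to the given IP, so the hostname in the request (and the cookie domain) still match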


[1]
https://cwiki.apache.org/confluence/display/KNOX/2017/02/24/Hadoop+Auth+%28SPNEGO+and+delegation+token+based+authentication%29+with+Apache+Knox

Best,
Sandeep


Re: Using Kerberos with Knox

Posted by Sandeep More <mo...@gmail.com>.
Hello Ben,

Welcome to the list !

At first glance your knox configs look ok to me, it could be related to a
setup issue.

In the setup procedure mentioned, you followed
1. Ambari instruction to setup Kerberos (item #3) and
2. Knox instructions for setting up Kerberos

Ambari setup already takes care of Knox Kerberos setup, so you just have to
go with Ambari instructions (assuming you setup Knox from Ambari initially)

I am assuming you installed Apache Knox from a Zip or a tgz file (since
0.11 does not ship with HDP-2.4, IIRC), when you ran 'hdp-select' command
did you take into account the directory structure for zip installs (since
the directory structure from the rpms are different than the zip/tgz ones).


Best,
sandeep
