Posted to commits@knox.apache.org by km...@apache.org on 2014/10/30 20:15:51 UTC

svn commit: r1635592 - in /knox: site/ site/books/knox-0-4-0/ site/books/knox-0-5-0/ trunk/books/0.5.0/

Author: kminder
Date: Thu Oct 30 19:15:50 2014
New Revision: 1635592

URL: http://svn.apache.org/r1635592
Log:
KNOX-417: Verify Knox User Guide Instructions for v0.5.0

Modified:
    knox/site/books/knox-0-4-0/deployment-overview.png
    knox/site/books/knox-0-4-0/deployment-provider.png
    knox/site/books/knox-0-4-0/deployment-service.png
    knox/site/books/knox-0-4-0/runtime-overview.png
    knox/site/books/knox-0-4-0/runtime-request-processing.png
    knox/site/books/knox-0-5-0/knox-0-5-0.html
    knox/site/index.html
    knox/site/issue-tracking.html
    knox/site/license.html
    knox/site/mail-lists.html
    knox/site/project-info.html
    knox/site/team-list.html
    knox/trunk/books/0.5.0/book_getting-started.md
    knox/trunk/books/0.5.0/config_kerberos.md
    knox/trunk/books/0.5.0/config_preauth_sso_provider.md
    knox/trunk/books/0.5.0/config_webappsec_provider.md
    knox/trunk/books/0.5.0/quick_start.md
    knox/trunk/books/0.5.0/service_hive.md

Modified: knox/site/books/knox-0-4-0/deployment-overview.png
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-4-0/deployment-overview.png?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
Binary files - no diff available.

Modified: knox/site/books/knox-0-4-0/deployment-provider.png
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-4-0/deployment-provider.png?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
Binary files - no diff available.

Modified: knox/site/books/knox-0-4-0/deployment-service.png
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-4-0/deployment-service.png?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
Binary files - no diff available.

Modified: knox/site/books/knox-0-4-0/runtime-overview.png
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-4-0/runtime-overview.png?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
Binary files - no diff available.

Modified: knox/site/books/knox-0-4-0/runtime-request-processing.png
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-4-0/runtime-request-processing.png?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
Binary files - no diff available.

Modified: knox/site/books/knox-0-5-0/knox-0-5-0.html
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-5-0/knox-0-5-0.html?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/site/books/knox-0-5-0/knox-0-5-0.html (original)
+++ knox/site/books/knox-0-5-0/knox-0-5-0.html Thu Oct 30 19:15:50 2014
@@ -101,7 +101,7 @@ bin/knoxcli.sh create-master
 </code></pre><p>The cli will prompt you for the master secret (i.e. password).</p><p>The server will discover the persisted master secret during start up and complete the setup process for demo installs. A demo install will consist of a knox gateway instance with an identity certificate for localhost. This will require clients to be on the same machine or to turn off hostname verification. For more involved deployments, see the Knox CLI section of this document for additional commands - including the ability to create a self-signed certificate for a specific hostname.</p>
 <pre><code>cd {GATEWAY_HOME}
 bin/gateway.sh start
-</code></pre><p>When starting the gateway this way the process will be run in the backgroud. The log output is written into the directory /var/log/knox. In addition a PID (process ID) is written into /var/run/knox.</p><p>In order to stop a gateway that was started with the script use this command.</p>
+</code></pre><p>When starting the gateway this way the process will be run in the background. In the Sandbox environment mentioned earlier the log output is written into the directory /var/log/knox. In addition a PID (process ID) is written into /var/run/knox.</p><p>In order to stop a gateway that was started with the script use this command.</p>
 <pre><code>cd {GATEWAY_HOME}
 bin/gateway.sh stop
 </code></pre><p>If for some reason the gateway is stopped other than by using the command above you may need to clear the tracking PID.</p>
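
A minimal sketch for clearing that tracking PID by hand, assuming the default /var/run/knox location mentioned above; the gateway.pid file name is an assumption and may differ in your installation.

    # Remove a stale PID left behind after an unclean shutdown
    # (path and file name are assumptions based on the defaults above).
    rm -f /var/run/knox/gateway.pid
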
@@ -144,72 +144,68 @@ Server: Jetty(6.1.26)
       <td>Contains security and topology specific artifacts that require read/write access at runtime </td>
     </tr>
     <tr>
-      <td>data/topologies/</td>
-      <td>Contains topology files that represent Hadoop clusters which the gateway uses to deploy cluster proxies</td>
+      <td>conf/topologies/ </td>
+      <td>Contains topology files that represent Hadoop clusters which the gateway uses to deploy cluster proxies </td>
     </tr>
     <tr>
       <td>data/security/ </td>
-      <td>Contains the persisted master secret and keystore dir</td>
+      <td>Contains the persisted master secret and keystore dir </td>
     </tr>
     <tr>
-      <td>data/security/keystores/</td>
-      <td>Contains the gateway identity keystore and credential stores for the gateway and each deployed cluster topology</td>
+      <td>data/security/keystores/ </td>
+      <td>Contains the gateway identity keystore and credential stores for the gateway and each deployed cluster topology </td>
     </tr>
     <tr>
       <td>bin/ </td>
-      <td>Contains the executable shell scripts, batch files and JARs for clients and servers.</td>
+      <td>Contains the executable shell scripts, batch files and JARs for clients and servers. </td>
     </tr>
     <tr>
       <td>data/deployments/ </td>
-      <td>Contains deployed cluster topologies used to protect access to specific Hadoop clusters.</td>
+      <td>Contains deployed cluster topologies used to protect access to specific Hadoop clusters. </td>
     </tr>
     <tr>
       <td>lib/ </td>
-      <td>Contains the JARs for all the components that make up the gateway.</td>
+      <td>Contains the JARs for all the components that make up the gateway. </td>
     </tr>
     <tr>
       <td>dep/ </td>
-      <td>Contains the JARs for all of the components upon which the gateway depends.</td>
+      <td>Contains the JARs for all of the components upon which the gateway depends. </td>
     </tr>
     <tr>
       <td>ext/ </td>
-      <td>A directory where user supplied extension JARs can be placed to extends the gateways functionality.</td>
+      <td>A directory where user supplied extension JARs can be placed to extend the gateway's functionality. </td>
     </tr>
     <tr>
       <td>pids/ </td>
-      <td>Contains the process ids for running ldap and gateway servers</td>
+      <td>Contains the process ids for running ldap and gateway servers </td>
     </tr>
     <tr>
       <td>samples/ </td>
-      <td>Contains a number of samples that can be used to explore the functionality of the gateway.</td>
+      <td>Contains a number of samples that can be used to explore the functionality of the gateway. </td>
     </tr>
     <tr>
       <td>templates/ </td>
-      <td>Contains default configuration files that can be copied and customized.</td>
+      <td>Contains default configuration files that can be copied and customized. </td>
     </tr>
     <tr>
       <td>README </td>
-      <td>Provides basic information about the Apache Knox Gateway.</td>
+      <td>Provides basic information about the Apache Knox Gateway. </td>
     </tr>
     <tr>
       <td>ISSUES </td>
-      <td>Describes significant know issues.</td>
+      <td>Describes significant known issues. </td>
     </tr>
     <tr>
       <td>CHANGES </td>
-      <td>Enumerates the changes between releases.</td>
+      <td>Enumerates the changes between releases. </td>
     </tr>
     <tr>
       <td>LICENSE </td>
-      <td>Documents the license under which this software is provided.</td>
+      <td>Documents the license under which this software is provided. </td>
     </tr>
     <tr>
       <td>NOTICE </td>
-      <td>Documents required attribution notices for included dependencies.</td>
-    </tr>
-    <tr>
-      <td>DISCLAIMER </td>
-      <td>Documents that this release is from a project undergoing incubation at Apache.</td>
+      <td>Documents required attribution notices for included dependencies. </td>
     </tr>
   </tbody>
 </table><h3><a id="Supported+Services"></a>Supported Services</h3><p>This table enumerates the versions of various Hadoop services that have been tested to work with the Knox Gateway.</p>
@@ -242,7 +238,7 @@ Server: Jetty(6.1.26)
       <td><img src="check.png"  alt="y"/> </td>
     </tr>
     <tr>
-      <td>Ozzie </td>
+      <td>Oozie </td>
       <td>4.0.0 </td>
       <td><img src="check.png"  alt="y"/> </td>
       <td><img src="check.png"  alt="y"/> </td>
@@ -1239,20 +1235,20 @@ ktadd -k /etc/security/keytabs/knox.serv
 exit
 </code></pre><h4><a id="Grant+Proxy+privileges+for+Knox+user+in+`core-site.xml`+on+Hadoop+master+nodes"></a>Grant Proxy privileges for Knox user in <code>core-site.xml</code> on Hadoop master nodes</h4><p>Update <code>core-site.xml</code> and add the following lines towards the end of the file.</p><p>Replace FQDN_OF_KNOX_HOST with the fully qualified domain name of the host running the gateway. You can usually find this by running <code>hostname -f</code> on that host.</p><p>You could use * for local developer testing if Knox host does not have static IP.</p>
 <pre><code>&lt;property&gt;
-    &lt;name&gt;webhcat.proxyuser.knox.groups&lt;/name&gt;
+    &lt;name&gt;hadoop.proxyuser.knox.groups&lt;/name&gt;
     &lt;value&gt;users&lt;/value&gt;
 &lt;/property&gt;
 &lt;property&gt;
-    &lt;name&gt;webhcat.proxyuser.knox.hosts&lt;/name&gt;
+    &lt;name&gt;hadoop.proxyuser.knox.hosts&lt;/name&gt;
     &lt;value&gt;FQDN_OF_KNOX_HOST&lt;/value&gt;
 &lt;/property&gt;
 </code></pre><h4><a id="Grant+proxy+privilege+for+Knox+in+`webhcat-site.xml`+on+Hadoop+master+nodes"></a>Grant proxy privilege for Knox in <code>webhcat-site.xml</code> on Hadoop master nodes</h4><p>Update <code>webhcat-site.xml</code> and add the following lines towards the end of the file.</p><p>Replace FQDN_OF_KNOX_HOST with right value in your cluster. You could use * for local developer testing if Knox host does not have static IP.</p>
 <pre><code>&lt;property&gt;
-    &lt;name&gt;hadoop.proxyuser.knox.groups&lt;/name&gt;
+    &lt;name&gt;webhcat.proxyuser.knox.groups&lt;/name&gt;
     &lt;value&gt;users&lt;/value&gt;
 &lt;/property&gt;
 &lt;property&gt;
-    &lt;name&gt;hadoop.proxyuser.knox.hosts&lt;/name&gt;
+    &lt;name&gt;webhcat.proxyuser.knox.hosts&lt;/name&gt;
     &lt;value&gt;FQDN_OF_KNOX_HOST&lt;/value&gt;
 &lt;/property&gt;
 </code></pre><h4><a id="Grant+proxy+privilege+for+Knox+in+`oozie-site.xml`+on+Oozie+host"></a>Grant proxy privilege for Knox in <code>oozie-site.xml</code> on Oozie host</h4><p>Update <code>oozie-site.xml</code> and add the following lines towards the end of the file.</p><p>Replace FQDN_OF_KNOX_HOST with right value in your cluster. You could use * for local developer testing if Knox host does not have static IP.</p>
@@ -1269,7 +1265,7 @@ exit
 </code></pre><p>Copy knox.service.keytab created on KDC host on to your Knox host /etc/knox/conf/knox.service.keytab</p>
 <pre><code>chown knox knox.service.keytab
 chmod 400 knox.service.keytab
-</code></pre><h4><a id="Update+krb5.conf+at+/etc/knox/conf/krb5.conf+on+Knox+host"></a>Update krb5.conf at /etc/knox/conf/krb5.conf on Knox host</h4><p>You could copy the <code>templates/krb5.conf</code> file provided in the Knox binary download and customize it to suit your cluster.</p><h4><a id="Update+`krb5JAASLogin.conf`+at+`/etc/knox/conf/krb5JAASLogin.conf`+on+Knox+host"></a>Update <code>krb5JAASLogin.conf</code> at <code>/etc/knox/conf/krb5JAASLogin.conf</code> on Knox host</h4><p>You could copy the <code>templates/krb5JAASLogin.conf</code> file provided in the Knox binary download and customize it to suit your cluster.</p><h4><a id="Update+`gateway-site.xml`+on+Knox+host+on+Knox+host"></a>Update <code>gateway-site.xml</code> on Knox host on Knox host</h4><p>Update <code>conf/gateway-site.xml</code> in your Knox installation and set the value of <code>gateway.hadoop.kerberos.secured</code> to true.</p><h4><a id="Restart+Knox"></a>Restart Knox</h4><p>After you do the above con
 figurations and restart Knox, Knox would use SPNego to authenticate with Hadoop services and Oozie. There is no change in the way you make calls to Knox whether you use Curl or Knox DSL.</p><h3><a id="High+Availability"></a>High Availability</h3><h4><a id="Configure+Knox+instances"></a>Configure Knox instances</h4><p>All Knox instances must be synced to use the same topologies credentials keystores. These files are located under {GATEWAY_HOME}/conf/security/keystores/{TOPOLOGY_NAME}-credentials.jceks. They are generated after the first topology deployment. Currently these files can be synced just manually. There is no automation tool. Here are the steps to sync topologies credentials keystores:</p>
+</code></pre><h4><a id="Update+krb5.conf+at+/etc/knox/conf/krb5.conf+on+Knox+host"></a>Update krb5.conf at /etc/knox/conf/krb5.conf on Knox host</h4><p>You could copy the <code>templates/krb5.conf</code> file provided in the Knox binary download and customize it to suit your cluster.</p><h4><a id="Update+`krb5JAASLogin.conf`+at+`/etc/knox/conf/krb5JAASLogin.conf`+on+Knox+host"></a>Update <code>krb5JAASLogin.conf</code> at <code>/etc/knox/conf/krb5JAASLogin.conf</code> on Knox host</h4><p>You could copy the <code>templates/krb5JAASLogin.conf</code> file provided in the Knox binary download and customize it to suit your cluster.</p><h4><a id="Update+`gateway-site.xml`+on+Knox+host"></a>Update <code>gateway-site.xml</code> on Knox host</h4><p>Update <code>conf/gateway-site.xml</code> in your Knox installation and set the value of <code>gateway.hadoop.kerberos.secured</code> to true.</p><h4><a id="Restart+Knox"></a>Restart Knox</h4><p>After you do the above configurations and restart Kn
 ox, Knox would use SPNego to authenticate with Hadoop services and Oozie. There is no change in the way you make calls to Knox whether you use Curl or Knox DSL.</p><h3><a id="High+Availability"></a>High Availability</h3><h4><a id="Configure+Knox+instances"></a>Configure Knox instances</h4><p>All Knox instances must be synced to use the same topologies credentials keystores. These files are located under {GATEWAY_HOME}/conf/security/keystores/{TOPOLOGY_NAME}-credentials.jceks. They are generated after the first topology deployment. Currently these files can be synced just manually. There is no automation tool. Here are the steps to sync topologies credentials keystores:</p>
 <ol>
   <li>Choose Knox instance that will be the source for topologies credentials keystores. Let&rsquo;s call it keystores master</li>
   <li>Replace topologies credentials keystores in the other Knox instance with topologies credentials keystores from keystores master</li>
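
A hedged sketch of that manual sync, assuming ssh access between the instances; the knox-standby host name and the knox user are placeholders, and {GATEWAY_HOME} follows the convention used in the commands above.

    # Copy the topology credential keystores from the keystores master
    # to another Knox instance (host name and user are placeholders).
    scp {GATEWAY_HOME}/conf/security/keystores/*-credentials.jceks \
        knox@knox-standby:{GATEWAY_HOME}/conf/security/keystores/
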
@@ -1321,7 +1317,7 @@ chmod 400 knox.service.keytab
 <pre><code>APACHE_HOME/bin/apachectl -k start
 APACHE_HOME/bin/apachectl -k stop
 </code></pre><h6><a id="Verify"></a>Verify</h6><p>Use Knox samples.</p><h3><a id="Web+App+Security+Provider"></a>Web App Security Provider</h3><p>Knox is a Web API (REST) Gateway for Hadoop. The fact that REST interactions are HTTP based means that they are vulnerable to a number of web application security vulnerabilities. This project introduces a web application security provider for plugging in various protection filters.</p><p>The initial vulnerability protection filter will be for Cross Site Request Forgery (CSRF). Others will be added in future releases.</p><p>Cross site request forgery (CSRF) attacks attempt to force an authenticated user to execute functionality without their knowledge. By presenting them with a link or image that when clicked invokes a request to another site with which the user may have already established an active session.</p><p>CSRF is entirely a browser based attack. Some background knowledge of how browsers work enables us to provide a filter that wi
 ll prevent CSRF attacks. HTTP requests from a web browser performed via form, image, iframe, etc are unable to set custom HTTP headers. The only way to create a HTTP request from a browser with a custom HTTP header is to use a technology such as Javascript XMLHttpRequest or Flash. These technologies can set custom HTTP headers, but have security policies built in to prevent web sites from sending requests to each other unless specifically allowed by policy. </p><p>This means that a website www.bad.com cannot send a request to <a href="http://bank.example.com">http://bank.example.com</a> with the custom header X-XSRF-Header unless they use a technology such as a XMLHttpRequest. That technology would prevent such a request from being made unless the bank.example.com domain specifically allowed it. This then results in a REST endpoint that can only be called via XMLHttpRequest (or similar technology).</p><p>NOTE: by enabling this protection within the gateway, this custom header will b
 e required for <em>all</em> clients that interact with it - not just browsers.</p><h4><a id="Configuration"></a>Configuration</h4><h5><a id="Overview"></a>Overview</h5><p>As with all providers in the Knox gateway, the web app security provider is configured through provider params. Unlike many other providers, the web app security provider may actually host multiple vulnerability filters. Currently, we only have an implementation for CSRF but others will follow and you may be interested in creating your own.</p><p>Because of this one-to-many provider/filter relationship, there is an extra configuration element for this provider per filter. As you can see in the sample below, the actual filter configuration is defined entirely within the params of the WebAppSec provider.</p>
-<pre><code>&lt;provider
+<pre><code>&lt;provider&gt;
   &lt;role&gt;webappsec&lt;/role&gt;
   &lt;name&gt;WebAppSec&lt;/name&gt;
   &lt;enabled&gt;true&lt;/enabled&gt;
@@ -1414,7 +1410,7 @@ APACHE_HOME/bin/apachectl -k stop
   &lt;param&gt;&lt;name&gt;preauth.validation.method&lt;/name&gt;&lt;value&gt;preauth.ip.validation&lt;/value&gt;&lt;/param&gt;
   &lt;param&gt;&lt;name&gt;preauth.ip.addresses&lt;/name&gt;&lt;value&gt;127.0.0.2,127.0.0.1&lt;/value&gt;&lt;/param&gt;
 &lt;/provider&gt;
-</code></pre><h5><a id="REST+Invocation+for+Tivoli+AM"></a>REST Invocation for Tivoli AM</h5><p>The following curl command can be used to request a directory listing from HDFS while passing in the expected headers of iv_user and iv_group. Note that the iv_group value in this command matches the expected ACL for webhdfs in the above topology file. Changing this from “admin” to “admin2” should result in a 401 unauthorized response.</p>
+</code></pre><h5><a id="REST+Invocation+for+Tivoli+AM"></a>REST Invocation for Tivoli AM</h5><p>The following curl command can be used to request a directory listing from HDFS while passing in the expected headers of iv_user and iv_group. Note that the iv_group value in this command matches the expected ACL for webhdfs in the above topology file. Changing this from &ldquo;admin&rdquo; to &ldquo;admin2&rdquo; should result in a 401 unauthorized response.</p>
 <pre><code>curl -k -i --header &quot;iv_user: guest&quot; --header &quot;iv_group: admin&quot; -v https://localhost:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS
 </code></pre><p>Omitting the &ndash;header &ldquo;iv_user: guest&rdquo; above will result in a rejected request.</p><h3><a id="Audit"></a>Audit</h3><p>The Audit facility within the Knox Gateway introduces functionality for tracking actions that are executed by Knox per user&rsquo;s request or that are produced by Knox internal events like topology deploy, etc. The Knox Audit module is based on the <a href="http://logging.apache.org/log4j/1.2/">Apache log4j</a>.</p><h4><a id="Configuration+needed"></a>Configuration needed</h4><p>Out of the box, the Knox Gateway includes preconfigured auditing capabilities. To change its configuration please read following sections.</p><h4><a id="Where+audit+logs+go"></a>Where audit logs go</h4><p>Audit module is preconfigured to write audit records to the log file <code>/var/log/knox/gateway-audit.log</code>.</p><p>This behavior can be changed in the <code>conf/gateway-log4j.properties</code> file. <code>log4j.appender.auditfile.*</code> properties d
 etermine this behavior. For detailed information read <a href="http://logging.apache.org/log4j/1.2/">Apache log4j</a>.</p><h4><a id="Audit+format"></a>Audit format</h4><p>Out of the box, the audit record format is defined by org.apache.hadoop.gateway.audit.log4j.layout.AuditLayout. Its structure is following:</p>
 <pre><code>EVENT_PUBLISHING_TIME ROOT_REQUEST_ID|PARENT_REQUEST_ID|REQUEST_ID|LOGGER_NAME|TARGET_SERVICE_NAME|USER_NAME|PROXY_USER_NAME|SYSTEM_USER_NAME|ACTION|RESOURCE_TYPE|RESOURCE_NAME|OUTCOME|LOGGING_MESSAGE
@@ -2819,24 +2815,16 @@ session.shutdown(10, SECONDS)
   <ol>
     <li>Hive JDBC in HTTP mode depends on following minimal libraries set to run successfully(must be in the classpath):
     <ul>
-      <li>hive-jdbc-0.13.0.jar;</li>
-      <li>hive-service-0.13.0.jar;</li>
-      <li>libthrift-0.9.0.jar;</li>
-      <li>httpcore-4.2.5.jar;</li>
-      <li>httpclient-4.2.5.jar;</li>
+      <li>hive-jdbc-0.14.0-standalone.jar;</li>
       <li>commons-logging-1.1.3.jar;</li>
-      <li>commons-codec-1.4.jar;</li>
-      <li>slf4j-api-1.7.5.jar;</li>
-      <li>slf4j-log4j12-1.7.5.jar;</li>
-      <li>log4j-1.2.17.jar;</li>
     </ul></li>
     <li>Connection URL has to be following: <code>jdbc:hive2://{gateway-host}:{gateway-port}/;ssl=true;sslTrustStore={gateway-trust-store-path};trustStorePassword={gateway-trust-store-password}?hive.server2.transport.mode=http;hive.server2.thrift.http.path={gateway-path}/{cluster-name}/hive</code></li>
     <li>Look at <a href="https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-DDLOperations">https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-DDLOperations</a> for examples.  Hint: For testing it would be better to execute <code>set hive.security.authorization.enabled=false</code> as the first statement.  Hint: Good examples of Hive DDL/DML can be found here <a href="http://gettingstarted.hadooponazure.com/hw/hive.html">http://gettingstarted.hadooponazure.com/hw/hive.html</a></li>
   </ol></li>
 </ol><h5><a id="Customization"></a>Customization</h5><p>This example may need to be tailored to the execution environment. In particular host name, host port, user name, user password and context path may need to be changed to match your environment. In particular there is one example file in the distribution that may need to be customized. Take a moment to review this file. All of the values that may need to be customized can be found together at the top of the file.</p>
 <ul>
-  <li>samples/HiveJDBCSample.java</li>
-</ul><h5><a id="Client+JDBC+Example"></a>Client JDBC Example</h5><p>Sample example for creating new table, loading data into it from local file system and querying data from that table.</p><h6><a id="Java"></a>Java</h6>
+  <li>samples/hive/java/jdbc/sandbox/HiveJDBCSample.java</li>
+</ul><h5><a id="Client+JDBC+Example"></a>Client JDBC Example</h5><p>Sample example for creating new table, loading data into it from the file system local to the Hive server and querying data from that table.</p><h6><a id="Java"></a>Java</h6>
 <pre><code>import java.sql.Connection;
 import java.sql.DriverManager;
 import java.sql.ResultSet;
@@ -2917,16 +2905,8 @@ public class HiveJDBCSample {
 }
 </code></pre><h6><a id="Groovy"></a>Groovy</h6><p>Make sure that GATEWAY_HOME/ext directory contains following libraries for successful execution:</p>
 <ul>
-  <li>hive-jdbc-0.13.0.jar;</li>
-  <li>hive-service-0.13.0.jar;</li>
-  <li>libthrift-0.9.0.jar;</li>
-  <li>httpcore-4.2.5.jar;</li>
-  <li>httpclient-4.2.5.jar;</li>
+  <li>hive-jdbc-0.14.0-standalone.jar;</li>
   <li>commons-logging-1.1.3.jar;</li>
-  <li>commons-codec-1.4.jar;</li>
-  <li>slf4j-api-1.7.5.jar;</li>
-  <li>slf4j-log4j12-1.7.5.jar;</li>
-  <li>log4j-1.2.17.jar;</li>
 </ul><p>There are several ways to execute this sample depending upon your preference.</p><p>You can use the Groovy interpreter provided with the distribution.</p>
 <pre><code>java -jar bin/shell.jar samples/hive/groovy/jdbc/sandbox/HiveJDBCSample.groovy
 </code></pre><p>You can manually type in the KnoxShell DSL script into the interactive Groovy interpreter provided with the distribution.</p>
@@ -3091,7 +3071,7 @@ The contents of the state-killed.json fi
 
 
 curl -ikv -u guest:guest-password -H Content-Type:application/json -X PUT -T state-killed.json &#39;https://localhost:8443/gateway/sandbox/resourcemanager/v1/cluster/apps/{application_id}/state&#39;
-</code></pre><h2><a id="Limitations"></a>Limitations</h2><h3><a id="Secure+Oozie+POST/PUT+Request+Payload+Size+Restriction"></a>Secure Oozie POST/PUT Request Payload Size Restriction</h3><p>With one exception there are no know size limits for requests or responses payloads that pass through the gateway. The exception involves POST or PUT request payload sizes for Oozie in a Kerberos secured Hadoop cluster. In this one case there is currently a 4Kb payload size limit for the first request made to the Hadoop cluster. This is a result of how the gateway negotiates a trust relationship between itself and the cluster via SPNego. There is an undocumented configuration setting to modify this limit&rsquo;s value if required. In the future this will be made more easily configuration and at that time it will be documented.</p><h3><a id="LDAP+Groups+Acquisition+from+AD"></a>LDAP Groups Acquisition from AD</h3><p>The LDAP authenticator currently does not &ldquo;out of the box&rdquo; support the
  acquisition of group information from Microsoft Active Directory. Building this into the default implementation is on the roadmap.</p><h3><a id="Group+Membership+Propagation"></a>Group Membership Propagation</h3><p>Groups that are acquired via Shiro Group Lookup and/or Identity Assertion Group Principal Mapping are not propagated to the Hadoop services. Therefore groups used for Service Level Authorization policy may not match those acquired within the cluster via GroupMappingServiceProvider plugins.</p><h2><a id="Troubleshooting"></a>Troubleshooting</h2><h3><a id="Finding+Logs"></a>Finding Logs</h3><p>When things aren&rsquo;t working the first thing you need to do is examine the diagnostic logs. Depending upon how you are running the gateway these diagnostic logs will be output to different locations.</p><h4><a id="java+-jar+bin/gateway.jar"></a>java -jar bin/gateway.jar</h4><p>When the gateway is run this way the diagnostic output is written directly to the console. If you want t
 o capture that output you will need to redirect the console output to a file using OS specific techniques.</p>
+</code></pre><h2><a id="Limitations"></a>Limitations</h2><h3><a id="Secure+Oozie+POST/PUT+Request+Payload+Size+Restriction"></a>Secure Oozie POST/PUT Request Payload Size Restriction</h3><p>With one exception there are no known size limits for requests or responses payloads that pass through the gateway. The exception involves POST or PUT request payload sizes for Oozie in a Kerberos secured Hadoop cluster. In this one case there is currently a 4Kb payload size limit for the first request made to the Hadoop cluster. This is a result of how the gateway negotiates a trust relationship between itself and the cluster via SPNego. There is an undocumented configuration setting to modify this limit&rsquo;s value if required. In the future this will be made more easily configuration and at that time it will be documented.</p><h3><a id="Group+Membership+Propagation"></a>Group Membership Propagation</h3><p>Groups that are acquired via Shiro Group Lookup and/or Identity Assertion Group Princip
 al Mapping are not propagated to the Hadoop services. Therefore, groups used for Service Level Authorization policy may not match those acquired within the cluster via GroupMappingServiceProvider plugins.</p><h2><a id="Troubleshooting"></a>Troubleshooting</h2><h3><a id="Finding+Logs"></a>Finding Logs</h3><p>When things aren&rsquo;t working the first thing you need to do is examine the diagnostic logs. Depending upon how you are running the gateway these diagnostic logs will be output to different locations.</p><h4><a id="java+-jar+bin/gateway.jar"></a>java -jar bin/gateway.jar</h4><p>When the gateway is run this way the diagnostic output is written directly to the console. If you want to capture that output you will need to redirect the console output to a file using OS specific techniques.</p>
 <pre><code>java -jar bin/gateway.jar &gt; gateway.log
 </code></pre><h4><a id="bin/gateway.sh+start"></a>bin/gateway.sh start</h4><p>When the gateway is run this way the diagnostic output is written to /var/log/knox/knox.out and /var/log/knox/knox.err. Typically only knox.out will have content.</p><h3><a id="Increasing+Logging"></a>Increasing Logging</h3><p>The <code>log4j.properties</code> files <code>{GATEWAY_HOME}/conf</code> can be used to change the granularity of the logging done by Knox. The Knox server must be restarted in order for these changes to take effect. There are various useful loggers pre-populated but commented out.</p>
 <pre><code>log4j.logger.org.apache.hadoop.gateway=DEBUG # Use this logger to increase the debugging of Apache Knox itself.

Modified: knox/site/index.html
URL: http://svn.apache.org/viewvc/knox/site/index.html?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/site/index.html (original)
+++ knox/site/index.html Thu Oct 30 19:15:50 2014
@@ -1,5 +1,5 @@
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
-<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-09-29 -->
+<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-10-30 -->
 <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
   <head>
     <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
@@ -10,7 +10,7 @@
       @import url("./css/site.css");
     </style>
     <link rel="stylesheet" href="./css/print.css" type="text/css" media="print" />
-    <meta name="Date-Revision-yyyymmdd" content="20140929" />
+    <meta name="Date-Revision-yyyymmdd" content="20141030" />
     <meta http-equiv="Content-Language" content="en" />
                                                     
 <script type="text/javascript">var _gaq = _gaq || [];
@@ -57,7 +57,7 @@
                         <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
               
                     
-                &nbsp;| <span id="publishDate">Last Published: 2014-09-29</span>
+                &nbsp;| <span id="publishDate">Last Published: 2014-10-30</span>
               &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
             </div>
       <div class="clear">

Modified: knox/site/issue-tracking.html
URL: http://svn.apache.org/viewvc/knox/site/issue-tracking.html?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/site/issue-tracking.html (original)
+++ knox/site/issue-tracking.html Thu Oct 30 19:15:50 2014
@@ -1,5 +1,5 @@
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
-<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-09-29 -->
+<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-10-30 -->
 <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
   <head>
     <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
@@ -10,7 +10,7 @@
       @import url("./css/site.css");
     </style>
     <link rel="stylesheet" href="./css/print.css" type="text/css" media="print" />
-    <meta name="Date-Revision-yyyymmdd" content="20140929" />
+    <meta name="Date-Revision-yyyymmdd" content="20141030" />
     <meta http-equiv="Content-Language" content="en" />
                                                     
 <script type="text/javascript">var _gaq = _gaq || [];
@@ -57,7 +57,7 @@
                         <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
               
                     
-                &nbsp;| <span id="publishDate">Last Published: 2014-09-29</span>
+                &nbsp;| <span id="publishDate">Last Published: 2014-10-30</span>
               &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
             </div>
       <div class="clear">

Modified: knox/site/license.html
URL: http://svn.apache.org/viewvc/knox/site/license.html?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/site/license.html (original)
+++ knox/site/license.html Thu Oct 30 19:15:50 2014
@@ -1,5 +1,5 @@
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
-<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-09-29 -->
+<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-10-30 -->
 <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
   <head>
     <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
@@ -10,7 +10,7 @@
       @import url("./css/site.css");
     </style>
     <link rel="stylesheet" href="./css/print.css" type="text/css" media="print" />
-    <meta name="Date-Revision-yyyymmdd" content="20140929" />
+    <meta name="Date-Revision-yyyymmdd" content="20141030" />
     <meta http-equiv="Content-Language" content="en" />
                                                     
 <script type="text/javascript">var _gaq = _gaq || [];
@@ -57,7 +57,7 @@
                         <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
               
                     
-                &nbsp;| <span id="publishDate">Last Published: 2014-09-29</span>
+                &nbsp;| <span id="publishDate">Last Published: 2014-10-30</span>
               &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
             </div>
       <div class="clear">

Modified: knox/site/mail-lists.html
URL: http://svn.apache.org/viewvc/knox/site/mail-lists.html?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/site/mail-lists.html (original)
+++ knox/site/mail-lists.html Thu Oct 30 19:15:50 2014
@@ -1,5 +1,5 @@
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
-<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-09-29 -->
+<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-10-30 -->
 <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
   <head>
     <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
@@ -10,7 +10,7 @@
       @import url("./css/site.css");
     </style>
     <link rel="stylesheet" href="./css/print.css" type="text/css" media="print" />
-    <meta name="Date-Revision-yyyymmdd" content="20140929" />
+    <meta name="Date-Revision-yyyymmdd" content="20141030" />
     <meta http-equiv="Content-Language" content="en" />
                                                     
 <script type="text/javascript">var _gaq = _gaq || [];
@@ -57,7 +57,7 @@
                         <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
               
                     
-                &nbsp;| <span id="publishDate">Last Published: 2014-09-29</span>
+                &nbsp;| <span id="publishDate">Last Published: 2014-10-30</span>
               &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
             </div>
       <div class="clear">

Modified: knox/site/project-info.html
URL: http://svn.apache.org/viewvc/knox/site/project-info.html?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/site/project-info.html (original)
+++ knox/site/project-info.html Thu Oct 30 19:15:50 2014
@@ -1,5 +1,5 @@
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
-<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-09-29 -->
+<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-10-30 -->
 <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
   <head>
     <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
@@ -10,7 +10,7 @@
       @import url("./css/site.css");
     </style>
     <link rel="stylesheet" href="./css/print.css" type="text/css" media="print" />
-    <meta name="Date-Revision-yyyymmdd" content="20140929" />
+    <meta name="Date-Revision-yyyymmdd" content="20141030" />
     <meta http-equiv="Content-Language" content="en" />
                                                     
 <script type="text/javascript">var _gaq = _gaq || [];
@@ -57,7 +57,7 @@
                         <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
               
                     
-                &nbsp;| <span id="publishDate">Last Published: 2014-09-29</span>
+                &nbsp;| <span id="publishDate">Last Published: 2014-10-30</span>
               &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
             </div>
       <div class="clear">

Modified: knox/site/team-list.html
URL: http://svn.apache.org/viewvc/knox/site/team-list.html?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/site/team-list.html (original)
+++ knox/site/team-list.html Thu Oct 30 19:15:50 2014
@@ -1,5 +1,5 @@
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
-<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-09-29 -->
+<!-- Generated by Apache Maven Doxia Site Renderer 1.6 at 2014-10-30 -->
 <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
   <head>
     <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
@@ -10,7 +10,7 @@
       @import url("./css/site.css");
     </style>
     <link rel="stylesheet" href="./css/print.css" type="text/css" media="print" />
-    <meta name="Date-Revision-yyyymmdd" content="20140929" />
+    <meta name="Date-Revision-yyyymmdd" content="20141030" />
     <meta http-equiv="Content-Language" content="en" />
                                                     
 <script type="text/javascript">var _gaq = _gaq || [];
@@ -57,7 +57,7 @@
                         <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
               
                     
-                &nbsp;| <span id="publishDate">Last Published: 2014-09-29</span>
+                &nbsp;| <span id="publishDate">Last Published: 2014-10-30</span>
               &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
             </div>
       <div class="clear">

Modified: knox/trunk/books/0.5.0/book_getting-started.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.5.0/book_getting-started.md?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/trunk/books/0.5.0/book_getting-started.md (original)
+++ knox/trunk/books/0.5.0/book_getting-started.md Thu Oct 30 19:15:50 2014
@@ -44,27 +44,26 @@ Knox can be installed by expanding the z
 
 The table below provides a brief explanation of the important files and directories within `{GATEWWAY_HOME}`
 
-| Directory     | Purpose |
-| ------------- | ------- |
-| conf/         | Contains configuration files that apply to the gateway globally (i.e. not cluster specific ).       |
-| data/         | Contains security and topology specific artifacts that require read/write access at runtime |
-|data/topologies/|Contains topology files that represent Hadoop clusters which the gateway uses to deploy cluster proxies|
-|data/security/ | Contains the persisted master secret and keystore dir|
-|data/security/keystores/| Contains the gateway identity keystore and credential stores for the gateway and each deployed cluster topology|
-| bin/          | Contains the executable shell scripts, batch files and JARs for clients and servers.|
-| data/deployments/ | Contains deployed cluster topologies used to protect access to specific Hadoop clusters.|
-| lib/          | Contains the JARs for all the components that make up the gateway.|
-| dep/          | Contains the JARs for all of the components upon which the gateway depends.|
-| ext/          | A directory where user supplied extension JARs can be placed to extends the gateways functionality.|
-| pids/         | Contains the process ids for running ldap and gateway servers|
-| samples/      | Contains a number of samples that can be used to explore the functionality of the gateway.|
-| templates/    | Contains default configuration files that can be copied and customized.|
-| README        | Provides basic information about the Apache Knox Gateway.|
-| ISSUES        | Describes significant know issues.|
-| CHANGES       | Enumerates the changes between releases.|
-| LICENSE       | Documents the license under which this software is provided.|
-| NOTICE        | Documents required attribution notices for included dependencies.|
-| DISCLAIMER    | Documents that this release is from a project undergoing incubation at Apache.|
+| Directory                | Purpose |
+| ------------------------ | ------- |
+| conf/                    | Contains configuration files that apply to the gateway globally (i.e. not cluster specific ). |
+| data/                    | Contains security and topology specific artifacts that require read/write access at runtime |
+| conf/topologies/         | Contains topology files that represent Hadoop clusters which the gateway uses to deploy cluster proxies |
+| data/security/           | Contains the persisted master secret and keystore dir |
+| data/security/keystores/ | Contains the gateway identity keystore and credential stores for the gateway and each deployed cluster topology |
+| bin/                     | Contains the executable shell scripts, batch files and JARs for clients and servers. |
+| data/deployments/        | Contains deployed cluster topologies used to protect access to specific Hadoop clusters. |
+| lib/                     | Contains the JARs for all the components that make up the gateway. |
+| dep/                     | Contains the JARs for all of the components upon which the gateway depends. |
+| ext/                     | A directory where user supplied extension JARs can be placed to extend the gateway's functionality. |
+| pids/                    | Contains the process ids for running ldap and gateway servers |
+| samples/                 | Contains a number of samples that can be used to explore the functionality of the gateway. |
+| templates/               | Contains default configuration files that can be copied and customized. |
+| README                   | Provides basic information about the Apache Knox Gateway. |
+| ISSUES                   | Describes significant known issues. |
+| CHANGES                  | Enumerates the changes between releases. |
+| LICENSE                  | Documents the license under which this software is provided. |
+| NOTICE                   | Documents required attribution notices for included dependencies. |
 
 
 ### Supported Services ###
@@ -76,7 +75,7 @@ This table enumerates the versions of va
 | WebHDFS            | 2.4.0      | ![y]        | ![y]   |
 | WebHCat/Templeton  | 0.13.0     | ![y]        | ![y]   |
 |                    | 0.12.0     | ![y]        | ![y]   |
-| Ozzie              | 4.0.0      | ![y]        | ![y]   |
+| Oozie              | 4.0.0      | ![y]        | ![y]   |
 | HBase/Stargate     | 0.98.0     | ![y]        | ![y]   |
 | Hive (via WebHCat) | 0.13.0     | ![y]        | ![y]   |
 | Hive (via JDBC)    | 0.13.0     | ![y]        | ![y]   |

Modified: knox/trunk/books/0.5.0/config_kerberos.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.5.0/config_kerberos.md?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/trunk/books/0.5.0/config_kerberos.md (original)
+++ knox/trunk/books/0.5.0/config_kerberos.md Thu Oct 30 19:15:50 2014
@@ -50,11 +50,11 @@ You can usually find this by running `ho
 You could use * for local developer testing if Knox host does not have static IP.
 
     <property>
-        <name>webhcat.proxyuser.knox.groups</name>
+        <name>hadoop.proxyuser.knox.groups</name>
         <value>users</value>
     </property>
     <property>
-        <name>webhcat.proxyuser.knox.hosts</name>
+        <name>hadoop.proxyuser.knox.hosts</name>
         <value>FQDN_OF_KNOX_HOST</value>
     </property>
 
@@ -66,11 +66,11 @@ Replace FQDN_OF_KNOX_HOST with right val
 You could use * for local developer testing if Knox host does not have static IP.
 
     <property>
-        <name>hadoop.proxyuser.knox.groups</name>
+        <name>webhcat.proxyuser.knox.groups</name>
         <value>users</value>
     </property>
     <property>
-        <name>hadoop.proxyuser.knox.hosts</name>
+        <name>webhcat.proxyuser.knox.hosts</name>
         <value>FQDN_OF_KNOX_HOST</value>
     </property>
 
@@ -139,7 +139,7 @@ You could copy the `templates/krb5.conf`
 
 You could copy the `templates/krb5JAASLogin.conf` file provided in the Knox binary download and customize it to suit your cluster.
 
-#### Update `gateway-site.xml` on Knox host on Knox host ####
+#### Update `gateway-site.xml` on Knox host ####
 
 Update `conf/gateway-site.xml` in your Knox installation and set the value of `gateway.hadoop.kerberos.secured` to true.
 

Modified: knox/trunk/books/0.5.0/config_preauth_sso_provider.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.5.0/config_preauth_sso_provider.md?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/trunk/books/0.5.0/config_preauth_sso_provider.md (original)
+++ knox/trunk/books/0.5.0/config_preauth_sso_provider.md Thu Oct 30 19:15:50 2014
@@ -78,7 +78,7 @@ As an example for configuring the preaut
     </provider>
 
 ##### REST Invocation for Tivoli AM
-The following curl command can be used to request a directory listing from HDFS while passing in the expected headers of iv_user and iv_group. Note that the iv_group value in this command matches the expected ACL for webhdfs in the above topology file. Changing this from “admin” to “admin2” should result in a 401 unauthorized response.
+The following curl command can be used to request a directory listing from HDFS while passing in the expected headers of iv_user and iv_group. Note that the iv_group value in this command matches the expected ACL for webhdfs in the above topology file. Changing this from "admin" to "admin2" should result in a 401 unauthorized response.
 
 	curl -k -i --header "iv_user: guest" --header "iv_group: admin" -v https://localhost:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS
 

Modified: knox/trunk/books/0.5.0/config_webappsec_provider.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.5.0/config_webappsec_provider.md?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/trunk/books/0.5.0/config_webappsec_provider.md (original)
+++ knox/trunk/books/0.5.0/config_webappsec_provider.md Thu Oct 30 19:15:50 2014
@@ -37,7 +37,7 @@ As with all providers in the Knox gatewa
 
 Because of this one-to-many provider/filter relationship, there is an extra configuration element for this provider per filter. As you can see in the sample below, the actual filter configuration is defined entirely within the params of the WebAppSec provider.
 
-	<provider
+	<provider>
 	  <role>webappsec</role>
 	  <name>WebAppSec</name>
 	  <enabled>true</enabled>
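
As a hedged illustration of the custom header requirement described earlier for the CSRF filter, a request through the gateway might look like the following once protection is enabled; the sandbox topology, guest credentials, and header value are assumptions borrowed from the other examples in this guide.

    # Requests that lack the configured custom header should be rejected
    # once CSRF protection is enabled (topology and credentials are assumptions).
    curl -k -i -u guest:guest-password --header "X-XSRF-Header: valid" \
        https://localhost:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS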

Modified: knox/trunk/books/0.5.0/quick_start.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.5.0/quick_start.md?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/trunk/books/0.5.0/quick_start.md (original)
+++ knox/trunk/books/0.5.0/quick_start.md Thu Oct 30 19:15:50 2014
@@ -144,7 +144,7 @@ The server will discover the persisted m
     bin/gateway.sh start
 
 When starting the gateway this way the process will be run in the background.
-The log output is written into the directory /var/log/knox.
+In the Sandbox environment mentioned earlier the log output is written into the directory /var/log/knox.
 In addition a PID (process ID) is written into /var/run/knox.
 
 In order to stop a gateway that was started with the script use this command.

Modified: knox/trunk/books/0.5.0/service_hive.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.5.0/service_hive.md?rev=1635592&r1=1635591&r2=1635592&view=diff
==============================================================================
--- knox/trunk/books/0.5.0/service_hive.md (original)
+++ knox/trunk/books/0.5.0/service_hive.md Thu Oct 30 19:15:50 2014
@@ -72,16 +72,8 @@ This guide provides detailed examples fo
 3. Make sure Hive Server is running in HTTP mode.
 4. Client side (JDBC):
      1. Hive JDBC in HTTP mode depends on following minimal libraries set to run successfully(must be in the classpath):
-         * hive-jdbc-0.13.0.jar;
-         * hive-service-0.13.0.jar;
-         * libthrift-0.9.0.jar;
-         * httpcore-4.2.5.jar;
-         * httpclient-4.2.5.jar;
+         * hive-jdbc-0.14.0-standalone.jar;
          * commons-logging-1.1.3.jar;
-         * commons-codec-1.4.jar;
-         * slf4j-api-1.7.5.jar;
-         * slf4j-log4j12-1.7.5.jar;
-         * log4j-1.2.17.jar;
      2. Connection URL has to be following:
         `jdbc:hive2://{gateway-host}:{gateway-port}/;ssl=true;sslTrustStore={gateway-trust-store-path};trustStorePassword={gateway-trust-store-password}?hive.server2.transport.mode=http;hive.server2.thrift.http.path={gateway-path}/{cluster-name}/hive`
      3. Look at https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-DDLOperations for examples.
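
As a rough smoke test of the connection URL format in step 2, one hypothetical way to exercise it is a Beeline invocation along the following lines; Beeline itself is not covered by this guide, and the guest credentials are an assumption borrowed from the samples.

    # The URL placeholders follow the format shown in step 2 above.
    beeline -n guest -p guest-password \
        -u "jdbc:hive2://{gateway-host}:{gateway-port}/;ssl=true;sslTrustStore={gateway-trust-store-path};trustStorePassword={gateway-trust-store-password}?hive.server2.transport.mode=http;hive.server2.thrift.http.path={gateway-path}/{cluster-name}/hive"
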
@@ -96,11 +88,11 @@ In particular there is one example file 
 Take a moment to review this file.
 All of the values that may need to be customized can be found together at the top of the file.
 
-* samples/HiveJDBCSample.java
+* samples/hive/java/jdbc/sandbox/HiveJDBCSample.java
 
 ##### Client JDBC Example #####
 
-Sample example for creating new table, loading data into it from local file system and querying data from that table.
+Sample example for creating new table, loading data into it from the file system local to the Hive server and querying data from that table.
 
 ###### Java ######
 
@@ -187,16 +179,8 @@ Sample example for creating new table, l
 
 Make sure that GATEWAY_HOME/ext directory contains following libraries for successful execution:
 
-- hive-jdbc-0.13.0.jar;
-- hive-service-0.13.0.jar;
-- libthrift-0.9.0.jar;
-- httpcore-4.2.5.jar;
-- httpclient-4.2.5.jar;
+- hive-jdbc-0.14.0-standalone.jar;
 - commons-logging-1.1.3.jar;
-- commons-codec-1.4.jar;
-- slf4j-api-1.7.5.jar;
-- slf4j-log4j12-1.7.5.jar;
-- log4j-1.2.17.jar;
 
 There are several ways to execute this sample depending upon your preference.