Posted to commits@knox.apache.org by km...@apache.org on 2013/09/25 23:40:30 UTC

svn commit: r1526308 [2/2] - in /incubator/knox: site/books/knox-incubating-0-3-0/ trunk/books/0.3.0/ trunk/books/static/

Modified: incubator/knox/trunk/books/0.3.0/hbase.md
URL: http://svn.apache.org/viewvc/incubator/knox/trunk/books/0.3.0/hbase.md?rev=1526308&r1=1526307&r2=1526308&view=diff
==============================================================================
--- incubator/knox/trunk/books/0.3.0/hbase.md (original)
+++ incubator/knox/trunk/books/0.3.0/hbase.md Wed Sep 25 21:40:30 2013
@@ -26,3 +26,578 @@ TODO
 #### {{HBase Examples}} ####
 
 TODO
+
+The examples below illustrate a set of basic operations against an HBase instance using the Stargate REST API.
+Use the following link for more details about the HBase/Stargate API: http://wiki.apache.org/hadoop/Hbase/Stargate.
+
+### Assumptions ###
+
+This document assumes a few things about your environment in order to simplify the examples.
+
+1. The JVM is executable as simply java.
+2. The Apache Knox Gateway is installed and functional.
+3. The example commands are executed within the context of the GATEWAY_HOME current directory.  The GATEWAY_HOME directory is the directory within the Apache Knox Gateway installation that contains the README file and the bin, conf and deployments directories.
+4. A few examples require the use of commands from a standard Groovy installation.  These examples are optional, but to try them you will need Groovy [installed](http://groovy.codehaus.org/Installing+Groovy).
+
+### HBase Stargate Setup ###
+
+#### Launch Stargate ####
+The command below launches the Stargate daemon on port 60080.
+
+    sudo /usr/lib/hbase/bin/hbase-daemon.sh start rest -p 60080
+
+Port 60080 is used because it was specified in the sample Hadoop cluster deployment `{GATEWAY_HOME}/deployments/sample.xml`.
+
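+To verify that the daemon is up, you can request the Stargate version resource directly (host and port are assumptions matching the launch command above):
+
+    curl http://localhost:60080/version
+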
+#### Configure Sandbox port mapping for VirtualBox ####
+
+1. Select the VM
+2. Select the menu Machine>Settings...
+3. Select the Network tab
+4. Select Adapter 1
+5. Press the Port Forwarding button
+6. Press the Plus button to insert a new rule: Name=Stargate, Host Port=60080, Guest Port=60080
+7. Press OK to close the rule window
+8. Press OK on the Network window to save the changes
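+
+Alternatively, the same forwarding rule can be added from the command line with VBoxManage (a sketch; the VM name "Sandbox" is an assumption, and the VM must be powered off):
+
+    VBoxManage modifyvm "Sandbox" --natpf1 "Stargate,tcp,,60080,,60080"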
+
+### HBase/Stargate via KnoxShell DSL ###
+
+#### Usage ####
+For more details about client DSL usage please see this [page](https://cwiki.apache.org/confluence/display/KNOX/Client+Usage).
+ 
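+All of the DSL examples below assume an authenticated session object.  A minimal sketch for creating one is shown here; the gateway URL, username and password are assumptions that match the sample script at the end of this section.
+
+    import org.apache.hadoop.gateway.shell.Hadoop
+    import org.apache.hadoop.gateway.shell.hbase.HBase
+
+    session = Hadoop.login( "https://localhost:8443/gateway/sandbox", "guest", "guest-password" )
+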
+##### systemVersion() - Query Software Version.
+
+* Request
+    * No request parameters.
+* Response
+    * BasicResponse
+* Example
+    * `HBase.session(session).systemVersion().now().string`
+
+##### clusterVersion() - Query Storage Cluster Version.
+
+* Request
+    * No request parameters.
+* Response
+    * BasicResponse
+* Example
+    * `HBase.session(session).clusterVersion().now().string`
+
+##### status() - Query Storage Cluster Status.
+
+* Request
+    * No request parameters.
+* Response
+    * BasicResponse
+* Example
+    * `HBase.session(session).status().now().string`
+
+##### table().list() - Query Table List.
+
+* Request
+    * No request parameters.
+* Response
+    * BasicResponse
+* Example
+    * `HBase.session(session).table().list().now().string`
+
+##### table(String tableName).schema() - Query Table Schema.
+
+* Request
+    * No request parameters.
+* Response
+    * BasicResponse
+* Example
+    * `HBase.session(session).table(tableName).schema().now().string`
+
+##### table(String tableName).create() - Create Table Schema.
+* Request
+    * attribute(String name, Object value) - the table's attribute.
+    * family(String name) - starts family definition. Has sub requests:
+        * attribute(String name, Object value) - the family's attribute.
+        * endFamilyDef() - finishes family definition.
+* Response
+    * EmptyResponse
+* Example
+
+        HBase.session(session).table(tableName).create()
+            .attribute("tb_attr1", "value1")
+            .attribute("tb_attr2", "value2")
+            .family("family1")
+                .attribute("fm_attr1", "value3")
+                .attribute("fm_attr2", "value4")
+            .endFamilyDef()
+            .family("family2")
+            .family("family3")
+            .endFamilyDef()
+            .attribute("tb_attr3", "value5")
+            .now()
+
+##### table(String tableName).update() - Update Table Schema.
+* Request
+    * family(String name) - starts family definition. Has sub requests:
+        * attribute(String name, Object value) - the family's attribute.
+        * endFamilyDef() - finishes family definition.
+* Response
+    * EmptyResponse
+* Example
+
+        HBase.session(session).table(tableName).update()
+            .family("family1")
+                .attribute("fm_attr1", "new_value3")
+            .endFamilyDef()
+            .family("family4")
+                .attribute("fm_attr3", "value6")
+            .endFamilyDef()
+            .now()
+
+##### table(String tableName).regions() - Query Table Metadata.
+* Request
+    * No request parameters.
+* Response
+    * BasicResponse
+* Example
+    * `HBase.session(session).table(tableName).regions().now().string`
+
+##### table(String tableName).delete() - Delete Table.
+* Request
+    * No request parameters.
+* Response
+    * EmptyResponse
+* Example
+    * `HBase.session(session).table(tableName).delete().now()`
+
+##### table(String tableName).row(String rowId).store() - Cell Store.
+* Request
+    * column(String family, String qualifier, Object value, Long time) - the data to store; "qualifier" may be "null"; "time" is optional.
+* Response
+    * EmptyResponse
+* Example
+
+        HBase.session(session).table(tableName).row("row_id_1").store()
+            .column("family1", "col1", "col_value1")
+            .column("family1", "col2", "col_value2", 1234567890L)
+            .column("family2", null, "fam_value1")
+            .now()
+
+        HBase.session(session).table(tableName).row("row_id_2").store()
+            .column("family1", "row2_col1", "row2_col_value1")
+            .now()
+
+##### table(String tableName).row(String rowId).query() - Cell or Row Query.
+* rowId is optional. Querying with null or empty rowId will select all rows.
+* Request
+    * column(String family, String qualifier) - the column to select; "qualifier" is optional.
+    * startTime(Long) - the lower bound for filtration by time.
+    * endTime(Long) - the upper bound for filtration by time.
+    * times(Long startTime, Long endTime) - the lower and upper bounds for filtration by time.
+    * numVersions(Long) - the maximum number of versions to return.
+* Response
+    * BasicResponse
+* Example
+
+        HBase.session(session).table(tableName).row("row_id_1")
+            .query()
+            .now().string
+
+        HBase.session(session).table(tableName).row().query().now().string
+
+        HBase.session(session).table(tableName).row().query()
+            .column("family1", "row2_col1")
+            .column("family2")
+            .times(0, Long.MAX_VALUE)
+            .numVersions(1)
+            .now().string
+
+##### table(String tableName).row(String rowId).delete() - Row, Column, or Cell Delete.
+* Request
+    * column(String family, String qualifier) - the column to delete; "qualifier" is optional.
+    * time(Long) - the upper bound for time filtration.
+* Response
+    * EmptyResponse
+* Example
+
+        HBase.session(session).table(tableName).row("row_id_1")
+            .delete()
+            .column("family1", "col1")
+            .now()
+
+        HBase.session(session).table(tableName).row("row_id_1")
+            .delete()
+            .column("family2")
+            .time(Long.MAX_VALUE)
+            .now()
+
+##### table(String tableName).scanner().create() - Scanner Creation.
+* Request
+    * startRow(String) - the lower bound for filtration by row id.
+    * endRow(String) - the upper bound for filtration by row id.
+    * rows(String startRow, String endRow) - the lower and upper bounds for filtration by row id.
+    * column(String family, String qualifier) - the column to select; "qualifier" is optional.
+    * batch(Integer) - the batch size.
+    * startTime(Long) - the lower bound for filtration by time.
+    * endTime(Long) - the upper bound for filtration by time.
+    * times(Long startTime, Long endTime) - the lower and upper bounds for filtration by time.
+    * filter(String) - the filter XML definition.
+    * maxVersions(Integer) - the maximum number of versions to return.
+* Response
+    * scannerId : String - the scanner ID of the created scanner. Consumes body.
+* Example
+
+        HBase.session(session).table(tableName).scanner().create()
+            .column("family1", "col2")
+            .column("family2")
+            .startRow("row_id_1")
+            .endRow("row_id_2")
+            .batch(1)
+            .startTime(0)
+            .endTime(Long.MAX_VALUE)
+            .filter("")
+            .maxVersions(100)
+            .now()
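+
+Since the response consumes the body, the scanner ID is read from the returned response object; a short sketch, reusing the names from the sample script below:
+
+    scannerId = HBase.session(session).table(tableName).scanner().create()  \
+        .startRow("row_id_1")  \
+        .endRow("row_id_2")  \
+        .now().scannerId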
+
+##### table(String tableName).scanner(String scannerId).getNext() - Scanner Get Next.
+* Request
+    * No request parameters.
+* Response
+    * BasicResponse
+* Example
+    * `HBase.session(session).table(tableName).scanner(scannerId).getNext().now().string`
+
+##### table(String tableName).scanner(String scannerId).delete() - Scanner Deletion.
+* Request
+    * No request parameters.
+* Response
+    * EmptyResponse
+* Example
+    * `HBase.session(session).table(tableName).scanner(scannerId).delete().now()`
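+
+When all operations are complete the session should be closed, as in the sample script below:
+
+    session.shutdown(10, SECONDS)
+
+Here SECONDS is java.util.concurrent.TimeUnit.SECONDS, imported statically in the sample script.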
+
+#### Examples ####
+
+This example illustrates the sequence of all basic HBase operations:
+
+1. get system version
+2. get cluster version
+3. get cluster status
+4. create the table
+5. get list of tables
+6. get table schema
+7. update table schema
+8. insert single row into table
+9. query row by id
+10. query all rows
+11. delete cell from row
+12. delete entire column family from row
+13. get table regions
+14. create scanner
+15. fetch values using scanner
+16. drop scanner
+17. drop the table
+
+There are several ways to do this depending upon your preference.
+
+You can use the Groovy interpreter provided with the distribution.
+
+    java -jar bin/shell.jar samples/ExampleHBaseUseCase.groovy
+
+You can manually type in the KnoxShell DSL script into the interactive Groovy interpreter provided with the distribution.
+
+    java -jar bin/shell.jar
+
+Each line from the file below will need to be typed or copied into the interactive shell.
+
+File: samples/ExampleHBaseUseCase.groovy
+
+    /**
+     * Licensed to the Apache Software Foundation (ASF) under one
+     * or more contributor license agreements.  See the NOTICE file
+     * distributed with this work for additional information
+     * regarding copyright ownership.  The ASF licenses this file
+     * to you under the Apache License, Version 2.0 (the
+     * "License"); you may not use this file except in compliance
+     * with the License.  You may obtain a copy of the License at
+     *
+     *     http://www.apache.org/licenses/LICENSE-2.0
+     *
+     * Unless required by applicable law or agreed to in writing, software
+     * distributed under the License is distributed on an "AS IS" BASIS,
+     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     * See the License for the specific language governing permissions and
+     * limitations under the License.
+     */
+    package org.apache.hadoop.gateway.shell.hbase
+
+    import org.apache.hadoop.gateway.shell.Hadoop
+
+    import static java.util.concurrent.TimeUnit.SECONDS
+
+    gateway = "https://localhost:8443/gateway/sandbox"
+    username = "guest"
+    password = "guest-password"
+    tableName = "test_table"
+
+    session = Hadoop.login(gateway, username, password)
+
+    println "System version : " + HBase.session(session).systemVersion().now().string
+
+    println "Cluster version : " + HBase.session(session).clusterVersion().now().string
+
+    println "Status : " + HBase.session(session).status().now().string
+
+    println "Creating table '" + tableName + "'..."
+
+    HBase.session(session).table(tableName).create()  \
+        .attribute("tb_attr1", "value1")  \
+        .attribute("tb_attr2", "value2")  \
+        .family("family1")  \
+            .attribute("fm_attr1", "value3")  \
+            .attribute("fm_attr2", "value4")  \
+        .endFamilyDef()  \
+        .family("family2")  \
+        .family("family3")  \
+        .endFamilyDef()  \
+        .attribute("tb_attr3", "value5")  \
+        .now()
+
+    println "Done"
+
+    println "Table List : " + HBase.session(session).table().list().now().string
+
+    println "Schema for table '" + tableName + "' : " + HBase.session(session)  \
+        .table(tableName)  \
+        .schema()  \
+        .now().string
+
+    println "Updating schema of table '" + tableName + "'..."
+
+    HBase.session(session).table(tableName).update()  \
+        .family("family1")  \
+            .attribute("fm_attr1", "new_value3")  \
+        .endFamilyDef()  \
+        .family("family4")  \
+            .attribute("fm_attr3", "value6")  \
+        .endFamilyDef()  \
+        .now()
+
+    println "Done"
+
+    println "Schema for table '" + tableName + "' : " + HBase.session(session)  \
+        .table(tableName)  \
+        .schema()  \
+        .now().string
+
+    println "Inserting data into table..."
+
+    HBase.session(session).table(tableName).row("row_id_1").store()  \
+        .column("family1", "col1", "col_value1")  \
+        .column("family1", "col2", "col_value2", 1234567890l)  \
+        .column("family2", null, "fam_value1")  \
+        .now()
+
+    HBase.session(session).table(tableName).row("row_id_2").store()  \
+        .column("family1", "row2_col1", "row2_col_value1")  \
+        .now()
+
+    println "Done"
+
+    println "Querying row by id..."
+
+    println HBase.session(session).table(tableName).row("row_id_1")  \
+        .query()  \
+        .now().string
+
+    println "Querying all rows..."
+
+    println HBase.session(session).table(tableName).row().query().now().string
+
+    println "Querying row by id with extended settings..."
+
+    println HBase.session(session).table(tableName).row().query()  \
+        .column("family1", "row2_col1")  \
+        .column("family2")  \
+        .times(0, Long.MAX_VALUE)  \
+        .numVersions(1)  \
+        .now().string
+
+    println "Deleting cell..."
+
+    HBase.session(session).table(tableName).row("row_id_1")  \
+        .delete()  \
+        .column("family1", "col1")  \
+        .now()
+
+    println "Rows after delete:"
+
+    println HBase.session(session).table(tableName).row().query().now().string
+
+    println "Extended cell delete"
+
+    HBase.session(session).table(tableName).row("row_id_1")  \
+        .delete()  \
+        .column("family2")  \
+        .time(Long.MAX_VALUE)  \
+        .now()
+
+    println "Rows after delete:"
+
+    println HBase.session(session).table(tableName).row().query().now().string
+
+    println "Table regions : " + HBase.session(session).table(tableName)  \
+        .regions()  \
+        .now().string
+
+    println "Creating scanner..."
+
+    scannerId = HBase.session(session).table(tableName).scanner().create()  \
+        .column("family1", "col2")  \
+        .column("family2")  \
+        .startRow("row_id_1")  \
+        .endRow("row_id_2")  \
+        .batch(1)  \
+        .startTime(0)  \
+        .endTime(Long.MAX_VALUE)  \
+        .filter("")  \
+        .maxVersions(100)  \
+        .now().scannerId
+
+    println "Scanner id=" + scannerId
+
+    println "Scanner get next..."
+
+    println HBase.session(session).table(tableName).scanner(scannerId)  \
+        .getNext()  \
+        .now().string
+
+    println "Dropping scanner with id=" + scannerId
+
+    HBase.session(session).table(tableName).scanner(scannerId).delete().now()
+
+    println "Done"
+
+    println "Dropping table '" + tableName + "'..."
+
+    HBase.session(session).table(tableName).delete().now()
+
+    println "Done"
+
+    session.shutdown(10, SECONDS)
+
+### HBase/Stargate via cURL ###
+
+#### Get software version
+
+Set Accept Header to "text/plain", "text/xml", "application/json" or "application/x-protobuf"
+
+    %  curl -ik -u guest:guest-password\
+     -H "Accept:  application/json"\
+     -X GET 'https://localhost:8443/gateway/sandbox/hbase/version'
+
+#### Get version information regarding the HBase cluster backing the Stargate instance
+
+Set Accept Header to "text/plain", "text/xml" or "application/x-protobuf"
+
+    %  curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X GET 'https://localhost:8443/gateway/sandbox/hbase/version/cluster'
+
+#### Get detailed status on the HBase cluster backing the Stargate instance.
+
+Set Accept Header to "text/plain", "text/xml", "application/json" or "application/x-protobuf"
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X GET 'https://localhost:8443/gateway/sandbox/hbase/status/cluster'
+
+#### Get the list of available tables.
+
+Set Accept Header to "text/plain", "text/xml", "application/json" or "application/x-protobuf"
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X GET 'https://localhost:8443/gateway/sandbox/hbase'
+
+#### Create table with two column families using XML input
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"   -H "Content-Type: text/xml"\
+     -d '<?xml version="1.0" encoding="UTF-8"?><TableSchema name="table1"><ColumnSchema name="family1"/><ColumnSchema name="family2"/></TableSchema>'\
+     -X PUT 'https://localhost:8443/gateway/sandbox/hbase/table1/schema'
+
+#### Create table with two column families using JSON input
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: application/json"  -H "Content-Type: application/json"\
+     -d '{"name":"table2","ColumnSchema":[{"name":"family3"},{"name":"family4"}]}'\
+     -X PUT 'https://localhost:8443/gateway/sandbox/hbase/table2/schema'
+
+#### Get table metadata
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X GET 'https://localhost:8443/gateway/sandbox/hbase/table1/regions'
+
+#### Insert single row into table
+
+    % curl -ik -u guest:guest-password\
+     -H "Content-Type: text/xml"\
+     -H "Accept: text/xml"\
+     -d '<?xml version="1.0" encoding="UTF-8" standalone="yes"?><CellSet><Row key="cm93MQ=="><Cell column="ZmFtaWx5MTpjb2wx" >dGVzdA==</Cell></Row></CellSet>'\
+     -X POST 'https://localhost:8443/gateway/sandbox/hbase/table1/row1'
+
+#### Insert multiple rows into table
+
+    % curl -ik -u guest:guest-password\
+     -H "Content-Type: text/xml"\
+     -H "Accept: text/xml"\
+     -d '<?xml version="1.0" encoding="UTF-8" standalone="yes"?><CellSet><Row key="cm93MA=="><Cell column=" ZmFtaWx5Mzpjb2x1bW4x" >dGVzdA==</Cell></Row><Row key="cm93MQ=="><Cell column=" ZmFtaWx5NDpjb2x1bW4x" >dGVzdA==</Cell></Row></CellSet>'\
+     -X POST 'https://localhost:8443/gateway/sandbox/hbase/table2/false-row-key'
+
+#### Get all data from table
+
+Set Accept Header to "text/plain", "text/xml", "application/json" or "application/x-protobuf"
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X GET 'https://localhost:8443/gateway/sandbox/hbase/table1/*'
+
+#### Execute cell or row query
+
+Set Accept Header to "text/plain", "text/xml", "application/json" or "application/x-protobuf"
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X GET 'https://localhost:8443/gateway/sandbox/hbase/table1/row1/family1:col1'
+
+#### Delete entire row from table
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X DELETE 'https://localhost:8443/gateway/sandbox/hbase/table2/row0'
+
+#### Delete column family from row
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X DELETE 'https://localhost:8443/gateway/sandbox/hbase/table2/row0/family3'
+
+#### Delete specific column from row
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X DELETE 'https://localhost:8443/gateway/sandbox/hbase/table2/row0/family3:column1'
+
+#### Create scanner
+
+The scanner URL, including the scanner ID used in the examples below, will be returned in the Location response header.
+
+    % curl -ik -u guest:guest-password\
+     -H "Content-Type: text/xml"\
+     -d '<Scanner batch="1"/>'\
+     -X PUT 'https://localhost:8443/gateway/sandbox/hbase/table1/scanner'
+
+#### Get the values of the next cells found by the scanner
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: application/json"\
+     -X GET 'https://localhost:8443/gateway/sandbox/hbase/table1/scanner/13705290446328cff5ed'
+
+#### Delete scanner
+
+    % curl -ik -u guest:guest-password\
+     -H "Accept: text/xml"\
+     -X DELETE 'https://localhost:8443/gateway/sandbox/hbase/table1/scanner/13705290446328cff5ed'
+
+#### Delete table
+
+    % curl -ik -u guest:guest-password\
+     -X DELETE 'https://localhost:8443/gateway/sandbox/hbase/table1/schema'

Modified: incubator/knox/trunk/books/0.3.0/hive.md
URL: http://svn.apache.org/viewvc/incubator/knox/trunk/books/0.3.0/hive.md?rev=1526308&r1=1526307&r2=1526308&view=diff
==============================================================================
--- incubator/knox/trunk/books/0.3.0/hive.md (original)
+++ incubator/knox/trunk/books/0.3.0/hive.md Wed Sep 25 21:40:30 2013
@@ -25,4 +25,213 @@ TODO
 
 #### {{Hive Examples}} ####
 
-TODO
+This guide provides detailed examples of how to do some basic interactions with Hive via the Apache Knox Gateway.
+
+##### Assumptions #####
+
+This document assumes a few things about your environment in order to simplify the examples.
+
+1. The JVM is executable as simply java.
+2. The Apache Knox Gateway is installed and functional.
+3. The Hive version is at least 0.12.0.
+4. The example commands are executed within the context of the GATEWAY_HOME current directory.
+   The GATEWAY_HOME directory is the directory within the Apache Knox Gateway installation that contains the README file and the bin, conf and deployments directories.
+5. A few examples optionally require the use of commands from a standard Groovy installation.
+   These examples are optional, but to try them you will need Groovy [installed](http://groovy.codehaus.org/Installing+Groovy).
+
+##### Setup #####
+
+1. Make sure you are running the correct version of Hive to ensure JDBC/Thrift/HTTP support.
+2. Make sure Hive is running on the correct port.
+3. In hive-site.xml add the property "hive.server2.servermode=http" (see the example stanza after this list)
+4. Client side (JDBC):
+    1. Hive JDBC in HTTP mode depends on the following libraries to run successfully (they must be on the classpath):
+       Hive Thrift artifacts classes, commons-codec.jar, commons-configuration.jar, commons-lang.jar, commons-logging.jar, hadoop-core.jar, hive-cli.jar, hive-common.jar, hive-jdbc.jar, hive-service.jar, hive-shims.jar, httpclient.jar, httpcore.jar, slf4j-api.jar;
+    2. Import the gateway certificate into the default truststore, located at <java-home>/lib/security/cacerts:
+       `keytool -import -alias hadoop.gateway -file hadoop.gateway.cer -keystore <java-home>/lib/security/cacerts`
+    3. The connection URL has the following form (a concrete example follows this list):
+       `jdbc:hive2://<gateway-host>:<gateway-port>/?hive.server2.servermode=https;hive.server2.http.path=<gateway-path>/<cluster-name>/hive`
+    4. See https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-DDLOperations for examples.
+       Hint: for testing purposes it is better to execute "set hive.security.authorization.enabled=false" as the first statement; http://gettingstarted.hadooponazure.com/hw/hive.html is a good example of Hive DDL/DML operations.
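+
+The property from step 3 would appear in hive-site.xml as a stanza like the following (a sketch; the exact file location depends on your Hive installation):
+
+    <property>
+        <name>hive.server2.servermode</name>
+        <value>http</value>
+    </property>
+
+With the sandbox topology used throughout this guide, the connection URL from step 4 would look like this (host, port and context path are assumptions that match the samples below):
+
+    jdbc:hive2://localhost:8443/?hive.server2.servermode=https;hive.server2.http.path=gateway/sandbox/hive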
+
+##### Customization #####
+
+This example may need to be tailored to the execution environment.
+In particular the host name, host port, user name, user password and context path may need to be changed to match your environment.
+There is one example file in the distribution that may need to be customized.
+Take a moment to review this file.
+All of the values that may need to be customized can be found together at the top of the file.
+
+* samples/HiveJDBCSample.java
+
+##### Client JDBC Example #####
+
+The sample below creates a new table, loads data into it from the local file system and queries data from that table.
+
+###### Java ######
+
+    import java.sql.Connection;
+    import java.sql.DriverManager;
+    import java.sql.ResultSet;
+    import java.sql.SQLException;
+    import java.sql.Statement;
+
+    import java.util.logging.Level;
+    import java.util.logging.Logger;
+
+    public class HiveJDBCSample {
+
+      public static void main( String[] args ) {
+        Connection connection = null;
+        Statement statement = null;
+        ResultSet resultSet = null;
+
+        try {
+          String user = "guest";
+          String password = user + "-password";
+          String gatewayHost = "localhost";
+          int gatewayPort = 8443;
+          String contextPath = "gateway/sandbox/hive";
+          String connectionString = String.format( "jdbc:hive2://%s:%d/?hive.server2.servermode=https;hive.server2.http.path=%s", gatewayHost, gatewayPort, contextPath );
+
+          // load Hive JDBC Driver
+          Class.forName( "org.apache.hive.jdbc.HiveDriver" );
+
+          // configure JDBC connection
+          connection = DriverManager.getConnection( connectionString, user, password );
+
+          statement = connection.createStatement();
+
+          // disable Hive authorization - this can be omitted if Hive authorization
+          // is configured properly
+          statement.execute( "set hive.security.authorization.enabled=false" );
+
+          // create sample table
+          statement.execute( "CREATE TABLE logs(column1 string, column2 string, column3 string, column4 string, column5 string, column6 string, column7 string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '" );
+
+          // load data into Hive from file /tmp/log.txt which is placed on the local file system
+          statement.execute( "LOAD DATA LOCAL INPATH '/tmp/log.txt' OVERWRITE INTO TABLE logs" );
+
+          resultSet = statement.executeQuery( "SELECT * FROM logs" );
+
+          while ( resultSet.next() ) {
+            System.out.println( resultSet.getString( 1 ) + " --- " + resultSet.getString( 2 ) + " --- " + resultSet.getString( 3 ) + " --- " + resultSet.getString( 4 ) );
+          }
+        } catch ( ClassNotFoundException ex ) {
+          Logger.getLogger( HiveJDBCSample.class.getName() ).log( Level.SEVERE, null, ex );
+        } catch ( SQLException ex ) {
+          Logger.getLogger( HiveJDBCSample.class.getName() ).log( Level.SEVERE, null, ex );
+        } finally {
+          if ( resultSet != null ) {
+            try {
+              resultSet.close();
+            } catch ( SQLException ex ) {
+              Logger.getLogger( HiveJDBCSample.class.getName() ).log( Level.SEVERE, null, ex );
+            }
+          }
+          if ( statement != null ) {
+            try {
+              statement.close();
+            } catch ( SQLException ex ) {
+              Logger.getLogger( HiveJDBCSample.class.getName() ).log( Level.SEVERE, null, ex );
+            }
+          }
+          if ( connection != null ) {
+            try {
+              connection.close();
+            } catch ( SQLException ex ) {
+              Logger.getLogger( HiveJDBCSample.class.getName() ).log( Level.SEVERE, null, ex );
+            }
+          }
+        }
+      }
+    }
+
+###### Groovy ######
+
+Make sure that the GATEWAY_HOME/ext directory contains the following jars/classes for successful execution:
+Hive Thrift artifacts classes, commons-codec.jar, commons-configuration.jar, commons-lang.jar, commons-logging.jar, hadoop-core.jar, hive-cli.jar, hive-common.jar, hive-jdbc.jar, hive-service.jar, hive-shims.jar, httpclient.jar, httpcore.jar, slf4j-api.jar
+
+There are several ways to execute this sample depending upon your preference.
+
+You can use the Groovy interpreter provided with the distribution.
+
+    java -jar bin/shell.jar samples/hive/groovy/jdbc/sandbox/HiveJDBCSample.groovy
+
+You can manually type in the KnoxShell DSL script into the interactive Groovy interpreter provided with the distribution.
+
+    java -jar bin/shell.jar
+
+Each line from the file below will need to be typed or copied into the interactive shell.
+
+    import java.sql.DriverManager
+
+    user = "guest";
+    password = user + "-password";
+    gatewayHost = "localhost";
+    gatewayPort = 8443;
+    contextPath = "gateway/sandbox/hive";
+    connectionString = String.format( "jdbc:hive2://%s:%d/?hive.server2.servermode=https;hive.server2.http.path=%s", gatewayHost, gatewayPort, contextPath );
+
+    // Load Hive JDBC Driver
+    Class.forName( "org.apache.hive.jdbc.HiveDriver" );
+
+    // Configure JDBC connection
+    connection = DriverManager.getConnection( connectionString, user, password );
+
+    statement = connection.createStatement();
+
+    // Disable Hive authorization - This can be omitted if Hive authorization is configured properly
+    statement.execute( "set hive.security.authorization.enabled=false" );
+
+    // Create sample table
+    statement.execute( "CREATE TABLE logs(column1 string, column2 string, column3 string, column4 string, column5 string, column6 string, column7 string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '" );
+
+    // Load data into Hive from file /tmp/log.txt which is placed on the local file system
+    statement.execute( "LOAD DATA LOCAL INPATH '/tmp/sample.log' OVERWRITE INTO TABLE logs" );
+
+    resultSet = statement.executeQuery( "SELECT * FROM logs" );
+
+    while ( resultSet.next() ) {
+      System.out.println( resultSet.getString( 1 ) + " --- " + resultSet.getString( 2 ) + " --- " + resultSet.getString( 3 ) + " --- " + resultSet.getString( 4 ) );
+    }
+
+    resultSet.close();
+    statement.close();
+    connection.close();
+
+The examples use '/tmp/log.txt' with the following content:
+
+    2012-02-03 18:35:34 SampleClass6 [INFO] everything normal for id 577725851
+    2012-02-03 18:35:34 SampleClass4 [FATAL] system problem at id 1991281254
+    2012-02-03 18:35:34 SampleClass3 [DEBUG] detail for id 1304807656
+    2012-02-03 18:35:34 SampleClass3 [WARN] missing id 423340895
+    2012-02-03 18:35:34 SampleClass5 [TRACE] verbose detail for id 2082654978
+    2012-02-03 18:35:34 SampleClass0 [ERROR] incorrect id  1886438513
+    2012-02-03 18:35:34 SampleClass9 [TRACE] verbose detail for id 438634209
+    2012-02-03 18:35:34 SampleClass8 [DEBUG] detail for id 2074121310
+    2012-02-03 18:35:34 SampleClass0 [TRACE] verbose detail for id 1505582508
+    2012-02-03 18:35:34 SampleClass0 [TRACE] verbose detail for id 1903854437
+    2012-02-03 18:35:34 SampleClass7 [DEBUG] detail for id 915853141
+    2012-02-03 18:35:34 SampleClass3 [TRACE] verbose detail for id 303132401
+    2012-02-03 18:35:34 SampleClass6 [TRACE] verbose detail for id 151914369
+    2012-02-03 18:35:34 SampleClass2 [DEBUG] detail for id 146527742
+    ...
+
+Expected output (fields are terminated by a space, so the first four tokens of each line map to column1 through column4):
+
+    2012-02-03 --- 18:35:34 --- SampleClass6 --- [INFO]
+    2012-02-03 --- 18:35:34 --- SampleClass4 --- [FATAL]
+    2012-02-03 --- 18:35:34 --- SampleClass3 --- [DEBUG]
+    2012-02-03 --- 18:35:34 --- SampleClass3 --- [WARN]
+    2012-02-03 --- 18:35:34 --- SampleClass5 --- [TRACE]
+    2012-02-03 --- 18:35:34 --- SampleClass0 --- [ERROR]
+    2012-02-03 --- 18:35:34 --- SampleClass9 --- [TRACE]
+    2012-02-03 --- 18:35:34 --- SampleClass8 --- [DEBUG]
+    2012-02-03 --- 18:35:34 --- SampleClass0 --- [TRACE]
+    2012-02-03 --- 18:35:34 --- SampleClass0 --- [TRACE]
+    2012-02-03 --- 18:35:34 --- SampleClass7 --- [DEBUG]
+    2012-02-03 --- 18:35:34 --- SampleClass3 --- [TRACE]
+    2012-02-03 --- 18:35:34 --- SampleClass6 --- [TRACE]
+    2012-02-03 --- 18:35:34 --- SampleClass2 --- [DEBUG]
+    ...

Added: incubator/knox/trunk/books/0.3.0/sandbox.md
URL: http://svn.apache.org/viewvc/incubator/knox/trunk/books/0.3.0/sandbox.md?rev=1526308&view=auto
==============================================================================
--- incubator/knox/trunk/books/0.3.0/sandbox.md (added)
+++ incubator/knox/trunk/books/0.3.0/sandbox.md Wed Sep 25 21:40:30 2013
@@ -0,0 +1,39 @@
+<!---
+   Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+--->
+
+Sandbox Configuration
+---------------------
+
+This version of the Apache Knox Gateway is tested against [Hortonworks Sandbox 1.2](http://hortonworks.com/products/hortonworks-sandbox/).
+
+Currently there is an issue with the Sandbox that prevents it from being easily used with the gateway.  In order to correct the issue, you can use the commands below to login to the Sandbox VM and modify the configuration.  This assumes that the name sandbox is set up to resolve to the Sandbox VM.  It may be necessary to use the IP address of the Sandbox VM instead.  This is frequently, but not always, 192.168.56.101.
+
+    ssh root@sandbox
+    cp /usr/lib/hadoop/conf/hdfs-site.xml /usr/lib/hadoop/conf/hdfs-site.xml.orig
+    sed -e s/localhost/sandbox/ /usr/lib/hadoop/conf/hdfs-site.xml.orig > /usr/lib/hadoop/conf/hdfs-site.xml
+    shutdown -r now
+
+
+In addition, to make it easy to follow along with the samples for the gateway, you can configure your local system to resolve the address of the Sandbox by the names `vm` and `sandbox`.  The IP address that is shown below should be that of the Sandbox VM as it is known on your system.  This will likely, but not always, be `192.168.56.101`.
+
+On Linux or Macintosh systems, add a line like this to the end of the file `/etc/hosts` on your local machine, *not the Sandbox VM*.
+Note: The character between the `192.168.56.101` and `vm` below is a *tab* character.
+
+    192.168.56.101	vm sandbox
+
+On Windows systems a similar but different mechanism can be used.  On recent
+versions of Windows the file that should be modified is `%systemroot%\system32\drivers\etc\hosts`.

Modified: incubator/knox/trunk/books/static/book.css
URL: http://svn.apache.org/viewvc/incubator/knox/trunk/books/static/book.css?rev=1526308&r1=1526307&r2=1526308&view=diff
==============================================================================
--- incubator/knox/trunk/books/static/book.css (original)
+++ incubator/knox/trunk/books/static/book.css Wed Sep 25 21:40:30 2013
@@ -95,7 +95,10 @@ h6 {
    color: #777777;
    font-size: 14px; }
 
-p, blockquote, ul, ol, dl, li, table, pre {
+ul {
+   margin: 0px 0; }
+
+p, blockquote, ol, dl, li, table, pre {
    margin: 15px 0; }
 
 hr {