Posted to commits@knox.apache.org by km...@apache.org on 2013/03/27 19:15:35 UTC

svn commit: r1461761 - in /incubator/knox: site/ trunk/ trunk/src/site/markdown/

Author: kminder
Date: Wed Mar 27 18:15:34 2013
New Revision: 1461761

URL: http://svn.apache.org/r1461761
Log:
Fixes found when reviewing docs and testing against distro.

Modified:
    incubator/knox/site/build-process.html
    incubator/knox/site/client.html
    incubator/knox/site/contribute-process.html
    incubator/knox/site/examples.html
    incubator/knox/site/getting-started.html
    incubator/knox/site/index.html
    incubator/knox/site/issue-tracking.html
    incubator/knox/site/license.html
    incubator/knox/site/mail-lists.html
    incubator/knox/site/news.html
    incubator/knox/site/privacy-policy.html
    incubator/knox/site/project-info.html
    incubator/knox/site/release-0-2-0.html
    incubator/knox/site/release-process.html
    incubator/knox/site/roadmap-0-3-0.html
    incubator/knox/site/team-list.html
    incubator/knox/site/template.html
    incubator/knox/trunk/pom.xml
    incubator/knox/trunk/src/site/markdown/client.md.vm
    incubator/knox/trunk/src/site/markdown/examples.md.vm
    incubator/knox/trunk/src/site/markdown/getting-started.md.vm

Modified: incubator/knox/site/build-process.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/build-process.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/build-process.html (original)
+++ incubator/knox/site/build-process.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/client.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/client.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/client.html (original)
+++ incubator/knox/site/client.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>
@@ -179,18 +182,20 @@ limitations under the License. --><div c
   <li>The Apache Knox Gateway is installed and functional.</li>
   <li>The example commands are executed within the context of the GATEWAY_HOME current directory. The GATEWAY_HOME directory is the directory within the Apache Knox Gateway installation that contains the README file and the bin, conf and deployments directories.</li>
   <li>A few examples require the use of commands from a standard Groovy installation. These examples are optional but to try them you will need Groovy <a class="externalLink" href="http://groovy.codehaus.org/Installing+Groovy">installed</a>.</li>
-</ol></div><div class="section"><h2>Usage<a name="Usage"></a></h2><p>The DSL requires a shell to interpret the Groovy script. The shell can either be used interactively or to execute a script file. To simplify use, the distribution contains an embedded version of the Groovy shell.</p><p>The shell can be run interactively.</p>
+</ol></div><div class="section"><h2>Usage<a name="Usage"></a></h2><p>The DSL requires a shell to interpret the Groovy script. The shell can either be used interactively or to execute a script file. To simplify use, the distribution contains an embedded version of the Groovy shell.</p><p>The shell can be run interactively. Use the command <tt>exit</tt> to exit.</p>
 <div class="source"><pre>java -jar bin/shell.jar
-</pre></div><p>The shell can also be used to execute a script by passing a single filename argument.</p>
-<div class="source"><pre>java -jar bin/shell.jar sample/SmokeTestJob.groovy
 </pre></div><p>When running interactively it may be helpful to reduce some of the output generated by the shell console. Use the following command in the interactive shell to reduce that output. This only needs to be done once as these preferences are persisted.</p>
 <div class="source"><pre>set verbosity QUIET
 set show-last-result false
-</pre></div><p>Also when running interactively use the <tt>exit</tt> command to terminate the shell. Using <tt>^C</tt> to exit can sometimes leaves the parent shell in a problematic state.</p></div><div class="section"><h2>Examples<a name="Examples"></a></h2><p>Once the shell can be launched the DSL can be used to interact with the gateway and Hadoop. Below is a very simple example of an interactive shell session to upload a file to HDFS.</p>
+</pre></div><p>Also when running interactively use the <tt>exit</tt> command to terminate the shell. Using <tt>^C</tt> to exit can sometimes leave the parent shell in a problematic state.</p><p>The shell can also be used to execute a script by passing a single filename argument.</p>
+<div class="source"><pre>java -jar bin/shell.jar samples/ExamplePutFile.groovy
+</pre></div></div><div class="section"><h2>Examples<a name="Examples"></a></h2><p>Once the shell has been launched, the DSL can be used to interact with the gateway and Hadoop. Below is a very simple example of an interactive shell session to upload a file to HDFS.</p>
 <div class="source"><pre>java -jar bin/shell.jar
 knox:000&gt; hadoop = Hadoop.login( &quot;https://localhost:8443/gateway/sample&quot;, &quot;hdfs&quot;, &quot;hdfs-password&quot; )
 knox:000&gt; Hdfs.put( hadoop ).file( &quot;README&quot; ).to( &quot;/tmp/example/README&quot; ).now()
-</pre></div><p>The <tt>knox:000&gt;</tt> in the example above is the prompt from the embedded Groovy console. If you output doesnt look like this you may need to set the verbosity and show-last-result preferences as described above in the Usage section.</p><p>Without using some other tool to browse HDFS it is impossible to tell that that this command did anything. Execute this to get a bit more feedback.</p>
+</pre></div><p>The <tt>knox:000&gt;</tt> in the example above is the prompt from the embedded Groovy console. If your output doesn't look like this you may need to set the verbosity and show-last-result preferences as described above in the Usage section.</p><p>If you receive an error <tt>HTTP/1.1 403 Forbidden</tt> it may be because that file already exists. Try deleting it with the following command and then try again.</p>
+<div class="source"><pre>knox:000&gt; Hdfs.rm(hadoop).file(&quot;/tmp/example/README&quot;).now()
+</pre></div><p>Without using some other tool to browse HDFS it is hard to tell that this command did anything. Execute this to get a bit more feedback.</p>
 <div class="source"><pre>knox:000&gt; println &quot;Status=&quot; + Hdfs.put( hadoop ).file( &quot;README&quot; ).to( &quot;/tmp/example/README2&quot; ).now().statusCode
 Status=201
 </pre></div><p>Notice that a different filename is used for the destination. Without this an error would have resulted. Of course the DSL also provides a command to list the contents of a directory.</p>
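
For reference, the rm and put commands shown above combine into a rerun-safe upload script. This is a minimal sketch that reuses the placeholder gateway URL and hdfs credentials from the examples on this page.

    import org.apache.hadoop.gateway.shell.Hadoop
    import org.apache.hadoop.gateway.shell.hdfs.Hdfs

    hadoop = Hadoop.login( "https://localhost:8443/gateway/sample", "hdfs", "hdfs-password" )
    // Delete any leftover file first so the put cannot fail with 403 Forbidden.
    Hdfs.rm( hadoop ).file( "/tmp/example/README" ).now()
    // Upload and print the WebHDFS status code; 201 indicates the file was created.
    println "Status=" + Hdfs.put( hadoop ).file( "README" ).to( "/tmp/example/README" ).now().statusCode
    // Shut down the session thread pool before the script ends.
    hadoop.shutdown()
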
@@ -206,7 +211,7 @@ knox:000&gt; println json.FileStatuses.F
 <div class="source"><pre>knox:000&gt; hadoop.shutdown()
 knox:000&gt; exit
 </pre></div><p>All of the commands above could have been combined into a script file and executed as a single line.</p>
-<div class="source"><pre>java -jar bin/shell.jar samples/Example.groovy
+<div class="source"><pre>java -jar bin/shell.jar samples/ExamplePutFile.groovy
 </pre></div><p>This script file is available in the distribution but for convenience, this is the content.</p>
 <div class="source"><pre>import org.apache.hadoop.gateway.shell.Hadoop
 import org.apache.hadoop.gateway.shell.hdfs.Hdfs
@@ -224,6 +229,7 @@ text = Hdfs.ls( hadoop ).dir( &quot;/tmp
 json = (new JsonSlurper()).parseText( text )
 println json.FileStatuses.FileStatus.pathSuffix
 hadoop.shutdown()
+exit
 </pre></div><p>Notice the Hdfs.rm command. This is included simply to ensure that the script can be rerun. Without this an error would result the second time it is run.</p></div><div class="section"><h2>Constructs<a name="Constructs"></a></h2><p>In order to understand the DSL there are three primary constructs that need to be understood.</p><div class="section"><h3>Hadoop<a name="Hadoop"></a></h3><p>This construct encapsulates the client-side session state that will be shared between all command invocations. In particular it will simplify the management of any tokens that need to be presented with each command invocation. It also manages a thread pool that is used by all asynchronous commands, which is why it is important to call one of the shutdown methods.</p><p>The syntax associated with this is expected to change; we expect that credentials will not need to be provided to the gateway. Rather it is expected that some form of access token will be used to initialize the session.</p></div><div class="section"><h3>Services<a name="Services"></a></h3><p>Services are the primary extension point for adding new suites of commands. The built-in examples are: Hdfs, Job and Workflow. The desire for extensibility is the reason for the slightly awkward Hdfs.ls(hadoop) syntax. Certainly something more like hadoop.hdfs().ls() would have been preferred but this would prevent adding new commands easily. At a minimum it would result in extension commands with a different syntax from the built-in commands.</p><p>The service objects essentially function as a factory for a suite of commands.</p></div><div class="section"><h3>Commands<a name="Commands"></a></h3><p>Commands provide the behavior of the DSL. They typically follow a fluent interface style in order to allow for single-line commands. There are really three parts to each command: Request, Invocation and Response.</p><div class="section"><h4>Request<a name="Request"></a></h4><p>The request is populated by all of the methods between the verb method and the invoke method. For example in Hdfs.rm(hadoop).file(path).now() the request is populated between the verb method rm() and the invoke method now().</p></div><div class="section"><h4>Invocation<a name="Invocation"></a></h4><p>The invocation method controls how the request is invoked. Synchronous and asynchronous invocation are currently supported. The now() method executes the request and returns the result immediately. The later() method submits the request to be executed later and returns a future from which the result can be retrieved. In addition the later() invocation method can optionally be provided a closure to execute when the request is complete. See the Futures and Closures sections below for additional detail and examples.</p></div><div class="section"><h4>Response<a name="Response"></a></h4><p>The response contains the results of the invocation of the request. In most cases the response is a thin wrapper over the HTTP response. In fact many commands will share a single BasicResponse type that only provides a few simple methods.</p>
 <div class="source"><pre>public int getStatusCode()
 public long getContentLength()
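
The request, invocation and response pieces described above can be sketched end to end. This assumes the hadoop session variable from the Usage example and uses only methods shown on this page.

    // Request: the fluent methods between the verb ls() and the invocation now().
    response = Hdfs.ls( hadoop ).dir( "/tmp/example" ).now()
    // Response: a thin wrapper over the HTTP response.
    println "Status=" + response.statusCode
    // Asynchronous invocation: later() returns a future and accepts a closure
    // that is executed when the request completes.
    future = Hdfs.ls( hadoop ).dir( "/tmp/example" ).later() { println it.statusCode }
    hadoop.waitFor( future )
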
@@ -433,13 +439,13 @@ println licenseFuture.get().statusCode
 licenseFuture = Hdfs.put(hadoop).file(&quot;LICENSE&quot;).to(&quot;tmp/example/LICENSE&quot;).later() { println it.statusCode }
 hadoop.waitFor( readmeFuture, licenseFuture )
 </pre></div><p>Again, the hadoop.waitFor() method will wait for one or more asynchronous commands to complete.</p></div><div class="section"><h2>Extension<a name="Extension"></a></h2><p>Extensibility is a key design goal of the KnoxShell and DSL. There are two ways to provide extended functionality for use with the shell. The first is to simply create Groovy scripts that use the DSL to perform a useful task. The second is to add new services and commands. In order to add new services and commands, new classes must be written in either Groovy or Java and added to the classpath of the shell. Fortunately there is a very simple way to add classes and JARs to the shell classpath. The first time the shell is executed it will create a configuration file in the same directory as the JAR with the same base name and a <tt>.cfg</tt> extension.</p>
-<div class="source"><pre>bin/shell-${gateway-version}.jar
-bin/shell-${gateway-version}.cfg
+<div class="source"><pre>bin/shell.jar
+bin/shell.cfg
 </pre></div><p>That file contains both the main class for the shell and a definition of the classpath. Currently that file will by default contain the following.</p>
 <div class="source"><pre>main.class=org.apache.hadoop.gateway.shell.Shell
 class.path=../lib; ../lib/*.jar; ../ext; ../ext/*.jar
 </pre></div><p>Therefore to extend the shell you should copy any new service and command class either to the <tt>ext</tt> directory or, if they are packaged within a JAR, copy the JAR to the <tt>ext</tt> directory. The <tt>lib</tt> directory is reserved for JARs that may be delivered with the product.</p><p>Below are samples for the service and command classes that would need to be written to add new commands to the shell. These happen to be Groovy source files but could with very minor changes be Java files. The easiest way to add these to the shell is to compile them directly into the <tt>ext</tt> directory. <i>Note: This command depends upon having the Groovy compiler installed and available on the execution path.</i></p>
-<div class="source"><pre>groovyc -d ext -cp bin/shell-${gateway-version}.jar samples/SampleService.groovy samples/SampleSimpleCommand.groovy samples/SampleComplexCommand.groovy
+<div class="source"><pre>groovyc -d ext -cp bin/shell.jar samples/SampleService.groovy samples/SampleSimpleCommand.groovy samples/SampleComplexCommand.groovy
 </pre></div><p>These source files are available in the samples directory of the distribution but are included here for convenience.</p><div class="section"><h3>Sample Service (Groovy)<a name="Sample_Service_Groovy"></a></h3>
 <div class="source"><pre>import org.apache.hadoop.gateway.shell.Hadoop
 
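The general shape of such an extension is sketched below. This is a hypothetical, stripped-down illustration of the factory-plus-fluent-command pattern, not the actual SampleService and command classes that ship in the samples directory.

    import org.apache.hadoop.gateway.shell.Hadoop

    // Hypothetical service: a factory for a suite of commands.
    class SampleService {
      static SampleCommand sample( Hadoop hadoop ) {
        return new SampleCommand( hadoop )
      }
    }

    // Hypothetical command following the fluent verb...invoke style.
    class SampleCommand {
      private Hadoop hadoop
      private String param
      SampleCommand( Hadoop hadoop ) { this.hadoop = hadoop }
      SampleCommand param( String value ) { this.param = value; return this }
      def now() {
        // A real command would build and execute an HTTP request here
        // using the session state held by the Hadoop object.
        return "executed with " + param
      }
    }

Compiled into the ext directory with the groovyc command shown below, this sketch would be invoked from the shell as SampleService.sample( hadoop ).param( "value" ).now().
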
@@ -544,27 +550,37 @@ class ComplexCommand {
   }
 
 }
-</pre></div></div></div><div class="section"><h2>Groovy<a name="Groovy"></a></h2><p>The shell included in the distribution is basically an unmodified packaging of the Groovy shell. Therefore these command are functionally equivalent if you have Groovy <a class="externalLink" href="http://groovy.codehaus.org/Installing+Groovy">installed</a>.</p>
-<div class="source"><pre>java -jar bin/shell.jar sample/SmokeTestJob.groovy
-groovy -cp bin/shell.jar sample/SmokeTestJob.groovy
-</pre></div><p>The interactive shell isnt exactly equivalent. However the only difference is that the shell-${gateway-version}.jar automatically executes some additional imports that are useful for the KnoxShell DSL. So these two sets of commands should be functionality equivalent. <b><i>However there is currently a class loading issue that prevents the groovysh command from working propertly.</i></b></p>
+</pre></div></div></div><div class="section"><h2>Groovy<a name="Groovy"></a></h2><p>The shell included in the distribution is basically an unmodified packaging of the Groovy shell. The distribution does however provide a wrapper that makes it very easy to set up the class path for the shell. In fact the JARs required to execute the DSL are included on the class path by default. Therefore these commands are functionally equivalent if you have Groovy <a class="externalLink" href="http://groovy.codehaus.org/Installing+Groovy">installed</a>. See below for a description of what is required for <tt>{JARs required by the DSL from lib and dep}</tt>.</p>
+<div class="source"><pre>java -jar bin/shell.jar samples/ExamplePutFile.groovy
+groovy -classpath {JARs required by the DSL from lib and dep} samples/ExamplePutFile.groovy
+</pre></div><p>The interactive shell isn't exactly equivalent. However the only difference is that the shell.jar automatically executes some additional imports that are useful for the KnoxShell DSL. So these two sets of commands should be functionally equivalent. <b><i>However there is currently a class loading issue that prevents the groovysh command from working properly.</i></b></p>
 <div class="source"><pre>java -jar bin/shell.jar
 
-groovysh -cp bin/shell-${gateway-version}.jar
+groovysh -classpath {JARs required by the DSL from lib and dep}
 import org.apache.hadoop.gateway.shell.Hadoop
 import org.apache.hadoop.gateway.shell.hdfs.Hdfs
 import org.apache.hadoop.gateway.shell.job.Job
 import org.apache.hadoop.gateway.shell.workflow.Workflow
 import java.util.concurrent.TimeUnit
 </pre></div><p>Alternatively, you can use the Groovy Console which does not appear to have the same class loading issue.</p>
-<div class="source"><pre>groovyConsole -cp bin/shell.jar
+<div class="source"><pre>groovyConsole -classpath {JARs required by the DSL from lib and dep}
 
 import org.apache.hadoop.gateway.shell.Hadoop
 import org.apache.hadoop.gateway.shell.hdfs.Hdfs
 import org.apache.hadoop.gateway.shell.job.Job
 import org.apache.hadoop.gateway.shell.workflow.Workflow
 import java.util.concurrent.TimeUnit
-</pre></div><p>In addition because the DSL can be used via standard Groovy, the Groovy integrations in many popular IDEs (e.g. IntelliJ , Eclipse) can also be used. This makes it particularly nice to develop and execute scripts to interact with Hadoop. The code-completion feature in particular provides immense value. All that is required is to add the shell-0.2.0.jar to the projects class path.</p><p>There are a variety of Groovy tools that make it very easy to work with the standard interchange formats (i.e. JSON and XML). In Groovy the creation of XML or JSON is typically done via a builder and parsing done via a slurper. In addition once JSON or XML is slurped the GPath, an XPath like feature build into Groovy can be used to access data. * XML  * Markup Builder <a class="externalLink" href="http://groovy.codehaus.org/Creating+XML+using+Groovy&apos;s+MarkupBuilder">Overview</a>, <a class="externalLink" href="http://groovy.codehaus.org/api/groovy/xml/MarkupBuilder.html">API
 </a>  * XML Slurper <a class="externalLink" href="http://groovy.codehaus.org/Reading+XML+using+Groovy&apos;s+XmlSlurper">Overview</a>, <a class="externalLink" href="http://groovy.codehaus.org/api/groovy/util/XmlSlurper.html">API</a>  * XPath <a class="externalLink" href="http://groovy.codehaus.org/GPath">Overview</a>, <a class="externalLink" href="http://docs.oracle.com/javase/1.5.0/docs/api/javax/xml/xpath/XPath.html">API</a> * JSON  * JSON Builder <a class="externalLink" href="http://groovy.codehaus.org/gapi/groovy/json/JsonBuilder.html">API</a>  * JSON Slurper <a class="externalLink" href="http://groovy.codehaus.org/gapi/groovy/json/JsonSlurper.html">API</a>  * JSON Path <a class="externalLink" href="https://code.google.com/p/json-path/">API</a> * GPath <a class="externalLink" href="http://groovy.codehaus.org/GPath">Overview</a></p></div><div class="section"><h2>Disclaimer<a name="Disclaimer"></a></h2><p>The Apache Knox Gateway is an effort undergoing incubation at the Ap
 ache Software Foundation (ASF), sponsored by the Apache Incubator PMC.</p><p>Incubation is required of all newly accepted projects until a further review indicates that the infrastructure, communications, and decision making process have stabilized in a manner consistent with other successful ASF projects.</p><p>While incubation status is not necessarily a reflection of the completeness or stability of the code, it does indicate that the project has yet to be fully endorsed by the ASF.</p></div>
+</pre></div><p>The list of JARs currently required by the DSL is</p>
+<div class="source"><pre>lib/gateway-shell-0.2.0-SNAPSHOT.jar
+dep/httpclient-4.2.3.jar
+dep/httpcore-4.2.2.jar
+dep/commons-lang3-3.1.jar
+dep/commons-codec-1.7.jar
+</pre></div><p>So on Linux/MacOS you would need this command</p>
+<div class="source"><pre>groovy -cp lib/gateway-shell-0.2.0-SNAPSHOT.jar:dep/httpclient-4.2.3.jar:dep/httpcore-4.2.2.jar:dep/commons-lang3-3.1.jar:dep/commons-codec-1.7.jar samples/ExamplePutFile.groovy
+</pre></div><p>and on Windows you would need this command</p>
+<div class="source"><pre>groovy -cp lib/gateway-shell-0.2.0-SNAPSHOT.jar;dep/httpclient-4.2.3.jar;dep/httpcore-4.2.2.jar;dep/commons-lang3-3.1.jar;dep/commons-codec-1.7.jar samples/ExamplePutFile.groovy
+</pre></div><p>The exact list of required JARs is likely to change from release to release, so it is recommended that you utilize the wrapper <tt>bin/shell.jar</tt>.</p><p>In addition because the DSL can be used via standard Groovy, the Groovy integrations in many popular IDEs (e.g. IntelliJ, Eclipse) can also be used. This makes it particularly nice to develop and execute scripts to interact with Hadoop. The code-completion feature in particular provides immense value. All that is required is to add the shell-0.2.0.jar to the project's class path.</p><p>There are a variety of Groovy tools that make it very easy to work with the standard interchange formats (i.e. JSON and XML). In Groovy the creation of XML or JSON is typically done via a builder and parsing done via a slurper. In addition, once JSON or XML is slurped, GPath, an XPath-like feature built into Groovy, can be used to access data. * XML  * Markup Builder <a class="externalLink" href="http://groovy.codehaus.org/Creating+XML+using+Groovy&apos;s+MarkupBuilder">Overview</a>, <a class="externalLink" href="http://groovy.codehaus.org/api/groovy/xml/MarkupBuilder.html">API</a>  * XML Slurper <a class="externalLink" href="http://groovy.codehaus.org/Reading+XML+using+Groovy&apos;s+XmlSlurper">Overview</a>, <a class="externalLink" href="http://groovy.codehaus.org/api/groovy/util/XmlSlurper.html">API</a>  * XPath <a class="externalLink" href="http://groovy.codehaus.org/GPath">Overview</a>, <a class="externalLink" href="http://docs.oracle.com/javase/1.5.0/docs/api/javax/xml/xpath/XPath.html">API</a> * JSON  * JSON Builder <a class="externalLink" href="http://groovy.codehaus.org/gapi/groovy/json/JsonBuilder.html">API</a>  * JSON Slurper <a class="externalLink" href="http://groovy.codehaus.org/gapi/groovy/json/JsonSlurper.html">API</a>  * JSON Path <a class="externalLink" href="https://code.google.com/p/json-path/">API</a> * GPath <a class="externalLink" href="http://groovy.codehaus.org/GPath">Overview</a></p></div><div class="section"><h2>Disclaimer<a name="Disclaimer"></a></h2><p>The Apache Knox Gateway is an effort undergoing incubation at the Apache Software Foundation (ASF), sponsored by the Apache Incubator PMC.</p><p>Incubation is required of all newly accepted projects until a further review indicates that the infrastructure, communications, and decision making process have stabilized in a manner consistent with other successful ASF projects.</p><p>While incubation status is not necessarily a reflection of the completeness or stability of the code, it does indicate that the project has yet to be fully endorsed by the ASF.</p></div>
       </div>
     </div>
     <div class="clear">
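
The JSON tooling listed above pairs with DSL responses as in this sketch, which reuses the ls example from earlier on this page; the hadoop variable is the session created by Hadoop.login.

    import groovy.json.JsonSlurper
    import org.apache.hadoop.gateway.shell.hdfs.Hdfs

    // Fetch the WebHDFS directory listing as JSON text (string reads the body).
    text = Hdfs.ls( hadoop ).dir( "/tmp/example" ).now().string
    // Slurp the JSON and walk it with a GPath expression.
    json = (new JsonSlurper()).parseText( text )
    println json.FileStatuses.FileStatus.pathSuffix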

Modified: incubator/knox/site/contribute-process.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/contribute-process.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/contribute-process.html (original)
+++ incubator/knox/site/contribute-process.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/examples.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/examples.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/examples.html (original)
+++ incubator/knox/site/examples.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>
@@ -181,8 +184,6 @@ limitations under the License. --><p></p
   <li>samples/ExampleSubmitWorkflow.groovy</li>
 </ul><p>If you are using the Sandbox VM for your Hadoop cluster you may want to review <a href="sandbox.html">these configuration tips</a>.</p><p></p></div><div class="section"><h2>Example #1: WebHDFS &amp; Templeton/WebHCat via KnoxShell DSL<a name="Example_1:_WebHDFS__TempletonWebHCat_via_KnoxShell_DSL"></a></h2><p>This example will submit the familiar WordCount Java MapReduce job to the Hadoop cluster via the gateway using the KnoxShell DSL. There are several ways to do this depending upon your preference.</p><p>You can use the embedded Groovy interpreter provided with the distribution.</p>
 <div class="source"><pre>java -jar bin/shell.jar samples/ExampleSubmitJob.groovy
-</pre></div><p>You can load the KnoxShell DSL script into the standard Groovy Console.</p>
-<div class="source"><pre>groovyConsole -cp bin/shell-${gateway-version}.jar samples/ExampleSubmitJob.groovy
 </pre></div><p>You can manually type in the KnoxShell DSL script into the embedded Groovy interpreter provided with the distribution.</p>
 <div class="source"><pre>java -jar bin/shell.jar
 </pre></div><p>Each line from the file below will need to be typed or copied into the interactive shell.</p><p><b><i>samples/ExampleSubmitJob</i></b></p>
@@ -223,12 +224,16 @@ count = 0
 while( !done &amp;&amp; count++ &lt; 60 ) {
   sleep( 1000 )
   json = Job.queryStatus(hadoop).jobId(jobId).now().string
-  done = JsonPath.read( json, &quot;.status&quot; )
+  done = JsonPath.read( json, &quot;\$.status.jobComplete&quot; )
 }
 println &quot;Done &quot; + done
 
 println &quot;Shutdown &quot; + hadoop.shutdown( 10, SECONDS )
-</pre></div><p></p></div><div class="section"><h2>Example #2: WebHDFS &amp; Oozie via KnoxShell DSL<a name="Example_2:_WebHDFS__Oozie_via_KnoxShell_DSL"></a></h2><p>This example will also submit the familiar WordCount Java MapReduce job to the Hadoop cluster via the gateway using the KnoxShell DSL. However in this case the job will be submitted via a Oozie workflow. There are several ways to do this depending upon your preference.</p><p>You can use the embedded Groovy interpreter provided with the distribution.  java -jar bin/shell.jar samples/ExampleSubmitWorkflow.groovy</p><p>You can load the KnoxShell DSL script into the standard Groovy Console.  groovyConsole -cp bin/shell-${gateway-version}.jar samples/ExampleSubmitWorkflow.groovy</p><p>You can manually type in the KnoxShell DSL script into the embedded Groovy interpreter provided with the distribution.</p>
+
+exit
+</pre></div><p></p></div><div class="section"><h2>Example #2: WebHDFS &amp; Oozie via KnoxShell DSL<a name="Example_2:_WebHDFS__Oozie_via_KnoxShell_DSL"></a></h2><p>This example will also submit the familiar WordCount Java MapReduce job to the Hadoop cluster via the gateway using the KnoxShell DSL. However in this case the job will be submitted via an Oozie workflow. There are several ways to do this depending upon your preference.</p><p>You can use the embedded Groovy interpreter provided with the distribution.</p>
+<div class="source"><pre>java -jar bin/shell.jar samples/ExampleSubmitWorkflow.groovy
+</pre></div><p>You can manually type in the KnoxShell DSL script into the embedded Groovy interpreter provided with the distribution.</p>
 <div class="source"><pre>java -jar bin/shell.jar
 </pre></div><p>Each line from the file below will need to be typed or copied into the interactive shell.</p><p><b><i>samples/ExampleSubmitWorkflow.groovy</i></b></p>
 <div class="source"><pre>import com.jayway.jsonpath.JsonPath
@@ -300,12 +305,14 @@ count = 0;
 while( status != &quot;SUCCEEDED&quot; &amp;&amp; count++ &lt; 60 ) {
   sleep( 1000 )
   json = Workflow.status(hadoop).jobId( jobId ).now().string
-  status = JsonPath.read( json, &quot;.status&quot; )
+  status = JsonPath.read( json, &quot;\$.status&quot; )
 }
 println &quot;Job status &quot; + status;
 
 println &quot;Shutdown &quot; + hadoop.shutdown( 10, SECONDS )
-</pre></div><p></p></div><div class="section"><h2>Example #3: WebHDFS &amp; Templeton/WebHCat via cURL<a name="Example_3:_WebHDFS__TempletonWebHCat_via_cURL"></a></h2><p>The example below illustrates the sequence of curl commands that could be used to run a word count map reduce job. It utilizes the hadoop-examples.jar from a Hadoop install for running a simple word count job. Take care to follow the instructions below for steps 4/5 and 6/7 where the Location header returned by the call to the NameNode is copied for use with the call to the DataNode that follows it.</p>
+
+exit
+</pre></div><p></p></div><div class="section"><h2>Example #3: WebHDFS &amp; Templeton/WebHCat via cURL<a name="Example_3:_WebHDFS__TempletonWebHCat_via_cURL"></a></h2><p>The example below illustrates the sequence of curl commands that could be used to run a word count map reduce job. It utilizes the hadoop-examples.jar from a Hadoop install for running a simple word count job. A copy of that jar has been included in the samples directory for convenience. Take care to follow the instructions below for steps 4/5 and 6/7 where the Location header returned by the call to the NameNode is copied for use with the call to the DataNode that follows it. These replacement values are identified with { } markup.</p>
 <div class="source"><pre># 0. Optionally cleanup the test directory in case a previous example was run without cleaning up.
 curl -i -k -u mapred:mapred-password -X DELETE \
   'https://localhost:8443/gateway/sample/namenode/api/v1/tmp/test?op=DELETE&amp;recursive=true'
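
For comparison with the curl sequence that follows, the two-step WebHDFS upload (create the inode, then PUT to the returned Location header) is a single call from the client's point of view when using the KnoxShell DSL. A sketch, assuming the mapred user and the samples/hadoop-examples.jar bundled in the distribution:

    import org.apache.hadoop.gateway.shell.Hadoop
    import org.apache.hadoop.gateway.shell.hdfs.Hdfs

    hadoop = Hadoop.login( "https://localhost:8443/gateway/sample", "mapred", "mapred-password" )
    // One DSL call corresponds to curl steps 3 and 4 below (inode creation plus upload).
    println Hdfs.put( hadoop ).file( "samples/hadoop-examples.jar" ).to( "/tmp/test/hadoop-examples.jar" ).now().statusCode
    hadoop.shutdown()
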
@@ -323,7 +330,7 @@ curl -i -k -u mapred:mapred-password -X 
   'https://localhost:8443/gateway/sample/namenode/api/v1/tmp/test/hadoop-examples.jar?op=CREATE'
 
 # 4. Upload hadoop-examples.jar to /tmp/test.  Use a hadoop-examples.jar from a Hadoop install.
-curl -i -k -u mapred:mapred-password -T hadoop-examples.jar -X PUT '{Value Location header from command above}'
+curl -i -k -u mapred:mapred-password -T samples/hadoop-examples.jar -X PUT '{Value Location header from command above}'
 
 # 5. Create the inode for a sample file README in /tmp/test/input
 curl -i -k -u mapred:mapred-password -X PUT \
@@ -354,7 +361,7 @@ curl -i -k -u mapred:mapred-password -X 
 # 11. Optionally cleanup the test directory
 curl -i -k -u mapred:mapred-password -X DELETE \
   'https://localhost:8443/gateway/sample/namenode/api/v1/tmp/test?op=DELETE&amp;recursive=true'
-</pre></div><p></p></div><div class="section"><h2>Example #4: WebHDFS &amp; Oozie via cURL<a name="Example_4:_WebHDFS__Oozie_via_cURL"></a></h2><p>The example below illustrates the sequence of curl commands that could be used to run a word count map reduce job via an Oozie workflow. It utilizes the hadoop-examples.jar from a Hadoop install for running a simple word count job. Take care to follow the instructions below where replacement values are required. These replacement values are identivied with { } markup.</p>
+</pre></div><p></p></div><div class="section"><h2>Example #4: WebHDFS &amp; Oozie via cURL<a name="Example_4:_WebHDFS__Oozie_via_cURL"></a></h2><p>The example below illustrates the sequence of curl commands that could be used to run a word count map reduce job via an Oozie workflow. It utilizes the hadoop-examples.jar from a Hadoop install for running a simple word count job. A copy of that jar has been included in the samples directory for convenience. Take care to follow the instructions below where replacement values are required. These replacement values are identified with { } markup.</p>
 <div class="source"><pre># 0. Optionally cleanup the test directory in case a previous example was run without cleaning up.
 curl -i -k -u mapred:mapred-password -X DELETE \
   'https://localhost:8443/gateway/sample/namenode/api/v1/tmp/test?op=DELETE&amp;recursive=true'
@@ -372,7 +379,7 @@ curl -i -k -u mapred:mapred-password -X 
   'https://localhost:8443/gateway/sample/namenode/api/v1/tmp/test/lib/hadoop-examples.jar?op=CREATE'
 
 # 4. Upload hadoop-examples.jar to /tmp/test/lib.  Use a hadoop-examples.jar from a Hadoop install.
-curl -i -k -u mapred:mapred-password -T hadoop-examples.jar -X PUT \
+curl -i -k -u mapred:mapred-password -T samples/hadoop-examples.jar -X PUT \
   '{Value Location header from command above}'
 
 # 5. Create the inode for a sample input file readme.txt in /tmp/test/input.
@@ -397,7 +404,7 @@ sed -e s/REPLACE.NAMENODE.RPCHOSTPORT/{N
 # 8. Submit the job via Oozie
 # Take note of the Job ID in the JSON response as this will be used in the next step.
 curl -i -k -u mapred:mapred-password -T workflow-configuration.xml -H Content-Type:application/xml -X POST \
-  'https://localhost:8443/gateway/oozie/sample/api/v1/jobs?action=start'
+  'https://localhost:8443/gateway/sample/oozie/api/v1/jobs?action=start'
 
 # 9. Query the job status via Oozie.
 curl -i -k -u mapred:mapred-password -X GET \

Modified: incubator/knox/site/getting-started.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/getting-started.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/getting-started.html (original)
+++ incubator/knox/site/getting-started.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>
@@ -161,10 +164,10 @@ Unless required by applicable law or agr
 distributed under the License is distributed on an "AS IS" BASIS,
 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 See the License for the specific language governing permissions and
-limitations under the License. --><p></p><div class="section"><h2>Apache Knox Gateway - Getting Started<a name="Apache_Knox_Gateway_-_Getting_Started"></a></h2><p>This guide describes the steps required to install, deploy and validate the Apache Knox Gateway.</p><p></p></div><div class="section"><h2>Requirements<a name="Requirements"></a></h2><p>The following prerequisites must be installed to successfully complete the steps described in this guide.</p><div class="section"><h3>Java<a name="Java"></a></h3><p>Java 1.6 or later</p></div><div class="section"><h3>Hadoop<a name="Hadoop"></a></h3><p>A local installation of a Hadoop Cluster is required at this time. Hadoop EC2 cluster and/or Sandbox installations are currently difficult to access remotely via the Gateway. The EC2 and Sandbox limitation is caused by Hadoop services running with internal IP addresses. For the Gateway to work in these cases it will need to be deployed on the EC2 cluster or Sandbox, at this time.</p><p>
 The instructions that follow assume that the Gateway is <i>not</i> collocated with the Hadoop clusters themselves and (most importantly) that the hostnames and IP addresses of the cluster services are accessible by the gateway where ever it happens to be running.</p><p>The Hadoop cluster should be ensured to have WebHDFS, WebHCat (i.e. Templeton) and Oozie configured, deployed and running.</p><p>This release of the Apache Knox Gateway has been tested against the <a class="externalLink" href="http://hortonworks.com/products/hortonworks-sandbox/">Hortonworks Sandbox 1.2</a> with <a href="sandbox.html">these changes</a>.</p><p></p></div></div><div class="section"><h2>Installation<a name="Installation"></a></h2><div class="section"><h3>1. Extract the distribution ZIP<a name="a1._Extract_the_distribution_ZIP"></a></h3><p>Download and extract the gateway-${gateway-version}.zip file into the installation directory that will contain your <tt>{GATEWAY_HOME}</tt></p>
-<div class="source"><pre>jar xf gateway-${gateway-version}.zip
-</pre></div><p>This will create a directory <tt>gateway-${gateway-version}</tt> in your current directory.</p></div><div class="section"><h3>2. Enter the <tt>{GATEWAY_HOME}</tt> directory<a name="a2._Enter_the_GATEWAY_HOME_directory"></a></h3>
-<div class="source"><pre>cd gateway-${gateway-version}
+limitations under the License. --><p></p><div class="section"><h2>Apache Knox Gateway - Getting Started<a name="Apache_Knox_Gateway_-_Getting_Started"></a></h2><p>This guide describes the steps required to install, deploy and validate the Apache Knox Gateway.</p><p></p></div><div class="section"><h2>Requirements<a name="Requirements"></a></h2><p>The following prerequisites must be installed to successfully complete the steps described in this guide.</p><div class="section"><h3>Java<a name="Java"></a></h3><p>Java 1.6 or later</p></div><div class="section"><h3>Hadoop<a name="Hadoop"></a></h3><p>A local installation of a Hadoop Cluster is required at this time. Hadoop EC2 cluster and/or Sandbox installations are currently difficult to access remotely via the Gateway. The EC2 and Sandbox limitation is caused by Hadoop services running with internal IP addresses. For the Gateway to work in these cases it will need to be deployed on the EC2 cluster or Sandbox at this time.</p><p>The instructions that follow assume that the Gateway is <i>not</i> collocated with the Hadoop clusters themselves and (most importantly) that the hostnames and IP addresses of the cluster services are accessible by the gateway wherever it happens to be running.</p><p>Ensure that the Hadoop cluster has WebHDFS, WebHCat (i.e. Templeton) and Oozie configured, deployed and running.</p><p>This release of the Apache Knox Gateway has been tested against the <a class="externalLink" href="http://hortonworks.com/products/hortonworks-sandbox/">Hortonworks Sandbox 1.2</a> with <a href="sandbox.html">these changes</a>.</p><p></p></div></div><div class="section"><h2>Installation<a name="Installation"></a></h2><div class="section"><h3>1. Extract the distribution ZIP<a name="a1._Extract_the_distribution_ZIP"></a></h3><p>Download and extract the knox-0.2.0-SNAPSHOT.zip file into the installation directory that will contain your <tt>{GATEWAY_HOME}</tt></p>
+<div class="source"><pre>jar xf knox-0.2.0-SNAPSHOT.zip
+</pre></div><p>This will create a directory <tt>knox-0.2.0-SNAPSHOT</tt> in your current directory.</p></div><div class="section"><h3>2. Enter the <tt>{GATEWAY_HOME}</tt> directory<a name="a2._Enter_the_GATEWAY_HOME_directory"></a></h3>
+<div class="source"><pre>cd knox-0.2.0-SNAPSHOT
 </pre></div><p>The fully qualified name of this directory will be referenced as <tt>{GATEWAY_HOME}</tt> throughout the remainder of this document.</p></div><div class="section"><h3>3. Start the demo LDAP server (ApacheDS)<a name="a3._Start_the_demo_LDAP_server_ApacheDS"></a></h3><p>First, understand that the LDAP server provided here is for demonstration purposes. You may configure the LDAP specifics within the topology descriptor for the cluster as described in step 5 below, in order to customize what LDAP instance to use. The assumption is that most users will leverage the demo LDAP server while evaluating this release and should therefore continue with the instructions here in step 3.</p><p>Edit <tt>{GATEWAY_HOME}/conf/users.ldif</tt> if required and add your users and groups to the file. A number of normal Hadoop users (e.g. hdfs, mapred, hcat, hive) have already been included. Note that the passwords in this file are fictitious and have nothing to do with the actual accounts on the Hadoop cluster you are using. There is also a copy of this file in the templates directory that you can use to start over if necessary.</p><p>Start the LDAP server - pointing it to the config dir where it will find the users.ldif file in the conf directory.</p>
 <div class="source"><pre>java -jar bin/ldap.jar conf &amp;
 </pre></div><p>There are a number of log messages of the form <tt>Created null.</tt> that can safely be ignored. Take note of the port on which it was started as this needs to match later configuration. This will create a directory named org.apache.hadoop.gateway.security.EmbeddedApacheDirectoryServer that can safely be ignored.</p></div><div class="section"><h3>4. Start the Gateway server<a name="a4._Start_the_Gateway_server"></a></h3>
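
Once the gateway server is running, a minimal smoke test can be run from {GATEWAY_HOME} with the bundled shell. This sketch assumes the sample cluster topology and the demo LDAP hdfs user described above.

    java -jar bin/shell.jar
    knox:000> hadoop = Hadoop.login( "https://localhost:8443/gateway/sample", "hdfs", "hdfs-password" )
    knox:000> println Hdfs.ls( hadoop ).dir( "/" ).now().statusCode
    knox:000> hadoop.shutdown()
    knox:000> exit

A status code of 200 should indicate that requests are flowing through the gateway to WebHDFS.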

Modified: incubator/knox/site/index.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/index.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/index.html (original)
+++ incubator/knox/site/index.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/issue-tracking.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/issue-tracking.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/issue-tracking.html (original)
+++ incubator/knox/site/issue-tracking.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/license.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/license.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/license.html (original)
+++ incubator/knox/site/license.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/mail-lists.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/mail-lists.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/mail-lists.html (original)
+++ incubator/knox/site/mail-lists.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/news.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/news.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/news.html (original)
+++ incubator/knox/site/news.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/privacy-policy.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/privacy-policy.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/privacy-policy.html (original)
+++ incubator/knox/site/privacy-policy.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/project-info.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/project-info.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/project-info.html (original)
+++ incubator/knox/site/project-info.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/release-0-2-0.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/release-0-2-0.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/release-0-2-0.html (original)
+++ incubator/knox/site/release-0-2-0.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/release-process.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/release-process.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/release-process.html (original)
+++ incubator/knox/site/release-process.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/roadmap-0-3-0.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/roadmap-0-3-0.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/roadmap-0-3-0.html (original)
+++ incubator/knox/site/roadmap-0-3-0.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/team-list.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/team-list.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/team-list.html (original)
+++ incubator/knox/site/team-list.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/site/template.html
URL: http://svn.apache.org/viewvc/incubator/knox/site/template.html?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/site/template.html (original)
+++ incubator/knox/site/template.html Wed Mar 27 18:15:34 2013
@@ -44,7 +44,7 @@
               
                 
                 &nbsp;| <span id="publishDate">Last Published: 2013-03-27</span>
-              &nbsp;| <span id="projectVersion">Version: 0.0.0-SNAPSHOT</span>
+              &nbsp;| <span id="projectVersion">Version: 0.2.0-SNAPSHOT</span>
             </div>
       <div class="clear">
         <hr/>
@@ -83,6 +83,9 @@
                   <li class="none">
                           <a href="https://cwiki.apache.org/confluence/display/KNOX/Index" class="externalLink" title="Wiki">Wiki</a>
             </li>
+                  <li class="none">
+                          <a href="https://cwiki.apache.org/KNOX/dependencies.html" class="externalLink" title="Dependencies">Dependencies</a>
+            </li>
           </ul>
                        <h5>Releases</h5>
                   <ul>

Modified: incubator/knox/trunk/pom.xml
URL: http://svn.apache.org/viewvc/incubator/knox/trunk/pom.xml?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/trunk/pom.xml (original)
+++ incubator/knox/trunk/pom.xml Wed Mar 27 18:15:34 2013
@@ -23,16 +23,18 @@
     <modelVersion>4.0.0</modelVersion>
     <groupId>org.apache.hadoop</groupId>
     <artifactId>gateway-site</artifactId>
-    <version>0.0.0-SNAPSHOT</version>
+    <version>0.2.0-SNAPSHOT</version>
 
     <name>Apache Knox Gateway</name>
     <description>Knox is a gateway for Hadoop clusters.</description>
     <url>http://incubator.apache.org/knox</url>
 
     <properties>
+        <SDS>\$</SDS>
         <HHH>###</HHH>
         <HHHH>####</HHHH>
         <HHHHH>#####</HHHHH>
+        <gateway-version>0.2.0-SNAPSHOT</gateway-version>
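+        <!-- Editorial note (assumption): SDS appears to be a helper for
+             emitting a literal dollar sign from the *.md.vm Velocity
+             templates, e.g. ${SDS}.status in a template should render as
+             $.status in the generated page; gateway-version is interpolated
+             wherever the docs reference the release version. -->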
     </properties>
 
     <licenses>

Modified: incubator/knox/trunk/src/site/markdown/client.md.vm
URL: http://svn.apache.org/viewvc/incubator/knox/trunk/src/site/markdown/client.md.vm?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/trunk/src/site/markdown/client.md.vm (original)
+++ incubator/knox/trunk/src/site/markdown/client.md.vm Wed Mar 27 18:15:34 2013
@@ -64,14 +64,10 @@ The DSL requires a shell to interpret th
 The shell can either be used interactively or to execute a script file.
 To simplify use, the distribution contains an embedded version of the Groovy shell.
 
-The shell can be run interactively.
+The shell can be run interactively.  Use the `exit` command to leave it.
 
     java -jar bin/shell.jar
 
-The shell can also be used to execute a script by passing a single filename argument.
-
-    java -jar bin/shell.jar sample/SmokeTestJob.groovy
-
 When running interactively it may be helpful to reduce some of the output generated by the shell console.
 Use the following command in the interactive shell to reduce that output.
 This only needs to be done once as these preferences are persisted.
@@ -82,6 +78,10 @@ This only needs to be done once as these
 Also, when running interactively, use the `exit` command to terminate the shell.
 Using `^C` to exit can sometimes leave the parent shell in a problematic state.
 
+The shell can also be used to execute a script by passing a single filename argument.
+
+    java -jar bin/shell.jar samples/ExamplePutFile.groovy
+
 
 Examples
 --------
@@ -95,7 +95,12 @@ Below is a very simple example of an int
 The `knox:000>` in the example above is the prompt from the embedded Groovy console.
 If your output doesn't look like this you may need to set the verbosity and show-last-result preferences as described above in the Usage section.
 
-Without using some other tool to browse HDFS it is impossible to tell that that this command did anything.
+If you receive an `HTTP/1.1 403 Forbidden` error it may be because that file already exists.
+Try deleting it with the following command and then try again.
+
+    knox:000> Hdfs.rm(hadoop).file("/tmp/example/README").now()
+
+Without using some other tool to browse HDFS it is hard to tell that this command did anything.
 Execute this to get a bit more feedback.
 
     knox:000> println "Status=" + Hdfs.put( hadoop ).file( "README" ).to( "/tmp/example/README2" ).now().statusCode
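+
+If the file was created successfully this should print `Status=201`, since
+WebHDFS responds to a successful create with `HTTP/1.1 201 Created`.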
@@ -133,7 +138,7 @@ The shutdown command attempts to ensures
 
 All of the commands above could have been combined into a script file and executed as a single line.
 
-    java -jar bin/shell.jar samples/Example.groovy
+    java -jar bin/shell.jar samples/ExamplePutFile.groovy
 
 This script file is available in the distribution, but for convenience its content is shown here.
 
@@ -153,6 +158,7 @@ This script file is available in the dis
     json = (new JsonSlurper()).parseText( text )
     println json.FileStatuses.FileStatus.pathSuffix
     hadoop.shutdown()
+    exit
 
 Notice the Hdfs.rm command.  This is included simply to ensure that the script can be rerun.
 Without it, an error would result the second time the script is run.
@@ -417,8 +423,8 @@ In order to add new service and commands
 Fortunately there is a very simple way to add classes and JARs to the shell classpath.
 The first time the shell is executed it will create a configuration file in the same directory as the JAR with the same base name and a `.cfg` extension.
 
-    bin/shell-${gateway-version}.jar
-    bin/shell-${gateway-version}.cfg
+    bin/shell.jar
+    bin/shell.cfg
 
 That file contains both the main class for the shell and a definition of the classpath.
 Currently that file will by default contain the following.
@@ -434,7 +440,7 @@ These happen to be Groovy source files b
 The easiest way to add these to the shell is to compile them directly into the `ext` directory.
 *Note: This command depends upon having the Groovy compiler installed and available on the execution path.*
 
-    groovyc -d ext -cp bin/shell-${gateway-version}.jar samples/SampleService.groovy samples/SampleSimpleCommand.groovy samples/SampleComplexCommand.groovy
+    groovyc -d ext -cp bin/shell.jar samples/SampleService.groovy samples/SampleSimpleCommand.groovy samples/SampleComplexCommand.groovy
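+
+Assuming the sample sources declare no package, the `ext` directory should then
+contain class files along these lines.
+
+    ext/SampleService.class
+    ext/SampleSimpleCommand.class
+    ext/SampleComplexCommand.class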
 
 These source files are available in the samples directory of the distribution but are included here for convenience.
 
@@ -549,19 +555,22 @@ ${HHH} Sample Complex Command (Groovy)
 Groovy
 ------
 The shell included in the distribution is basically an unmodified packaging of the Groovy shell.
+The distribution does, however, provide a wrapper that makes it very easy to set up the class path for the shell.
+In fact, the JARs required to execute the DSL are included on the class path by default.
 Therefore these commands are functionally equivalent if you have Groovy [installed][15].
+See below for a description of what is required in place of `{JARs required by the DSL from lib and dep}`.
 
-    java -jar bin/shell.jar sample/SmokeTestJob.groovy
-    groovy -cp bin/shell.jar sample/SmokeTestJob.groovy
+    java -jar bin/shell.jar samples/ExamplePutFile.groovy
+    groovy -classpath {JARs required by the DSL from lib and dep} samples/ExamplePutFile.groovy
 
 The interactive shell isn't exactly equivalent.
-However the only difference is that the shell-${gateway-version}.jar automatically executes some additional imports that are useful for the KnoxShell DSL.
+However, the only difference is that the shell.jar automatically executes some additional imports that are useful for the KnoxShell DSL.
 So these two sets of commands should be functionally equivalent.
 ***However, there is currently a class loading issue that prevents the groovysh command from working properly.***
 
     java -jar bin/shell.jar
 
-    groovysh -cp bin/shell-${gateway-version}.jar
+    groovysh -classpath {JARs required by the DSL from lib and dep}
     import org.apache.hadoop.gateway.shell.Hadoop
     import org.apache.hadoop.gateway.shell.hdfs.Hdfs
     import org.apache.hadoop.gateway.shell.job.Job
@@ -570,7 +579,7 @@ So these two sets of commands should be 
 
 Alternatively, you can use the Groovy Console which does not appear to have the same class loading issue.
 
-    groovyConsole -cp bin/shell.jar
+    groovyConsole -classpath {JARs required by the DSL from lib and dep}
 
     import org.apache.hadoop.gateway.shell.Hadoop
     import org.apache.hadoop.gateway.shell.hdfs.Hdfs
@@ -578,6 +587,24 @@ Alternatively, you can use the Groovy Co
     import org.apache.hadoop.gateway.shell.workflow.Workflow
     import java.util.concurrent.TimeUnit
 
+The list of JARs currently required by the DSL is
+
+    lib/gateway-shell-${gateway-version}.jar
+    dep/httpclient-4.2.3.jar
+    dep/httpcore-4.2.2.jar
+    dep/commons-lang3-3.1.jar
+    dep/commons-codec-1.7.jar
+
+So on Linux/MacOS you would need this command
+
+    groovy -cp lib/gateway-shell-0.2.0-SNAPSHOT.jar:dep/httpclient-4.2.3.jar:dep/httpcore-4.2.2.jar:dep/commons-lang3-3.1.jar:dep/commons-codec-1.7.jar samples/ExamplePutFile.groovy
+
+and on Windows you would need this command
+
+    groovy -cp lib/gateway-shell-0.2.0-SNAPSHOT.jar;dep/httpclient-4.2.3.jar;dep/httpcore-4.2.2.jar;dep/commons-lang3-3.1.jar;dep/commons-codec-1.7.jar samples/ExamplePutFile.groovy
+
+The exact list of required JARs is likely to change from release to release, so it is recommended that you use the wrapper `bin/shell.jar`.
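+
+On Linux/MacOS the class path can instead be assembled from whatever is present
+in `lib` and `dep`.  This is only a sketch; it assumes the default distribution
+layout and paths without spaces.
+
+    CP=$(echo lib/gateway-shell-*.jar dep/*.jar | tr ' ' ':')
+    groovy -cp "$CP" samples/ExamplePutFile.groovy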
+
 In addition, because the DSL can be used via standard Groovy, the Groovy integrations in many popular IDEs (e.g. IntelliJ, Eclipse) can also be used.
 This makes it particularly nice to develop and execute scripts to interact with Hadoop.
 The code-completion feature in particular provides immense value.

Modified: incubator/knox/trunk/src/site/markdown/examples.md.vm
URL: http://svn.apache.org/viewvc/incubator/knox/trunk/src/site/markdown/examples.md.vm?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/trunk/src/site/markdown/examples.md.vm (original)
+++ incubator/knox/trunk/src/site/markdown/examples.md.vm Wed Mar 27 18:15:34 2013
@@ -80,10 +80,6 @@ You can use the "embedded" Groovy interp
 
     java -jar bin/shell.jar samples/ExampleSubmitJob.groovy
 
-You can load the KnoxShell DSL script into the standard Groovy Console.
-
-    groovyConsole -cp bin/shell-${gateway-version}.jar samples/ExampleSubmitJob.groovy
-
 You can manually type the KnoxShell DSL script into the "embedded" Groovy
 interpreter provided with the distribution.
 
@@ -131,12 +127,14 @@ interactive shell.
     while( !done && count++ < 60 ) {
       sleep( 1000 )
       json = Job.queryStatus(hadoop).jobId(jobId).now().string
-      done = JsonPath.read( json, "\$.status.jobComplete" )
+      done = JsonPath.read( json, "${SDS}.status.jobComplete" )
     }
     println "Done " + done
 
     println "Shutdown " + hadoop.shutdown( 10, SECONDS )
 
+    exit
+
 ------------------------------------------------------------------------------
 Example #2: WebHDFS & Oozie via KnoxShell DSL
 ------------------------------------------------------------------------------
@@ -146,10 +144,8 @@ the job will be submitted via a Oozie wo
 this depending upon your preference.
 
 You can use the "embedded" Groovy interpreter provided with the distribution.
-    java -jar bin/shell.jar samples/ExampleSubmitWorkflow.groovy
 
-You can load the KnoxShell DSL script into the standard Groovy Console.
-    groovyConsole -cp bin/shell-${gateway-version}.jar samples/ExampleSubmitWorkflow.groovy
+    java -jar bin/shell.jar samples/ExampleSubmitWorkflow.groovy
 
 You can manually type the KnoxShell DSL script into the "embedded" Groovy
 interpreter provided with the distribution.
@@ -230,21 +226,25 @@ interactive shell.
     while( status != "SUCCEEDED" && count++ < 60 ) {
       sleep( 1000 )
       json = Workflow.status(hadoop).jobId( jobId ).now().string
-      status = JsonPath.read( json, "\$.status" )
+      status = JsonPath.read( json, "${SDS}.status" )
     }
     println "Job status " + status;
 
     println "Shutdown " + hadoop.shutdown( 10, SECONDS )
 
+    exit
+
 ------------------------------------------------------------------------------
 Example #3: WebHDFS & Templeton/WebHCat via cURL
 ------------------------------------------------------------------------------
 The example below illustrates the sequence of curl commands that could be used
 to run a "word count" map reduce job.  It utilizes the hadoop-examples.jar
-from a Hadoop install for running a simple word count job.  Take care to
+from a Hadoop install for running a simple word count job.  A copy of that
+jar has been included in the samples directory for convenience.  Take care to
 follow the instructions below for steps 4/5 and 6/7 where the Location header
 returned by the call to the NameNode is copied for use with the call to the
-DataNode that follows it.
+DataNode that follows it.  These replacement values are identified with { }
+markup.
 
     # 0. Optionally clean up the test directory in case a previous example was run without cleaning up.
     curl -i -k -u mapred:mapred-password -X DELETE \
@@ -263,7 +263,7 @@ DataNode that follows it.
       'https://localhost:8443/gateway/sample/namenode/api/v1/tmp/test/hadoop-examples.jar?op=CREATE'
 
     # 4. Upload hadoop-examples.jar to /tmp/test.  A copy is included in the samples directory for convenience.
-    curl -i -k -u mapred:mapred-password -T hadoop-examples.jar -X PUT '{Value Location header from command above}'
+    curl -i -k -u mapred:mapred-password -T samples/hadoop-examples.jar -X PUT '{Value Location header from command above}'
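+    # Aside (a hedged sketch, not part of the original sample): on Linux/MacOS
+    # the Location header from step 3 can be captured automatically instead of
+    # being copied by hand.
+    LOCATION=$(curl -s -i -k -u mapred:mapred-password -X PUT \
+      'https://localhost:8443/gateway/sample/namenode/api/v1/tmp/test/hadoop-examples.jar?op=CREATE' \
+      | tr -d '\r' | awk 'tolower($1)=="location:" {print $2}')
+    curl -i -k -u mapred:mapred-password -T samples/hadoop-examples.jar -X PUT "$LOCATION"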
 
     # 5. Create the inode for a sample file README in /tmp/test/input
     curl -i -k -u mapred:mapred-password -X PUT \
@@ -301,8 +301,9 @@ Example #4: WebHDFS & Oozie via cURL
 The example below illustrates the sequence of curl commands that could be used
 to run a "word count" map reduce job via an Oozie workflow.  It utilizes the
 hadoop-examples.jar from a Hadoop install for running a simple word count job.
+A copy of that jar has been included in the samples directory for convenience.
 Take care to follow the instructions below where replacement values are
-required.  These replacement values are identivied with { } markup.
+required.  These replacement values are identified with { } markup.
 
     # 0. Optionally clean up the test directory in case a previous example was run without cleaning up.
     curl -i -k -u mapred:mapred-password -X DELETE \
@@ -321,7 +322,7 @@ required.  These replacement values are 
       'https://localhost:8443/gateway/sample/namenode/api/v1/tmp/test/lib/hadoop-examples.jar?op=CREATE'
 
     # 4. Upload hadoop-examples.jar to /tmp/test/lib.  A copy is included in the samples directory for convenience.
-    curl -i -k -u mapred:mapred-password -T hadoop-examples.jar -X PUT \
+    curl -i -k -u mapred:mapred-password -T samples/hadoop-examples.jar -X PUT \
       '{Value Location header from command above}'
 
     # 5. Create the inode for a sample input file readme.txt in /tmp/test/input.
@@ -346,7 +347,7 @@ required.  These replacement values are 
     # 8. Submit the job via Oozie
     # Take note of the Job ID in the JSON response as this will be used in the next step.
     curl -i -k -u mapred:mapred-password -T workflow-configuration.xml -H Content-Type:application/xml -X POST \
-      'https://localhost:8443/gateway/oozie/sample/api/v1/jobs?action=start'
+      'https://localhost:8443/gateway/sample/oozie/api/v1/jobs?action=start'
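+
+    # Aside (a hedged sketch, not part of the original sample): assuming Oozie
+    # returns the job id in a top-level "id" JSON field, it can be captured by
+    # dropping -i so that only the JSON body is printed.
+    JOB_ID=$(curl -s -k -u mapred:mapred-password -T workflow-configuration.xml \
+      -H Content-Type:application/xml -X POST \
+      'https://localhost:8443/gateway/sample/oozie/api/v1/jobs?action=start' \
+      | python -c 'import sys,json; print(json.load(sys.stdin)["id"])')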
 
     # 9. Query the job status via Oozie.
     curl -i -k -u mapred:mapred-password -X GET \

Modified: incubator/knox/trunk/src/site/markdown/getting-started.md.vm
URL: http://svn.apache.org/viewvc/incubator/knox/trunk/src/site/markdown/getting-started.md.vm?rev=1461761&r1=1461760&r2=1461761&view=diff
==============================================================================
--- incubator/knox/trunk/src/site/markdown/getting-started.md.vm (original)
+++ incubator/knox/trunk/src/site/markdown/getting-started.md.vm Wed Mar 27 18:15:34 2013
@@ -57,17 +57,17 @@ Installation
 ------------------------------------------------------------------------------
 ${HHH} 1. Extract the distribution ZIP
 
-Download and extract the gateway-${gateway-version}.zip file into the
+Download and extract the knox-${gateway-version}.zip file into the
 installation directory that will contain your `{GATEWAY_HOME}`.
 
-    jar xf gateway-${gateway-version}.zip
+    jar xf knox-${gateway-version}.zip
 
-This will create a directory `gateway-${gateway-version}` in your current
+This will create a directory `knox-${gateway-version}` in your current
 directory.
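+
+If the JDK `jar` tool is not on your path, `unzip knox-${gateway-version}.zip`
+should produce the same result, assuming the `unzip` utility is installed.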
 
 ${HHH} 2. Enter the `{GATEWAY_HOME}` directory
 
-    cd gateway-${gateway-version}
+    cd knox-${gateway-version}
 
 The fully qualified name of this directory will be referenced as
 `{GATEWAY_HOME}` throughout the remainder of this document.