Posted to commits@knox.apache.org by lm...@apache.org on 2017/02/24 17:58:56 UTC

svn commit: r1784309 - in /knox: site/books/knox-0-10-0/ site/books/knox-0-11-0/ site/books/knox-0-12-0/ trunk/books/0.10.0/ trunk/books/0.11.0/ trunk/books/0.12.0/

Author: lmccay
Date: Fri Feb 24 17:58:56 2017
New Revision: 1784309

URL: http://svn.apache.org/viewvc?rev=1784309&view=rev
Log:
moved submitSqoop docs to proper section
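
For context, the `submitSqoop` documentation being relocated below describes a Knox Shell DSL call that fits into the rest of the WebHCat Job API roughly as follows. This is a sketch, not part of the commit: the gateway URL, credentials, JDBC connect string, and status directory are placeholder values, and the import/class names follow the examples elsewhere in the 0.10-0.12 user guides.

    import org.apache.hadoop.gateway.shell.Hadoop
    import org.apache.hadoop.gateway.shell.job.Job

    // Placeholder gateway URL and credentials -- substitute your own.
    gateway = "https://localhost:8443/gateway/sandbox"
    session = Hadoop.login( gateway, "guest", "guest-password" )

    // Placeholder remote directory for Templeton status output.
    remoteStatusDir = "/user/guest/sqoop/status"

    // Submit the Sqoop job via the DSL documented in the moved section.
    jobId = Job.submitSqoop(session) \
        .command("import --connect jdbc:mysql://hostname:3306/dbname --table mytable") \
        .statusDir(remoteStatusDir) \
        .now().jobId
    println "Submitted Sqoop job: " + jobId

    // Poll the job status with the companion queryStatus call, then clean up.
    println Job.queryStatus(session).jobId(jobId).now().string
    session.shutdown()
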

Modified:
    knox/site/books/knox-0-10-0/user-guide.html
    knox/site/books/knox-0-11-0/user-guide.html
    knox/site/books/knox-0-12-0/user-guide.html
    knox/trunk/books/0.10.0/book_client-details.md
    knox/trunk/books/0.10.0/service_webhcat.md
    knox/trunk/books/0.11.0/book_client-details.md
    knox/trunk/books/0.11.0/service_webhcat.md
    knox/trunk/books/0.12.0/book_client-details.md
    knox/trunk/books/0.12.0/service_webhcat.md

Modified: knox/site/books/knox-0-10-0/user-guide.html
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-10-0/user-guide.html?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/site/books/knox-0-10-0/user-guide.html (original)
+++ knox/site/books/knox-0-10-0/user-guide.html Fri Feb 24 17:58:56 2017
@@ -2904,19 +2904,7 @@ json = (new JsonSlurper()).parseText( te
 println json.FileStatuses.FileStatus.pathSuffix
 session.shutdown()
 exit
-</code></pre><p>Notice the <code>Hdfs.rm</code> command. This is included simply to ensure that the script can be rerun. Without this an error would result the second time it is run.</p><h4><a id="submitSqoop+Job+API">submitSqoop Job API</a> <a href="#submitSqoop+Job+API"><img src="markbook-section-link.png"/></a></h4><p>Using the Knox DSL, you can now easily submit and monitor <a href="https://sqoop.apache.org">Apache Sqoop</a> jobs. The WebHCat Job class now supports the <code>submitSqoop</code> command.</p>
-<pre><code>Job.submitSqoop(session)
-    .command(&quot;import --connect jdbc:mysql://hostname:3306/dbname ... &quot;)
-    .statusDir(remoteStatusDir)
-    .now().jobId
-</code></pre><p>The <code>submitSqoop</code> command supports the following arguments:</p>
-<ul>
-  <li>command (String) - The sqoop command string to execute.</li>
-  <li>files (String) - Comma separated files to be copied to the templeton controller job.</li>
-  <li>optionsfile (String) - The remote file which contain Sqoop command need to run.</li>
-  <li>libdir (String) - The remote directory containing jdbc jar to include with sqoop lib</li>
-  <li>statusDir (String) - The remote directory to store status output.</li>
-</ul><p>A complete example is available here: <a href="https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL">https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL</a></p><h3><a id="Futures">Futures</a> <a href="#Futures"><img src="markbook-section-link.png"/></a></h3><p>The DSL supports the ability to invoke commands asynchronously via the later() invocation method. The object returned from the later() method is a java.util.concurrent.Future parameterized with the response type of the command. This is an example of how to asynchronously put a file to HDFS.</p>
+</code></pre><p>Notice the <code>Hdfs.rm</code> command. This is included simply to ensure that the script can be rerun. Without this an error would result the second time it is run.</p><h3><a id="Futures">Futures</a> <a href="#Futures"><img src="markbook-section-link.png"/></a></h3><p>The DSL supports the ability to invoke commands asynchronously via the later() invocation method. The object returned from the later() method is a java.util.concurrent.Future parameterized with the response type of the command. This is an example of how to asynchronously put a file to HDFS.</p>
 <pre><code>future = Hdfs.put(session).file(&quot;README&quot;).to(&quot;/tmp/example/README&quot;).later()
 println future.get().statusCode
 </code></pre><p>The future.get() method will block until the asynchronous command is complete. To illustrate the usefulness of this however multiple concurrent commands are required.</p>
@@ -3415,7 +3403,19 @@ session.shutdown()
   <ul>
     <li><code>Job.submitHive(session).file(remoteHiveFileName).arg(&quot;-v&quot;).statusDir(remoteStatusDir).now()</code></li>
   </ul></li>
-</ul><h5><a id="queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user.">queryQueue() - Return a list of all job IDs registered to the user.</a> <a href="#queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user."><img src="markbook-section-link.png"/></a></h5>
+</ul><h4><a id="submitSqoop+Job+API">submitSqoop Job API</a> <a href="#submitSqoop+Job+API"><img src="markbook-section-link.png"/></a></h4><p>Using the Knox DSL, you can now easily submit and monitor <a href="https://sqoop.apache.org">Apache Sqoop</a> jobs. The WebHCat Job class now supports the <code>submitSqoop</code> command.</p>
+<pre><code>Job.submitSqoop(session)
+    .command(&quot;import --connect jdbc:mysql://hostname:3306/dbname ... &quot;)
+    .statusDir(remoteStatusDir)
+    .now().jobId
+</code></pre><p>The <code>submitSqoop</code> command supports the following arguments:</p>
+<ul>
+  <li>command (String) - The Sqoop command string to execute.</li>
+  <li>files (String) - Comma-separated list of files to be copied to the Templeton controller job.</li>
+  <li>optionsfile (String) - The remote file containing the Sqoop command to run.</li>
+  <li>libdir (String) - The remote directory containing the JDBC JAR to include with the Sqoop lib.</li>
+  <li>statusDir (String) - The remote directory to store status output.</li>
+</ul><p>A complete example is available here: <a href="https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL">https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL</a></p><h5><a id="queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user.">queryQueue() - Return a list of all job IDs registered to the user.</a> <a href="#queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user."><img src="markbook-section-link.png"/></a></h5>
 <ul>
   <li>Request
   <ul>

Modified: knox/site/books/knox-0-11-0/user-guide.html
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-11-0/user-guide.html?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/site/books/knox-0-11-0/user-guide.html (original)
+++ knox/site/books/knox-0-11-0/user-guide.html Fri Feb 24 17:58:56 2017
@@ -2996,19 +2996,7 @@ json = (new JsonSlurper()).parseText( te
 println json.FileStatuses.FileStatus.pathSuffix
 session.shutdown()
 exit
-</code></pre><p>Notice the <code>Hdfs.rm</code> command. This is included simply to ensure that the script can be rerun. Without this an error would result the second time it is run.</p><h4><a id="submitSqoop+Job+API">submitSqoop Job API</a> <a href="#submitSqoop+Job+API"><img src="markbook-section-link.png"/></a></h4><p>Using the Knox DSL, you can now easily submit and monitor <a href="https://sqoop.apache.org">Apache Sqoop</a> jobs. The WebHCat Job class now supports the <code>submitSqoop</code> command.</p>
-<pre><code>Job.submitSqoop(session)
-    .command(&quot;import --connect jdbc:mysql://hostname:3306/dbname ... &quot;)
-    .statusDir(remoteStatusDir)
-    .now().jobId
-</code></pre><p>The <code>submitSqoop</code> command supports the following arguments:</p>
-<ul>
-  <li>command (String) - The sqoop command string to execute.</li>
-  <li>files (String) - Comma separated files to be copied to the templeton controller job.</li>
-  <li>optionsfile (String) - The remote file which contain Sqoop command need to run.</li>
-  <li>libdir (String) - The remote directory containing jdbc jar to include with sqoop lib</li>
-  <li>statusDir (String) - The remote directory to store status output.</li>
-</ul><p>A complete example is available here: <a href="https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL">https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL</a></p><h3><a id="Futures">Futures</a> <a href="#Futures"><img src="markbook-section-link.png"/></a></h3><p>The DSL supports the ability to invoke commands asynchronously via the later() invocation method. The object returned from the later() method is a java.util.concurrent.Future parameterized with the response type of the command. This is an example of how to asynchronously put a file to HDFS.</p>
+</code></pre><p>Notice the <code>Hdfs.rm</code> command. This is included simply to ensure that the script can be rerun. Without this an error would result the second time it is run.</p><h3><a id="Futures">Futures</a> <a href="#Futures"><img src="markbook-section-link.png"/></a></h3><p>The DSL supports the ability to invoke commands asynchronously via the later() invocation method. The object returned from the later() method is a java.util.concurrent.Future parameterized with the response type of the command. This is an example of how to asynchronously put a file to HDFS.</p>
 <pre><code>future = Hdfs.put(session).file(&quot;README&quot;).to(&quot;/tmp/example/README&quot;).later()
 println future.get().statusCode
 </code></pre><p>The future.get() method will block until the asynchronous command is complete. To illustrate the usefulness of this however multiple concurrent commands are required.</p>
@@ -3508,7 +3496,19 @@ session.shutdown()
   <ul>
     <li><code>Job.submitHive(session).file(remoteHiveFileName).arg(&quot;-v&quot;).statusDir(remoteStatusDir).now()</code></li>
   </ul></li>
-</ul><h5><a id="queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user.">queryQueue() - Return a list of all job IDs registered to the user.</a> <a href="#queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user."><img src="markbook-section-link.png"/></a></h5>
+</ul><h4><a id="submitSqoop+Job+API">submitSqoop Job API</a> <a href="#submitSqoop+Job+API"><img src="markbook-section-link.png"/></a></h4><p>Using the Knox DSL, you can now easily submit and monitor <a href="https://sqoop.apache.org">Apache Sqoop</a> jobs. The WebHCat Job class now supports the <code>submitSqoop</code> command.</p>
+<pre><code>Job.submitSqoop(session)
+    .command(&quot;import --connect jdbc:mysql://hostname:3306/dbname ... &quot;)
+    .statusDir(remoteStatusDir)
+    .now().jobId
+</code></pre><p>The <code>submitSqoop</code> command supports the following arguments:</p>
+<ul>
+  <li>command (String) - The Sqoop command string to execute.</li>
+  <li>files (String) - Comma-separated list of files to be copied to the Templeton controller job.</li>
+  <li>optionsfile (String) - The remote file containing the Sqoop command to run.</li>
+  <li>libdir (String) - The remote directory containing the JDBC JAR to include with the Sqoop lib.</li>
+  <li>statusDir (String) - The remote directory to store status output.</li>
+</ul><p>A complete example is available here: <a href="https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL">https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL</a></p><h5><a id="queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user.">queryQueue() - Return a list of all job IDs registered to the user.</a> <a href="#queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user."><img src="markbook-section-link.png"/></a></h5>
 <ul>
   <li>Request
   <ul>

Modified: knox/site/books/knox-0-12-0/user-guide.html
URL: http://svn.apache.org/viewvc/knox/site/books/knox-0-12-0/user-guide.html?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/site/books/knox-0-12-0/user-guide.html (original)
+++ knox/site/books/knox-0-12-0/user-guide.html Fri Feb 24 17:58:56 2017
@@ -2996,19 +2996,7 @@ json = (new JsonSlurper()).parseText( te
 println json.FileStatuses.FileStatus.pathSuffix
 session.shutdown()
 exit
-</code></pre><p>Notice the <code>Hdfs.rm</code> command. This is included simply to ensure that the script can be rerun. Without this an error would result the second time it is run.</p><h4><a id="submitSqoop+Job+API">submitSqoop Job API</a> <a href="#submitSqoop+Job+API"><img src="markbook-section-link.png"/></a></h4><p>Using the Knox DSL, you can now easily submit and monitor <a href="https://sqoop.apache.org">Apache Sqoop</a> jobs. The WebHCat Job class now supports the <code>submitSqoop</code> command.</p>
-<pre><code>Job.submitSqoop(session)
-    .command(&quot;import --connect jdbc:mysql://hostname:3306/dbname ... &quot;)
-    .statusDir(remoteStatusDir)
-    .now().jobId
-</code></pre><p>The <code>submitSqoop</code> command supports the following arguments:</p>
-<ul>
-  <li>command (String) - The sqoop command string to execute.</li>
-  <li>files (String) - Comma separated files to be copied to the templeton controller job.</li>
-  <li>optionsfile (String) - The remote file which contain Sqoop command need to run.</li>
-  <li>libdir (String) - The remote directory containing jdbc jar to include with sqoop lib</li>
-  <li>statusDir (String) - The remote directory to store status output.</li>
-</ul><p>A complete example is available here: <a href="https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL">https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL</a></p><h3><a id="Futures">Futures</a> <a href="#Futures"><img src="markbook-section-link.png"/></a></h3><p>The DSL supports the ability to invoke commands asynchronously via the later() invocation method. The object returned from the later() method is a java.util.concurrent.Future parameterized with the response type of the command. This is an example of how to asynchronously put a file to HDFS.</p>
+</code></pre><p>Notice the <code>Hdfs.rm</code> command. This is included simply to ensure that the script can be rerun. Without this an error would result the second time it is run.</p><h3><a id="Futures">Futures</a> <a href="#Futures"><img src="markbook-section-link.png"/></a></h3><p>The DSL supports the ability to invoke commands asynchronously via the later() invocation method. The object returned from the later() method is a java.util.concurrent.Future parameterized with the response type of the command. This is an example of how to asynchronously put a file to HDFS.</p>
 <pre><code>future = Hdfs.put(session).file(&quot;README&quot;).to(&quot;/tmp/example/README&quot;).later()
 println future.get().statusCode
 </code></pre><p>The future.get() method will block until the asynchronous command is complete. To illustrate the usefulness of this however multiple concurrent commands are required.</p>
@@ -3508,7 +3496,19 @@ session.shutdown()
   <ul>
     <li><code>Job.submitHive(session).file(remoteHiveFileName).arg(&quot;-v&quot;).statusDir(remoteStatusDir).now()</code></li>
   </ul></li>
-</ul><h5><a id="queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user.">queryQueue() - Return a list of all job IDs registered to the user.</a> <a href="#queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user."><img src="markbook-section-link.png"/></a></h5>
+</ul><h4><a id="submitSqoop+Job+API">submitSqoop Job API</a> <a href="#submitSqoop+Job+API"><img src="markbook-section-link.png"/></a></h4><p>Using the Knox DSL, you can now easily submit and monitor <a href="https://sqoop.apache.org">Apache Sqoop</a> jobs. The WebHCat Job class now supports the <code>submitSqoop</code> command.</p>
+<pre><code>Job.submitSqoop(session)
+    .command(&quot;import --connect jdbc:mysql://hostname:3306/dbname ... &quot;)
+    .statusDir(remoteStatusDir)
+    .now().jobId
+</code></pre><p>The <code>submitSqoop</code> command supports the following arguments:</p>
+<ul>
+  <li>command (String) - The Sqoop command string to execute.</li>
+  <li>files (String) - Comma-separated list of files to be copied to the Templeton controller job.</li>
+  <li>optionsfile (String) - The remote file containing the Sqoop command to run.</li>
+  <li>libdir (String) - The remote directory containing the JDBC JAR to include with the Sqoop lib.</li>
+  <li>statusDir (String) - The remote directory to store status output.</li>
+</ul><p>A complete example is available here: <a href="https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL">https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL</a></p><h5><a id="queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user.">queryQueue() - Return a list of all job IDs registered to the user.</a> <a href="#queryQueue()+-+Return+a+list+of+all+job+IDs+registered+to+the+user."><img src="markbook-section-link.png"/></a></h5>
 <ul>
   <li>Request
   <ul>

Modified: knox/trunk/books/0.10.0/book_client-details.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.10.0/book_client-details.md?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/trunk/books/0.10.0/book_client-details.md (original)
+++ knox/trunk/books/0.10.0/book_client-details.md Fri Feb 24 17:58:56 2017
@@ -176,24 +176,6 @@ This would be the content of that script
 Notice the `Hdfs.rm` command.  This is included simply to ensure that the script can be rerun.
 Without this an error would result the second time it is run.
 
-#### submitSqoop Job API ####
-Using the Knox DSL, you can now easily submit and monitor [Apache Sqoop](https://sqoop.apache.org) jobs. The WebHCat Job class now supports the `submitSqoop` command.
-
-    Job.submitSqoop(session)
-        .command("import --connect jdbc:mysql://hostname:3306/dbname ... ")
-        .statusDir(remoteStatusDir)
-        .now().jobId
-
-The `submitSqoop` command supports the following arguments:
-
-* command (String) - The sqoop command string to execute.
-* files (String) - Comma separated files to be copied to the templeton controller job.
-* optionsfile (String) - The remote file which contain Sqoop command need to run.
-* libdir (String) - The remote directory containing jdbc jar to include with sqoop lib
-* statusDir (String) - The remote directory to store status output.
-
-A complete example is available here: https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL
-
 ### Futures ###
 
 The DSL supports the ability to invoke commands asynchronously via the later() invocation method.

Modified: knox/trunk/books/0.10.0/service_webhcat.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.10.0/service_webhcat.md?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/trunk/books/0.10.0/service_webhcat.md (original)
+++ knox/trunk/books/0.10.0/service_webhcat.md Fri Feb 24 17:58:56 2017
@@ -139,6 +139,24 @@ Each line from the file `samples/Example
 * Example
     * `Job.submitHive(session).file(remoteHiveFileName).arg("-v").statusDir(remoteStatusDir).now()`
 
+#### submitSqoop Job API ####
+Using the Knox DSL, you can now easily submit and monitor [Apache Sqoop](https://sqoop.apache.org) jobs. The WebHCat Job class now supports the `submitSqoop` command.
+
+    Job.submitSqoop(session)
+        .command("import --connect jdbc:mysql://hostname:3306/dbname ... ")
+        .statusDir(remoteStatusDir)
+        .now().jobId
+
+The `submitSqoop` command supports the following arguments:
+
+* command (String) - The Sqoop command string to execute.
+* files (String) - Comma-separated list of files to be copied to the Templeton controller job.
+* optionsfile (String) - The remote file containing the Sqoop command to run.
+* libdir (String) - The remote directory containing the JDBC JAR to include with the Sqoop lib.
+* statusDir (String) - The remote directory to store status output.
+
+A complete example is available here: https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL
+
 ##### queryQueue() - Return a list of all job IDs registered to the user.
 
 * Request

Modified: knox/trunk/books/0.11.0/book_client-details.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.11.0/book_client-details.md?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/trunk/books/0.11.0/book_client-details.md (original)
+++ knox/trunk/books/0.11.0/book_client-details.md Fri Feb 24 17:58:56 2017
@@ -176,24 +176,6 @@ This would be the content of that script
 Notice the `Hdfs.rm` command.  This is included simply to ensure that the script can be rerun.
 Without this an error would result the second time it is run.
 
-#### submitSqoop Job API ####
-Using the Knox DSL, you can now easily submit and monitor [Apache Sqoop](https://sqoop.apache.org) jobs. The WebHCat Job class now supports the `submitSqoop` command.
-
-    Job.submitSqoop(session)
-        .command("import --connect jdbc:mysql://hostname:3306/dbname ... ")
-        .statusDir(remoteStatusDir)
-        .now().jobId
-
-The `submitSqoop` command supports the following arguments:
-
-* command (String) - The sqoop command string to execute.
-* files (String) - Comma separated files to be copied to the templeton controller job.
-* optionsfile (String) - The remote file which contain Sqoop command need to run.
-* libdir (String) - The remote directory containing jdbc jar to include with sqoop lib
-* statusDir (String) - The remote directory to store status output.
-
-A complete example is available here: https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL
-
 ### Futures ###
 
 The DSL supports the ability to invoke commands asynchronously via the later() invocation method.

Modified: knox/trunk/books/0.11.0/service_webhcat.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.11.0/service_webhcat.md?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/trunk/books/0.11.0/service_webhcat.md (original)
+++ knox/trunk/books/0.11.0/service_webhcat.md Fri Feb 24 17:58:56 2017
@@ -139,6 +139,24 @@ Each line from the file `samples/Example
 * Example
     * `Job.submitHive(session).file(remoteHiveFileName).arg("-v").statusDir(remoteStatusDir).now()`
 
+#### submitSqoop Job API ####
+Using the Knox DSL, you can now easily submit and monitor [Apache Sqoop](https://sqoop.apache.org) jobs. The WebHCat Job class now supports the `submitSqoop` command.
+
+    Job.submitSqoop(session)
+        .command("import --connect jdbc:mysql://hostname:3306/dbname ... ")
+        .statusDir(remoteStatusDir)
+        .now().jobId
+
+The `submitSqoop` command supports the following arguments:
+
+* command (String) - The Sqoop command string to execute.
+* files (String) - Comma-separated list of files to be copied to the Templeton controller job.
+* optionsfile (String) - The remote file containing the Sqoop command to run.
+* libdir (String) - The remote directory containing the JDBC JAR to include with the Sqoop lib.
+* statusDir (String) - The remote directory to store status output.
+
+A complete example is available here: https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL
+
 ##### queryQueue() - Return a list of all job IDs registered to the user.
 
 * Request

Modified: knox/trunk/books/0.12.0/book_client-details.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.12.0/book_client-details.md?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/trunk/books/0.12.0/book_client-details.md (original)
+++ knox/trunk/books/0.12.0/book_client-details.md Fri Feb 24 17:58:56 2017
@@ -176,24 +176,6 @@ This would be the content of that script
 Notice the `Hdfs.rm` command.  This is included simply to ensure that the script can be rerun.
 Without this an error would result the second time it is run.
 
-#### submitSqoop Job API ####
-Using the Knox DSL, you can now easily submit and monitor [Apache Sqoop](https://sqoop.apache.org) jobs. The WebHCat Job class now supports the `submitSqoop` command.
-
-    Job.submitSqoop(session)
-        .command("import --connect jdbc:mysql://hostname:3306/dbname ... ")
-        .statusDir(remoteStatusDir)
-        .now().jobId
-
-The `submitSqoop` command supports the following arguments:
-
-* command (String) - The sqoop command string to execute.
-* files (String) - Comma separated files to be copied to the templeton controller job.
-* optionsfile (String) - The remote file which contain Sqoop command need to run.
-* libdir (String) - The remote directory containing jdbc jar to include with sqoop lib
-* statusDir (String) - The remote directory to store status output.
-
-A complete example is available here: https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL
-
 ### Futures ###
 
 The DSL supports the ability to invoke commands asynchronously via the later() invocation method.

Modified: knox/trunk/books/0.12.0/service_webhcat.md
URL: http://svn.apache.org/viewvc/knox/trunk/books/0.12.0/service_webhcat.md?rev=1784309&r1=1784308&r2=1784309&view=diff
==============================================================================
--- knox/trunk/books/0.12.0/service_webhcat.md (original)
+++ knox/trunk/books/0.12.0/service_webhcat.md Fri Feb 24 17:58:56 2017
@@ -139,6 +139,25 @@ Each line from the file `samples/Example
 * Example
     * `Job.submitHive(session).file(remoteHiveFileName).arg("-v").statusDir(remoteStatusDir).now()`
 
+#### submitSqoop Job API ####
+Using the Knox DSL, you can now easily submit and monitor [Apache Sqoop](https://sqoop.apache.org) jobs. The WebHCat Job class now supports the `submitSqoop` command.
+
+    Job.submitSqoop(session)
+        .command("import --connect jdbc:mysql://hostname:3306/dbname ... ")
+        .statusDir(remoteStatusDir)
+        .now().jobId
+
+The `submitSqoop` command supports the following arguments:
+
+* command (String) - The Sqoop command string to execute.
+* files (String) - Comma-separated list of files to be copied to the Templeton controller job.
+* optionsfile (String) - The remote file containing the Sqoop command to run.
+* libdir (String) - The remote directory containing the JDBC JAR to include with the Sqoop lib.
+* statusDir (String) - The remote directory to store status output.
+
+A complete example is available here: https://cwiki.apache.org/confluence/display/KNOX/2016/11/08/Running+SQOOP+job+via+KNOX+Shell+DSL
+
+
 ##### queryQueue() - Return a list of all job IDs registered to the user.
 
 * Request