Posted to commits@hbase.apache.org by mi...@apache.org on 2015/12/16 18:00:14 UTC

[28/30] hbase-site git commit: Updated 0.94 docs to 0f35a32ab123ee299f4aaaea02b4ba2d2b43cff2

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/Operation.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/Operation.html b/0.94/apidocs/org/apache/hadoop/hbase/client/Operation.html
index a037b52..7e0f116 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/Operation.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/Operation.html
@@ -142,7 +142,7 @@ extends <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?
 <tr class="altColor">
 <td class="colFirst"><code>abstract <a href="http://docs.oracle.com/javase/6/docs/api/java/util/Map.html?is-external=true" title="class or interface in java.util">Map</a>&lt;<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>,<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?is-external=true" title="class or interface in java.lang">Object</a>&gt;</code></td>
 <td class="colLast"><code><strong><a href="../../../../../org/apache/hadoop/hbase/client/Operation.html#getFingerprint()">getFingerprint</a></strong>()</code>
-<div class="block">Produces a Map containing a fingerprint which identifies the type and
+<div class="block">Produces a Map containing a fingerprint which identifies the type and 
  the static schema components of a query (i.e.</div>
 </td>
 </tr>
@@ -169,7 +169,7 @@ extends <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?
 <tr class="altColor">
 <td class="colFirst"><code>abstract <a href="http://docs.oracle.com/javase/6/docs/api/java/util/Map.html?is-external=true" title="class or interface in java.util">Map</a>&lt;<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>,<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?is-external=true" title="class or interface in java.lang">Object</a>&gt;</code></td>
 <td class="colLast"><code><strong><a href="../../../../../org/apache/hadoop/hbase/client/Operation.html#toMap(int)">toMap</a></strong>(int&nbsp;maxCols)</code>
-<div class="block">Produces a Map containing a summary of the details of a query
+<div class="block">Produces a Map containing a summary of the details of a query 
  beyond the scope of the fingerprint (i.e.</div>
 </td>
 </tr>
@@ -231,7 +231,7 @@ extends <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?
 <li class="blockList">
 <h4>getFingerprint</h4>
 <pre>public abstract&nbsp;<a href="http://docs.oracle.com/javase/6/docs/api/java/util/Map.html?is-external=true" title="class or interface in java.util">Map</a>&lt;<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>,<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?is-external=true" title="class or interface in java.lang">Object</a>&gt;&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/client/Operation.html#line.41">getFingerprint</a>()</pre>
-<div class="block">Produces a Map containing a fingerprint which identifies the type and
+<div class="block">Produces a Map containing a fingerprint which identifies the type and 
  the static schema components of a query (i.e. column families)</div>
 <dl><dt><span class="strong">Returns:</span></dt><dd>a map containing fingerprint information (i.e. column families)</dd></dl>
 </li>
@@ -243,7 +243,7 @@ extends <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?
 <li class="blockList">
 <h4>toMap</h4>
 <pre>public abstract&nbsp;<a href="http://docs.oracle.com/javase/6/docs/api/java/util/Map.html?is-external=true" title="class or interface in java.util">Map</a>&lt;<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>,<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?is-external=true" title="class or interface in java.lang">Object</a>&gt;&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/client/Operation.html#line.49">toMap</a>(int&nbsp;maxCols)</pre>
-<div class="block">Produces a Map containing a summary of the details of a query
+<div class="block">Produces a Map containing a summary of the details of a query 
  beyond the scope of the fingerprint (i.e. columns, rows...)</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>maxCols</code> - a limit on the number of columns output prior to truncation</dd>
 <dt><span class="strong">Returns:</span></dt><dd>a map containing parameters of a query (i.e. rows, columns...)</dd></dl>
@@ -299,7 +299,7 @@ extends <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?
 <h4>toString</h4>
 <pre>public&nbsp;<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/client/Operation.html#line.87">toString</a>(int&nbsp;maxCols)</pre>
 <div class="block">Produces a string representation of this Operation. It defaults to a JSON
- representation, but falls back to a string representation of the
+ representation, but falls back to a string representation of the 
  fingerprint and details in the case of a JSON encoding failure.</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>maxCols</code> - a limit on the number of columns output in the summary
  prior to truncation</dd>
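The Operation javadoc above distinguishes three views of a query: getFingerprint() reports only the static schema components (column families), toMap(maxCols) adds per-query detail (rows, columns) truncated after maxCols, and toString(maxCols) renders the result. As a minimal plain-Java sketch of that contract (hypothetical class and key names, not HBase code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative model of the Operation contract described in the javadoc above.
public class OperationSketch {
    // Fingerprint: only the static schema components of the query (column families).
    static Map<String, Object> getFingerprint(List<String> families) {
        Map<String, Object> fp = new HashMap<>();
        fp.put("families", new ArrayList<>(families));
        return fp;
    }

    // Details beyond the fingerprint (columns, ...), truncated after maxCols.
    static Map<String, Object> toMap(List<String> families, List<String> columns, int maxCols) {
        Map<String, Object> map = getFingerprint(families);
        List<String> shown = columns.subList(0, Math.min(maxCols, columns.size()));
        map.put("columns", new ArrayList<>(shown));
        if (columns.size() > maxCols) {
            map.put("totalColumns", columns.size()); // record that output was truncated
        }
        return map;
    }

    public static void main(String[] args) {
        // Three columns, but only two are emitted because maxCols == 2.
        System.out.println(toMap(Arrays.asList("cf"), Arrays.asList("a", "b", "c"), 2));
    }
}
```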

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/Scan.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/Scan.html b/0.94/apidocs/org/apache/hadoop/hbase/client/Scan.html
index 0d9def7..f988412 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/Scan.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/Scan.html
@@ -688,7 +688,7 @@ implements org.apache.hadoop.io.Writable</pre>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>minStamp</code> - minimum timestamp value, inclusive</dd><dd><code>maxStamp</code> - maximum timestamp value, exclusive</dd>
 <dt><span class="strong">Returns:</span></dt><dd>this</dd>
 <dt><span class="strong">Throws:</span></dt>
-<dd><code><a href="http://docs.oracle.com/javase/6/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></code> - if invalid time range</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/client/Scan.html#setMaxVersions()"><code>setMaxVersions()</code></a>,
+<dd><code><a href="http://docs.oracle.com/javase/6/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></code> - if invalid time range</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/client/Scan.html#setMaxVersions()"><code>setMaxVersions()</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/client/Scan.html#setMaxVersions(int)"><code>setMaxVersions(int)</code></a></dd></dl>
 </li>
 </ul>
@@ -704,7 +704,7 @@ implements org.apache.hadoop.io.Writable</pre>
 and you want all versions returned, up the number of versions beyond the
 default.</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>timestamp</code> - version timestamp</dd>
-<dt><span class="strong">Returns:</span></dt><dd>this</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/client/Scan.html#setMaxVersions()"><code>setMaxVersions()</code></a>,
+<dt><span class="strong">Returns:</span></dt><dd>this</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/client/Scan.html#setMaxVersions()"><code>setMaxVersions()</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/client/Scan.html#setMaxVersions(int)"><code>setMaxVersions(int)</code></a></dd></dl>
 </li>
 </ul>
@@ -997,17 +997,17 @@ implements org.apache.hadoop.io.Writable</pre>
 <div class="block">Set whether this scan is a small scan
  <p>
  Small scan should use pread and big scan can use seek + read
-
+ 
 seek + read is fast but can cause two problems: (1) resource contention and (2)
 too much network io
-
+ 
  [89-fb] Using pread for non-compaction read request
  https://issues.apache.org/jira/browse/HBASE-7266
-
+ 
 On the other hand, if it is set to true, we would do
 openScanner, next, and closeScanner in one RPC call, meaning better
 performance for small scans. [HBASE-9488].
-
+ 
  Generally, if the scan range is within one data block(64KB), it could be
  considered as a small scan.</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>small</code> - </dd></dl>
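Two pieces of the Scan javadoc above lend themselves to a small model: setTimeRange(minStamp, maxStamp) defines a half-open interval (minStamp inclusive, maxStamp exclusive), and setSmall's guideline is that a scan confined to one data block (64KB) can be considered small. A plain-Java sketch of those semantics (not HBase code; the byte-size check is only the documented heuristic):

```java
// Illustrative model of the Scan semantics documented above.
public class ScanSemantics {
    // setTimeRange(minStamp, maxStamp): minStamp is inclusive, maxStamp exclusive.
    static boolean inTimeRange(long ts, long minStamp, long maxStamp) {
        return ts >= minStamp && ts < maxStamp;
    }

    // Guideline from the setSmall javadoc: a scan whose range fits in one
    // data block (64KB by default) can be considered a small scan, which
    // uses pread and does openScanner/next/closeScanner in one RPC call.
    static final long DATA_BLOCK_SIZE = 64 * 1024;

    static boolean couldBeSmallScan(long estimatedScanRangeBytes) {
        return estimatedScanRangeBytes <= DATA_BLOCK_SIZE;
    }

    public static void main(String[] args) {
        System.out.println(inTimeRange(5, 5, 10));  // minStamp is inclusive
        System.out.println(inTimeRange(10, 5, 10)); // maxStamp is exclusive
    }
}
```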

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html b/0.94/apidocs/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html
index 074044d..bdf97bd 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html
@@ -168,7 +168,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" t
 <tr class="rowColor">
 <td class="colFirst"><code><a href="../../../../../org/apache/hadoop/hbase/HColumnDescriptor.html" title="class in org.apache.hadoop.hbase">HColumnDescriptor</a></code></td>
 <td class="colLast"><code><strong><a href="../../../../../org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html#removeFamily(byte[])">removeFamily</a></strong>(byte[]&nbsp;column)</code>
-<div class="block">Removes the HColumnDescriptor with name specified by the parameter column
+<div class="block">Removes the HColumnDescriptor with name specified by the parameter column 
  from the table descriptor</div>
 </td>
 </tr>
@@ -182,7 +182,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" t
 <tr class="rowColor">
 <td class="colFirst"><code>void</code></td>
 <td class="colLast"><code><strong><a href="../../../../../org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html#setMemStoreFlushSize(long)">setMemStoreFlushSize</a></strong>(long&nbsp;memstoreFlushSize)</code>
-<div class="block">Represents the maximum size of the memstore after which the contents of the
+<div class="block">Represents the maximum size of the memstore after which the contents of the 
  memstore are flushed to the filesystem.</div>
 </td>
 </tr>
@@ -276,7 +276,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" t
 <h4>removeFamily</h4>
 <pre>public&nbsp;<a href="../../../../../org/apache/hadoop/hbase/HColumnDescriptor.html" title="class in org.apache.hadoop.hbase">HColumnDescriptor</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html#line.73">removeFamily</a>(byte[]&nbsp;column)</pre>
 <div class="block"><strong>Description copied from class:&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html#removeFamily(byte[])">HTableDescriptor</a></code></strong></div>
-<div class="block">Removes the HColumnDescriptor with name specified by the parameter column
+<div class="block">Removes the HColumnDescriptor with name specified by the parameter column 
  from the table descriptor</div>
 <dl>
 <dt><strong>Overrides:</strong></dt>
@@ -295,7 +295,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" t
 <pre>public&nbsp;void&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html#line.81">setReadOnly</a>(boolean&nbsp;readOnly)</pre>
 <div class="block"><strong>Description copied from class:&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html#setReadOnly(boolean)">HTableDescriptor</a></code></strong></div>
 <div class="block">Setting the table as read only sets all the columns in the table as read
- only. By default all tables are modifiable, but if the readOnly flag is
+ only. By default all tables are modifiable, but if the readOnly flag is 
  set to true then the contents of the table can only be read but not modified.</div>
 <dl>
 <dt><strong>Overrides:</strong></dt>
@@ -345,14 +345,14 @@ extends <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" t
 <pre>public&nbsp;void&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html#line.105">setMaxFileSize</a>(long&nbsp;maxFileSize)</pre>
 <div class="block"><strong>Description copied from class:&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html#setMaxFileSize(long)">HTableDescriptor</a></code></strong></div>
 <div class="block">Sets the maximum size up to which a region can grow, after which a region
- split is triggered. The region size is represented by the size of the biggest
- store file in that region, i.e. If the biggest store file grows beyond the
- maxFileSize, then the region split is triggered. This defaults to a value of
+ split is triggered. The region size is represented by the size of the biggest 
+ store file in that region, i.e. If the biggest store file grows beyond the 
+ maxFileSize, then the region split is triggered. This defaults to a value of 
  256 MB.
  <p>
- This is not an absolute value and might vary. Assume that a single row exceeds
+ This is not an absolute value and might vary. Assume that a single row exceeds 
  the maxFileSize then the storeFileSize will be greater than maxFileSize since
- a single row cannot be split across multiple regions
+ a single row cannot be split across multiple regions 
  </p></div>
 <dl>
 <dt><strong>Overrides:</strong></dt>
@@ -369,7 +369,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" t
 <h4>setMemStoreFlushSize</h4>
 <pre>public&nbsp;void&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/client/UnmodifyableHTableDescriptor.html#line.113">setMemStoreFlushSize</a>(long&nbsp;memstoreFlushSize)</pre>
 <div class="block"><strong>Description copied from class:&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html#setMemStoreFlushSize(long)">HTableDescriptor</a></code></strong></div>
-<div class="block">Represents the maximum size of the memstore after which the contents of the
+<div class="block">Represents the maximum size of the memstore after which the contents of the 
  memstore are flushed to the filesystem. This defaults to a size of 64 MB.</div>
 <dl>
 <dt><strong>Overrides:</strong></dt>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html b/0.94/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
index dc7e51c..3908e91 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
@@ -340,7 +340,7 @@ service.</div>
                             byte[]&nbsp;family)</code>
 <div class="block"><strong>Deprecated.</strong>&nbsp;
 <div class="block"><i>As of version 0.92 this method is deprecated without
- replacement.
+ replacement.   
  getRowOrBefore is used internally to find entries in .META. and makes
  various assumptions about the table (which are true for .META. but not
  in general) to be efficient.</i></div>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.Call.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.Call.html b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.Call.html
index f83ed50..b0d8c10 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.Call.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.Call.html
@@ -104,8 +104,8 @@
  <a href="../../../../../../org/apache/hadoop/hbase/ipc/CoprocessorProtocol.html" title="interface in org.apache.hadoop.hbase.ipc"><code>CoprocessorProtocol</code></a>
  sub-type instance.
  </p></div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/client/coprocessor/package-summary.html"><code>org.apache.hadoop.hbase.client.coprocessor</code></a>,
-<a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call)</code></a>,
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/client/coprocessor/package-summary.html"><code>org.apache.hadoop.hbase.client.coprocessor</code></a>, 
+<a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call)</code></a>, 
 <a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call,%20org.apache.hadoop.hbase.client.coprocessor.Batch.Callback)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call, org.apache.hadoop.hbase.client.coprocessor.Batch.Callback)</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.html b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.html
index 4544bae..832f121 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Batch.html
@@ -244,7 +244,7 @@ extends <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?
  and return the results</dd>
 <dt><span class="strong">Throws:</span></dt>
 <dd><code><a href="http://docs.oracle.com/javase/6/docs/api/java/lang/NoSuchMethodException.html?is-external=true" title="class or interface in java.lang">NoSuchMethodException</a></code> - if the method named, with the given argument
-     types, cannot be found in the protocol class</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/client/coprocessor/Batch.html#forMethod(java.lang.reflect.Method,%20java.lang.Object...)"><code>forMethod(java.lang.reflect.Method, Object...)</code></a>,
+     types, cannot be found in the protocol class</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/client/coprocessor/Batch.html#forMethod(java.lang.reflect.Method,%20java.lang.Object...)"><code>forMethod(java.lang.reflect.Method, Object...)</code></a>, 
 <a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call,%20org.apache.hadoop.hbase.client.coprocessor.Batch.Callback)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call, org.apache.hadoop.hbase.client.coprocessor.Batch.Callback)</code></a></dd></dl>
 </li>
 </ul>
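The Batch.Call pattern referenced above is a callback: HTable.coprocessorExec invokes one Call against a CoprocessorProtocol instance per region and collects the results. A minimal plain-Java sketch of that shape (hypothetical names; the real execution happens over RPC against region servers):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal model of the Batch.Call pattern: one callback invoked against a
// per-region protocol instance, with results collected per region.
public class BatchSketch {
    interface Call<T, R> {
        R call(T instance);
    }

    // Run the callback against each region's protocol instance; results are
    // keyed by region name, mirroring the per-region result map returned by
    // coprocessorExec.
    static <T, R> Map<String, R> execPerRegion(Map<String, T> regionInstances, Call<T, R> callable) {
        Map<String, R> results = new LinkedHashMap<>();
        for (Map.Entry<String, T> e : regionInstances.entrySet()) {
            results.put(e.getKey(), callable.call(e.getValue()));
        }
        return results;
    }

    public static void main(String[] args) {
        Map<String, Integer> regions = new LinkedHashMap<>();
        regions.put("region1", 10);
        regions.put("region2", 32);
        // The "protocol instance" here is just an Integer; the call doubles it.
        System.out.println(execPerRegion(regions, n -> n * 2));
    }
}
```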

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/BigDecimalColumnInterpreter.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/BigDecimalColumnInterpreter.html b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/BigDecimalColumnInterpreter.html
index ced1c2c..8421125 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/BigDecimalColumnInterpreter.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/BigDecimalColumnInterpreter.html
@@ -102,7 +102,7 @@
 <pre>public class <a href="../../../../../../src-html/org/apache/hadoop/hbase/client/coprocessor/BigDecimalColumnInterpreter.html#line.37">BigDecimalColumnInterpreter</a>
 extends <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Object.html?is-external=true" title="class or interface in java.lang">Object</a>
 implements <a href="../../../../../../org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.html" title="interface in org.apache.hadoop.hbase.coprocessor">ColumnInterpreter</a>&lt;<a href="http://docs.oracle.com/javase/6/docs/api/java/math/BigDecimal.html?is-external=true" title="class or interface in java.math">BigDecimal</a>,<a href="http://docs.oracle.com/javase/6/docs/api/java/math/BigDecimal.html?is-external=true" title="class or interface in java.math">BigDecimal</a>&gt;</pre>
-<div class="block">ColumnInterpreter for doing Aggregations with BigDecimal columns.
+<div class="block">ColumnInterpreter for doing Aggregations with BigDecimal columns. 
  This class is required at the RegionServer also.</div>
 </li>
 </ul>
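A column interpreter's job, as used by the aggregation machinery, is to decode cell bytes into the value type and combine values. A rough sketch for the BigDecimal case (hypothetical encoding via string bytes; the real class uses HBase's own serialization and must also be deployed on the RegionServer, as noted above):

```java
import java.math.BigDecimal;
import java.nio.charset.StandardCharsets;

// Sketch of what a ColumnInterpreter does for aggregation over BigDecimal
// columns: decode a cell's bytes into a BigDecimal, and add two values.
public class BigDecimalInterpreterSketch {
    // Hypothetical decoding: the cell value is a decimal string in UTF-8.
    static BigDecimal getValue(byte[] cellValue) {
        return new BigDecimal(new String(cellValue, StandardCharsets.UTF_8));
    }

    // Null-tolerant addition, so missing cells do not break the aggregate.
    static BigDecimal add(BigDecimal a, BigDecimal b) {
        if (a == null) return b;
        if (b == null) return a;
        return a.add(b);
    }

    public static void main(String[] args) {
        BigDecimal sum = add(getValue("1.5".getBytes(StandardCharsets.UTF_8)),
                             getValue("2.25".getBytes(StandardCharsets.UTF_8)));
        System.out.println(sum); // 3.75
    }
}
```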

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Exec.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Exec.html b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Exec.html
index eb3cae4..e4a7c80 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Exec.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/Exec.html
@@ -124,8 +124,8 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/client/Row.html" t
  to wrap the <code>CoprocessorProtocol</code> method invocations requested in
  RPC calls.  It should not be used directly by HBase clients.
  </p></div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/client/coprocessor/ExecResult.html" title="class in org.apache.hadoop.hbase.client.coprocessor"><code>ExecResult</code></a>,
-<a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call)</code></a>,
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/client/coprocessor/ExecResult.html" title="class in org.apache.hadoop.hbase.client.coprocessor"><code>ExecResult</code></a>, 
+<a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call)</code></a>, 
 <a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call,%20org.apache.hadoop.hbase.client.coprocessor.Batch.Callback)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call, org.apache.hadoop.hbase.client.coprocessor.Batch.Callback)</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/ExecResult.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/ExecResult.html b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/ExecResult.html
index 36d6c26..79bb13d 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/ExecResult.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/client/coprocessor/ExecResult.html
@@ -113,8 +113,8 @@ implements org.apache.hadoop.io.Writable</pre>
  responses from <a href="../../../../../../org/apache/hadoop/hbase/ipc/CoprocessorProtocol.html" title="interface in org.apache.hadoop.hbase.ipc"><code>CoprocessorProtocol</code></a>
  method invocations.  It should not be used directly by clients.
  </p></div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/client/coprocessor/Exec.html" title="class in org.apache.hadoop.hbase.client.coprocessor"><code>Exec</code></a>,
-<a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call)</code></a>,
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/client/coprocessor/Exec.html" title="class in org.apache.hadoop.hbase.client.coprocessor"><code>Exec</code></a>, 
+<a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call)</code></a>, 
 <a href="../../../../../../org/apache/hadoop/hbase/client/HTable.html#coprocessorExec(java.lang.Class,%20byte[],%20byte[],%20org.apache.hadoop.hbase.client.coprocessor.Batch.Call,%20org.apache.hadoop.hbase.client.coprocessor.Batch.Callback)"><code>HTable.coprocessorExec(Class, byte[], byte[], org.apache.hadoop.hbase.client.coprocessor.Batch.Call, org.apache.hadoop.hbase.client.coprocessor.Batch.Callback)</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/constraint/Constraint.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/constraint/Constraint.html b/0.94/apidocs/org/apache/hadoop/hbase/constraint/Constraint.html
index f4daeeb..c6f67cc 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/constraint/Constraint.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/constraint/Constraint.html
@@ -132,7 +132,7 @@ extends org.apache.hadoop.conf.Configurable</pre>
  but it is possible.
  <p>
  NOTE: Implementing classes must have a nullary (no-args) constructor</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/constraint/BaseConstraint.html" title="class in org.apache.hadoop.hbase.constraint"><code>BaseConstraint</code></a>,
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/constraint/BaseConstraint.html" title="class in org.apache.hadoop.hbase.constraint"><code>BaseConstraint</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html" title="class in org.apache.hadoop.hbase.constraint"><code>Constraints</code></a></dd></dl>
 </li>
 </ul>
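The Constraint javadoc above requires a nullary (no-args) constructor and describes checking each Put against a rule. A minimal sketch of that contract (hypothetical types standing in for the org.apache.hadoop.hbase.constraint classes; the range rule is the 1-to-10 example from the package docs):

```java
// Sketch of the Constraint contract: a nullary-constructor class that
// accepts or rejects each incoming value, mirroring how a Constraint
// checks each Put against a business rule.
public class ConstraintSketch {
    static class ConstraintException extends RuntimeException {
        ConstraintException(String msg) { super(msg); }
    }

    interface Constraint {
        void check(long value) throws ConstraintException;
    }

    // Example rule from the package docs: a value must lie between 1 and 10;
    // otherwise the write is rejected and data integrity is maintained.
    static class IntegerRangeConstraint implements Constraint {
        // Nullary constructor, as the javadoc requires of implementing classes.
        public IntegerRangeConstraint() {}

        @Override
        public void check(long value) {
            if (value < 1 || value > 10) {
                throw new ConstraintException("value " + value + " outside [1, 10]");
            }
        }
    }

    public static void main(String[] args) {
        new IntegerRangeConstraint().check(5); // in range: accepted silently
    }
}
```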

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/constraint/package-summary.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/constraint/package-summary.html b/0.94/apidocs/org/apache/hadoop/hbase/constraint/package-summary.html
index ae80708..9f1a3b1 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/constraint/package-summary.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/constraint/package-summary.html
@@ -139,7 +139,7 @@
 </a>
 <h2 title="Package org.apache.hadoop.hbase.constraint Description">Package org.apache.hadoop.hbase.constraint Description</h2>
 <div class="block">Restrict the domain of a data attribute, often to fulfill business rules/requirements.
-
+ 
  <p>
  <h2> Table of Contents</h2>
  <ul>
@@ -151,94 +151,94 @@
  </p>
 
  <h2><a name="overview">Overview</a></h2>
- Constraints are used to enforce business rules in a database.
- By checking all <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Puts</code></a> on a given table, you can enforce very specific data policies.
- For instance, you can ensure that a certain column family-column qualifier pair always has a value between 1 and 10.
+ Constraints are used to enforce business rules in a database. 
+ By checking all <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Puts</code></a> on a given table, you can enforce very specific data policies. 
+ For instance, you can ensure that a certain column family-column qualifier pair always has a value between 1 and 10. 
  Otherwise, the <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Put</code></a> is rejected and the data integrity is maintained.
  <p>
- Constraints are designed to be configurable, so a constraint can be used across different tables, but implement different
+ Constraints are designed to be configurable, so a constraint can be used across different tables, but implement different 
  behavior depending on the specific configuration given to that constraint.
  <p>
- By adding a constraint to a table (see <a href="#usage">Example Usage</a>), constraints will automatically enabled.
- You also then have the option of to disable (just 'turn off') or remove (delete all associated information) all constraints on a table.
- If you remove all constraints
- (see <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#remove(org.apache.hadoop.hbase.HTableDescriptor)"><code>Constraints.remove(org.apache.hadoop.hbase.HTableDescriptor)</code></a>,
- you must re-add any <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraint.html" title="interface in org.apache.hadoop.hbase.constraint"><code>Constraint</code></a> you want on that table.
- However, if they are just disabled (see <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#disable(org.apache.hadoop.hbase.HTableDescriptor)"><code>Constraints.disable(org.apache.hadoop.hbase.HTableDescriptor)</code></a>,
+ By adding a constraint to a table (see <a href="#usage">Example Usage</a>), constraints will automatically enabled. 
+ You also then have the option of to disable (just 'turn off') or remove (delete all associated information) all constraints on a table. 
+ If you remove all constraints 
+ (see <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#remove(org.apache.hadoop.hbase.HTableDescriptor)"><code>Constraints.remove(org.apache.hadoop.hbase.HTableDescriptor)</code></a>, 
+ you must re-add any <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraint.html" title="interface in org.apache.hadoop.hbase.constraint"><code>Constraint</code></a> you want on that table. 
+ However, if they are just disabled (see <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#disable(org.apache.hadoop.hbase.HTableDescriptor)"><code>Constraints.disable(org.apache.hadoop.hbase.HTableDescriptor)</code></a>), 
  all you need to do is enable constraints again, and everything will be turned back on as it was configured.
  Individual constraints can also be individually enabled, disabled or removed without affecting other constraints.
  <p>
- By default, constraints are disabled on a table.
+ By default, constraints are disabled on a table. 
  This means you will not see <i>any</i> slow down on a table if constraints are not enabled.
  <p>
 
  <h2><a name="concurrency">Concurrency and Atomicity</a></h2>
- Currently, no attempts at enforcing correctness in a multi-threaded scenario when modifying a constraint, via
- <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html" title="class in org.apache.hadoop.hbase.constraint"><code>Constraints</code></a>, to the the <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a>.
- This is particularly important when adding a constraint(s) to the <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a>
+ Currently, no attempt is made at enforcing correctness in a multi-threaded scenario when modifying a constraint, via 
+ <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html" title="class in org.apache.hadoop.hbase.constraint"><code>Constraints</code></a>, to the <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a>. 
+ This is particularly important when adding a constraint(s) to the <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a> 
  as it first retrieves the next priority from a custom value set in the descriptor,
- adds each constraint (with increasing priority) to the descriptor, and then the next available priority is re-stored
- back in the <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a>.
+ adds each constraint (with increasing priority) to the descriptor, and then the next available priority is re-stored 
+ back in the <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a>. 
  <p>
- Locking is recommended around each of Constraints add methods:
- <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#add(org.apache.hadoop.hbase.HTableDescriptor,%20java.lang.Class...)"><code>Constraints.add(org.apache.hadoop.hbase.HTableDescriptor, Class...)</code></a>,
- <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#add(org.apache.hadoop.hbase.HTableDescriptor,%20org.apache.hadoop.hbase.util.Pair...)"><code>Constraints.add(org.apache.hadoop.hbase.HTableDescriptor, org.apache.hadoop.hbase.util.Pair...)</code></a>,
+ Locking is recommended around each of the Constraints add methods: 
+ <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#add(org.apache.hadoop.hbase.HTableDescriptor,%20java.lang.Class...)"><code>Constraints.add(org.apache.hadoop.hbase.HTableDescriptor, Class...)</code></a>, 
+ <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#add(org.apache.hadoop.hbase.HTableDescriptor,%20org.apache.hadoop.hbase.util.Pair...)"><code>Constraints.add(org.apache.hadoop.hbase.HTableDescriptor, org.apache.hadoop.hbase.util.Pair...)</code></a>, 
  and <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html#add(org.apache.hadoop.hbase.HTableDescriptor,%20java.lang.Class,%20org.apache.hadoop.conf.Configuration)"><code>Constraints.add(org.apache.hadoop.hbase.HTableDescriptor, Class, org.apache.hadoop.conf.Configuration)</code></a>.
  Any changes on <i>a single HTableDescriptor</i> should be serialized, either within a single thread or via external mechanisms.
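 <p>
 For example, one simple way to serialize such modifications is a synchronized block (the <code>tableLock</code> object here is illustrative, not part of the HBase API):
 <div style="background-color: #cccccc">
 <blockquote><pre>
 // all threads that modify this table's descriptor share tableLock
 synchronized (tableLock) {
   Constraints.add(desc, IntegerConstraint.class);
 }
 </pre></blockquote></div>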
  <p>
- Note that having a higher priority means that a constraint will run later; e.g. a constraint with priority 1 will run before a
- constraint with priority 2.
+ Note that having a higher priority means that a constraint will run later; e.g. a constraint with priority 1 will run before a 
+ constraint with priority 2. 
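 <p>
 For example (an illustrative sketch using the varargs add method above):
 <div style="background-color: #cccccc">
 <blockquote><pre>
 // IntegerConstraint is added first, so it receives the lower
 // priority number and runs before MyConstraint
 Constraints.add(desc, IntegerConstraint.class, MyConstraint.class);
 </pre></blockquote></div>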
  <p>
- Since Constraints currently are designed to just implement simple checks (e.g. is the value in the right range), there will
- be no atomicity conflicts.
- Even if one of the puts finishes the constraint first, the single row will not be corrupted and the 'fastest' write will win;
+ Since Constraints currently are designed to just implement simple checks (e.g. is the value in the right range), there will 
+ be no atomicity conflicts. 
+ Even if one of the puts finishes the constraint first, the single row will not be corrupted and the 'fastest' write will win; 
  the underlying region takes care of breaking the tie and ensuring that writes get serialized to the table.
- So yes, this doesn't ensure that we are going to get specific ordering or even a fully consistent view of the underlying data.
+ Note that this does not ensure a specific ordering or even a fully consistent view of the underlying data. 
  <p>
  Each constraint should only use local/instance variables, unless doing more advanced usage. Static variables could cause difficulties
  when checking concurrent writes to the same region, leading to either highly locked situations (decreasing through-put) or higher probability of errors.
  However, as long as each constraint just uses local variables, each thread interacting with the constraint will execute correctly and efficiently.
 
  <h2><a name="caveats">Caveats</a></h2>
- In traditional (SQL) databases, Constraints are often used to enforce <a href="http://en.wikipedia.org/wiki/Relational_database#Constraints">referential integrity</a>.
- However, in HBase, this will likely cause significant overhead and dramatically decrease the number of
- <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Puts</code></a>/second possible on a table. This is because to check the referential integrity
+ In traditional (SQL) databases, Constraints are often used to enforce <a href="http://en.wikipedia.org/wiki/Relational_database#Constraints">referential integrity</a>. 
+ However, in HBase, this will likely cause significant overhead and dramatically decrease the number of 
+ <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Puts</code></a>/second possible on a table. This is because to check the referential integrity 
  when making a <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Put</code></a>, one must block on a scan for the 'remote' table, checking for the valid reference.
- For millions of <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Puts</code></a> a second, this will breakdown very quickly.
+ For millions of <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Puts</code></a> a second, this will break down very quickly. 
  There are several options around the blocking behavior including, but not limited to:
  <ul>
- <li>Create a 'pre-join' table where the keys are already denormalized</li>
+ <li>Create a 'pre-join' table where the keys are already denormalized</li>  
  <li>Designing for 'incorrect' references</li>
  <li>Using an external enforcement mechanism</li>
  </ul>
  There are also several general considerations that must be taken into account, when using Constraints:
  <ol>
- <li>All changes made via <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html" title="class in org.apache.hadoop.hbase.constraint"><code>Constraints</code></a> will make modifications to the
- <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a> for a given table. As such, the usual renabling of tables should be used for
+ <li>All changes made via <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraints.html" title="class in org.apache.hadoop.hbase.constraint"><code>Constraints</code></a> will make modifications to the 
+ <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a> for a given table. As such, the usual re-enabling of tables should be used for 
  propagating changes to the table. When at all possible, Constraints should be added to the table before the table is created.</li>
- <li>Constraints are run in the order that they are added to a table. This has implications for what order constraints should
+ <li>Constraints are run in the order that they are added to a table. This has implications for what order constraints should 
  be added to a table.</li>
- <li>Whenever new Constraint jars are added to a region server, those region servers need to go through a rolling restart to
+ <li>Whenever new Constraint jars are added to a region server, those region servers need to go through a rolling restart to 
  make sure that they pick up the new jars and can enable the new constraints.</li>
  <li>There are certain keys that are reserved for the Configuration namespace:
  <ul>
  <li>_ENABLED - used server-side to determine if a constraint should be run</li>
  <li>_PRIORITY - used server-side to determine what order a constraint should be run</li>
  </ul>
- If these items are set, they will be respected in the constraint configuration, but they are taken care of by default in when
+ If these items are set, they will be respected in the constraint configuration, but they are taken care of by default when 
  adding constraints to an <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a> via the usual method.</li>
  </ol>
- <p>
- Under the hood, constraints are implemented as a Coprocessor (see <a href="../../../../../org/apache/hadoop/hbase/constraint/ConstraintProcessor.html" title="class in org.apache.hadoop.hbase.constraint"><code>ConstraintProcessor</code></a>
+ <p> 
+ Under the hood, constraints are implemented as a Coprocessor (see <a href="../../../../../org/apache/hadoop/hbase/constraint/ConstraintProcessor.html" title="class in org.apache.hadoop.hbase.constraint"><code>ConstraintProcessor</code></a> 
  if you are interested).
 
 
  <h2><a name="usage">Example usage</a></h2>
- First, you must define a <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraint.html" title="interface in org.apache.hadoop.hbase.constraint"><code>Constraint</code></a>.
+ First, you must define a <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraint.html" title="interface in org.apache.hadoop.hbase.constraint"><code>Constraint</code></a>. 
  The best way to do this is to extend <a href="../../../../../org/apache/hadoop/hbase/constraint/BaseConstraint.html" title="class in org.apache.hadoop.hbase.constraint"><code>BaseConstraint</code></a>, which takes care of some of the more
  mundane details of using a <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraint.html" title="interface in org.apache.hadoop.hbase.constraint"><code>Constraint</code></a>.
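 <p>
 As a minimal sketch (the class name and check logic are illustrative, not part of HBase), a constraint extending <code>BaseConstraint</code> could look like:
 <div style="background-color: #cccccc">
 <blockquote><pre>
 public class NonEmptyConstraint extends BaseConstraint {
   public void check(Put p) throws ConstraintException {
     // reject Puts that carry no KeyValues
     if (p.isEmpty()) {
       throw new ConstraintException("Put " + p + " is empty");
     }
   }
 }
 </pre></blockquote></div>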
  <p>
- Let's look at one possible implementation of a constraint - an IntegerConstraint(there are also several simple examples in the tests).
+ Let's look at one possible implementation of a constraint - an IntegerConstraint (there are also several simple examples in the tests). 
  The IntegerConstraint checks to make sure that the value is a String-encoded <code>int</code>.
 It is really simple to implement this kind of constraint; the only method that needs to be implemented is
  <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraint.html#check(org.apache.hadoop.hbase.client.Put)"><code>Constraint.check(org.apache.hadoop.hbase.client.Put)</code></a>:
@@ -262,18 +262,18 @@
  } catch (NumberFormatException e) {
  throw new ConstraintException("Value in Put (" + p
  + ") was not a String-encoded integer", e);
- } } }
+ } } } 
  </pre></blockquote>
  </div>
  <p>
- Note that all exceptions that you expect to be thrown must be caught and then rethrown as a
- <a href="../../../../../org/apache/hadoop/hbase/constraint/ConstraintException.html" title="class in org.apache.hadoop.hbase.constraint"><code>ConstraintException</code></a>. This way, you can be sure that a
- <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Put</code></a> fails for an expected reason, rather than for any reason.
- For example, an <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/OutOfMemoryError.html?is-external=true" title="class or interface in java.lang"><code>OutOfMemoryError</code></a> is probably indicative of an inherent problem in
+ Note that all exceptions that you expect to be thrown must be caught and then rethrown as a 
+ <a href="../../../../../org/apache/hadoop/hbase/constraint/ConstraintException.html" title="class in org.apache.hadoop.hbase.constraint"><code>ConstraintException</code></a>. This way, you can be sure that a 
+ <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Put</code></a> fails for an expected reason, rather than for any reason. 
+ For example, an <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/OutOfMemoryError.html?is-external=true" title="class or interface in java.lang"><code>OutOfMemoryError</code></a> is probably indicative of an inherent problem in 
  the <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraint.html" title="interface in org.apache.hadoop.hbase.constraint"><code>Constraint</code></a>, rather than a failed <a href="../../../../../org/apache/hadoop/hbase/client/Put.html" title="class in org.apache.hadoop.hbase.client"><code>Put</code></a>.
  <p>
  If an unexpected exception is thrown (for example, any kind of uncaught <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/RuntimeException.html?is-external=true" title="class or interface in java.lang"><code>RuntimeException</code></a>),
- constraint-checking will be 'unloaded' from the regionserver where that error occurred.
+ constraint-checking will be 'unloaded' from the regionserver where that error occurred. 
  This means no further <a href="../../../../../org/apache/hadoop/hbase/constraint/Constraint.html" title="interface in org.apache.hadoop.hbase.constraint"><code>Constraints</code></a> will be checked on that server
  until it is reloaded. This is done to ensure the system remains as available as possible.
  Therefore, be careful when writing your own Constraint.
@@ -287,14 +287,14 @@
  Constraints.add(desc, IntegerConstraint.class);
  </pre></blockquote></div>
  <p>
- Once we added the IntegerConstraint, constraints will be enabled on the table (once it is created) and
+ Once we have added the IntegerConstraint, constraints will be enabled on the table (once it is created) and 
 we will always check to make sure that the value is a String-encoded integer.
- <p>
+ <p> 
  However, suppose we also write our own constraint, <code>MyConstraint.java</code>.
- First, you need to make sure this class-files are in the classpath (in a jar) on the regionserver where
+ First, you need to make sure these class files are on the classpath (in a jar) on the regionserver where 
 that constraint will be run (this could require a rolling restart on the region server - see <a href="#caveats">Caveats</a> above).
  <p>
- Suppose that MyConstraint also uses a Configuration (see <code>Configurable.getConf()</code>).
+ Suppose that MyConstraint also uses a Configuration (see <code>Configurable.getConf()</code>). 
  Then adding MyConstraint looks like this:
 
  <div style="background-color: #cccccc; padding: 2px">
@@ -312,7 +312,7 @@
  <i>will be run first</i>, followed by MyConstraint.
  <p>
 Suppose we realize that the <code>Configuration</code> for MyConstraint was actually wrong
- when it was added to the table. Note, when it is added to the table, it is <i>not</i> added by reference,
+ when it was added to the table. Note, when it is added to the table, it is <i>not</i> added by reference, 
  but is instead copied into the <a href="../../../../../org/apache/hadoop/hbase/HTableDescriptor.html" title="class in org.apache.hadoop.hbase"><code>HTableDescriptor</code></a>.
  Thus, to change the <code>Configuration</code> we are using for MyConstraint, we need to do this:
 
@@ -323,7 +323,7 @@
  Constraints.setConfiguration(desc, MyConstraint.class, conf);
  </pre></blockquote></div>
  <p>
- This will overwrite the previous configuration for MyConstraint, but <i>not</i> change the order of the
+ This will overwrite the previous configuration for MyConstraint, but <i>not</i> change the order of the 
  constraint nor if it is enabled/disabled.
  <p>
  Note that the same constraint class can be added multiple times to a table without repercussion.
@@ -337,7 +337,7 @@
  </pre></blockquote></div>
  <p>
  This just turns off MyConstraint, but retains the position and the configuration associated with MyConstraint.
- Now, if we want to re-enable the constraint, its just another one-liner:
+ Now, if we want to re-enable the constraint, it's just another one-liner: 
  <div style="background-color: #cccccc">
  <blockquote><pre>
  Constraints.enable(desc, MyConstraint.class);

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateImplementation.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateImplementation.html b/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateImplementation.html
index d3acee4..c4f557b 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateImplementation.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateImplementation.html
@@ -509,7 +509,7 @@ implements <a href="../../../../../org/apache/hadoop/hbase/coprocessor/Aggregate
  It is computed for the combination of column
  family and column qualifier(s) in the given row range as defined in the
  Scan object. In its current implementation, it takes one column family and
- two column qualifiers. The first qualifier is for values column and
+ two column qualifiers. The first qualifier is for the values column and 
 the second qualifier (optional) is for the weight column.</div>
 <dl>
 <dt><strong>Specified by:</strong></dt>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateProtocol.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateProtocol.html b/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateProtocol.html
index 494fbde..91432b3 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateProtocol.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/AggregateProtocol.html
@@ -372,7 +372,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/ipc/CoprocessorProtocol.
  It is computed for the combination of column
  family and column qualifier(s) in the given row range as defined in the
  Scan object. In its current implementation, it takes one column family and
- two column qualifiers. The first qualifier is for values column and
+ two column qualifiers. The first qualifier is for the values column and 
 the second qualifier (optional) is for the weight column.</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>ci</code> - </dd><dd><code>scan</code> - </dd>
 <dt><span class="strong">Returns:</span></dt><dd>Pair</dd>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/BulkDeleteProtocol.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/BulkDeleteProtocol.html b/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/BulkDeleteProtocol.html
index b0c0691..d3c96cf 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/BulkDeleteProtocol.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/BulkDeleteProtocol.html
@@ -104,7 +104,7 @@ extends <a href="../../../../../../org/apache/hadoop/hbase/ipc/CoprocessorProtoc
  // set scan properties(rowkey range, filters, timerange etc).
  HTable ht = ...;
  long noOfDeletedRows = 0L;
- Batch.Call&lt;BulkDeleteProtocol, BulkDeleteResponse&gt; callable =
+ Batch.Call&lt;BulkDeleteProtocol, BulkDeleteResponse&gt; callable = 
      new Batch.Call&lt;BulkDeleteProtocol, BulkDeleteResponse&gt;() {
    public BulkDeleteResponse call(BulkDeleteProtocol instance) throws IOException {
      return instance.deleteRows(scan, BulkDeleteProtocol.DeleteType, timestamp, rowBatchSize);

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/ZooKeeperScanPolicyObserver.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/ZooKeeperScanPolicyObserver.html b/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/ZooKeeperScanPolicyObserver.html
index 06bcbf6..f9fb1d3 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/ZooKeeperScanPolicyObserver.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/coprocessor/example/ZooKeeperScanPolicyObserver.html
@@ -114,7 +114,7 @@ extends <a href="../../../../../../org/apache/hadoop/hbase/coprocessor/BaseRegio
 
  This would be useful for an incremental backup tool, which would indicate the last
  time of a successful backup via ZK and instruct HBase to not delete data that was
- inserted since (based on wall clock time).
+ inserted since (based on wall clock time). 
 
  This implements org.apache.zookeeper.Watcher directly instead of using
  <a href="../../../../../../org/apache/hadoop/hbase/zookeeper/ZooKeeperWatcher.html" title="class in org.apache.hadoop.hbase.zookeeper"><code>ZooKeeperWatcher</code></a>, because RegionObservers come and go and currently

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignExceptionSnare.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignExceptionSnare.html b/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignExceptionSnare.html
index 827a719..0066af8 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignExceptionSnare.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignExceptionSnare.html
@@ -106,7 +106,7 @@ public interface <a href="../../../../../src-html/org/apache/hadoop/hbase/errorh
  <p>
  NOTE: This is very similar to the InterruptedException/interrupt/interrupted pattern.  There,
  the notification state is bound to a Thread.  Using this, applications receive Exceptions in
- the snare.  The snare is referenced and checked by multiple threads which enables exception
+ the snare.  The snare is referenced and checked by multiple threads which enables exception 
  notification in all the involved threads/processes.</div>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/TimeoutException.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/TimeoutException.html b/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/TimeoutException.html
index 54de9de..b28de4b 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/TimeoutException.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/errorhandling/TimeoutException.html
@@ -114,7 +114,7 @@
 public class <a href="../../../../../src-html/org/apache/hadoop/hbase/errorhandling/TimeoutException.html#line.30">TimeoutException</a>
 extends <a href="http://docs.oracle.com/javase/6/docs/api/java/lang/Exception.html?is-external=true" title="class or interface in java.lang">Exception</a></pre>
 <div class="block">Exception for timeout of a task.</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/errorhandling/TimeoutExceptionInjector.html" title="class in org.apache.hadoop.hbase.errorhandling"><code>TimeoutExceptionInjector</code></a>,
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/errorhandling/TimeoutExceptionInjector.html" title="class in org.apache.hadoop.hbase.errorhandling"><code>TimeoutExceptionInjector</code></a>, 
 <a href="../../../../../serialized-form.html#org.apache.hadoop.hbase.errorhandling.TimeoutException">Serialized Form</a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/executor/EventHandler.EventType.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/executor/EventHandler.EventType.html b/0.94/apidocs/org/apache/hadoop/hbase/executor/EventHandler.EventType.html
index 059a0bf..041dcfe 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/executor/EventHandler.EventType.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/executor/EventHandler.EventType.html
@@ -543,7 +543,7 @@ for (EventHandler.EventType c : EventHandler.EventType.values())
 <pre>public static&nbsp;<a href="../../../../../org/apache/hadoop/hbase/executor/EventHandler.EventType.html" title="enum in org.apache.hadoop.hbase.executor">EventHandler.EventType</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/executor/EventHandler.EventType.html#line.87">valueOf</a>(<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;name)</pre>
 <div class="block">Returns the enum constant of this type with the specified name.
 The string must match <i>exactly</i> an identifier used to declare an
-enum constant in this type.  (Extraneous whitespace characters are
+enum constant in this type.  (Extraneous whitespace characters are 
 not permitted.)</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>name</code> - the name of the enum constant to be returned.</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the enum constant with the specified name</dd>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/executor/ExecutorService.ExecutorType.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/executor/ExecutorService.ExecutorType.html b/0.94/apidocs/org/apache/hadoop/hbase/executor/ExecutorService.ExecutorType.html
index 51921fb..fd3c81c 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/executor/ExecutorService.ExecutorType.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/executor/ExecutorService.ExecutorType.html
@@ -364,7 +364,7 @@ for (ExecutorService.ExecutorType c : ExecutorService.ExecutorType.values())
 <pre>public static&nbsp;<a href="../../../../../org/apache/hadoop/hbase/executor/ExecutorService.ExecutorType.html" title="enum in org.apache.hadoop.hbase.executor">ExecutorService.ExecutorType</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/executor/ExecutorService.ExecutorType.html#line.25">valueOf</a>(<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;name)</pre>
 <div class="block">Returns the enum constant of this type with the specified name.
 The string must match <i>exactly</i> an identifier used to declare an
-enum constant in this type.  (Extraneous whitespace characters are
+enum constant in this type.  (Extraneous whitespace characters are 
 not permitted.)</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>name</code> - the name of the enum constant to be returned.</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the enum constant with the specified name</dd>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/BitComparator.BitwiseOp.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/BitComparator.BitwiseOp.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/BitComparator.BitwiseOp.html
index bf3a275..507603d 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/BitComparator.BitwiseOp.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/BitComparator.BitwiseOp.html
@@ -264,7 +264,7 @@ for (BitComparator.BitwiseOp c : BitComparator.BitwiseOp.values())
 <pre>public static&nbsp;<a href="../../../../../org/apache/hadoop/hbase/filter/BitComparator.BitwiseOp.html" title="enum in org.apache.hadoop.hbase.filter">BitComparator.BitwiseOp</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/filter/BitComparator.BitwiseOp.html#line.37">valueOf</a>(<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;name)</pre>
 <div class="block">Returns the enum constant of this type with the specified name.
 The string must match <i>exactly</i> an identifier used to declare an
-enum constant in this type.  (Extraneous whitespace characters are
+enum constant in this type.  (Extraneous whitespace characters are 
 not permitted.)</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>name</code> - the name of the enum constant to be returned.</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the enum constant with the specified name</dd>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/CompareFilter.CompareOp.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/CompareFilter.CompareOp.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/CompareFilter.CompareOp.html
index 5251509..56e705a 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/CompareFilter.CompareOp.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/CompareFilter.CompareOp.html
@@ -324,7 +324,7 @@ for (CompareFilter.CompareOp c : CompareFilter.CompareOp.values())
 <pre>public static&nbsp;<a href="../../../../../org/apache/hadoop/hbase/filter/CompareFilter.CompareOp.html" title="enum in org.apache.hadoop.hbase.filter">CompareFilter.CompareOp</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/filter/CompareFilter.CompareOp.html#line.167">valueOf</a>(<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;name)</pre>
 <div class="block">Returns the enum constant of this type with the specified name.
 The string must match <i>exactly</i> an identifier used to declare an
-enum constant in this type.  (Extraneous whitespace characters are
+enum constant in this type.  (Extraneous whitespace characters are 
 not permitted.)</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>name</code> - the name of the enum constant to be returned.</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the enum constant with the specified name</dd>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/DependentColumnFilter.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/DependentColumnFilter.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/DependentColumnFilter.html
index e9cc0e0..a1f975d 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/DependentColumnFilter.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/DependentColumnFilter.html
@@ -114,7 +114,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/filter/CompareFilter.htm
 <div class="block">A filter for adding inter-column timestamp matching
  Only cells with a correspondingly timestamped entry in
  the target column will be retained
- Not compatible with Scan.setBatch as operations need
+ Not compatible with Scan.setBatch as operations need 
  full rows for correct filtering</div>
 </li>
 </ul>
@@ -441,7 +441,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/filter/CompareFilter.htm
 <pre>public&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/filter/DependentColumnFilter.html#line.87">DependentColumnFilter</a>(byte[]&nbsp;family,
                      byte[]&nbsp;qualifier)</pre>
 <div class="block">Constructor for DependentColumn filter.
- Keyvalues where a keyvalue from target column
+ Keyvalues where a keyvalue from target column 
  with the same timestamp do not exist will be dropped.</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>family</code> - name of target column family</dd><dd><code>qualifier</code> - name of column qualifier</dd></dl>
 </li>
@@ -456,7 +456,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/filter/CompareFilter.htm
                      byte[]&nbsp;qualifier,
                      boolean&nbsp;dropDependentColumn)</pre>
 <div class="block">Constructor for DependentColumn filter.
- Keyvalues where a keyvalue from target column
+ Keyvalues where a keyvalue from target column 
  with the same timestamp do not exist will be dropped.</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>family</code> - name of dependent column family</dd><dd><code>qualifier</code> - name of dependent qualifier</dd><dd><code>dropDependentColumn</code> - whether the dependent columns keyvalues should be discarded</dd></dl>
 </li>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.ReturnCode.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.ReturnCode.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.ReturnCode.html
index f4b971d..4dce715 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.ReturnCode.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.ReturnCode.html
@@ -310,7 +310,7 @@ for (Filter.ReturnCode c : Filter.ReturnCode.values())
 <pre>public static&nbsp;<a href="../../../../../org/apache/hadoop/hbase/filter/Filter.ReturnCode.html" title="enum in org.apache.hadoop.hbase.filter">Filter.ReturnCode</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/filter/Filter.ReturnCode.html#line.172">valueOf</a>(<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;name)</pre>
 <div class="block">Returns the enum constant of this type with the specified name.
 The string must match <i>exactly</i> an identifier used to declare an
-enum constant in this type.  (Extraneous whitespace characters are
+enum constant in this type.  (Extraneous whitespace characters are 
 not permitted.)</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>name</code> - the name of the enum constant to be returned.</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the enum constant with the specified name</dd>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.html
index 0362104..5b8033c 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/Filter.html
@@ -304,7 +304,7 @@ extends org.apache.hadoop.io.Writable</pre>
  If the KeyValue is changed a new KeyValue object must be returned.</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>v</code> - the KeyValue in question</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the changed KeyValue</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/KeyValue.html#shallowCopy()"><code>The transformed KeyValue is what is eventually returned to the
- client. Most filters will return the passed KeyValue unchanged.</code></a>,
+ client. Most filters will return the passed KeyValue unchanged.</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/filter/KeyOnlyFilter.html#transform(org.apache.hadoop.hbase.KeyValue)"><code>for an example of a transformation.</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterBase.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterBase.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterBase.html
index ced8c16..250029d 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterBase.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterBase.html
@@ -375,7 +375,7 @@ implements <a href="../../../../../org/apache/hadoop/hbase/filter/Filter.html" t
 <dd><code><a href="../../../../../org/apache/hadoop/hbase/filter/Filter.html#transform(org.apache.hadoop.hbase.KeyValue)">transform</a></code>&nbsp;in interface&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/filter/Filter.html" title="interface in org.apache.hadoop.hbase.filter">Filter</a></code></dd>
 <dt><span class="strong">Parameters:</span></dt><dd><code>v</code> - the KeyValue in question</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the changed KeyValue</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/KeyValue.html#shallowCopy()"><code>The transformed KeyValue is what is eventually returned to the
- client. Most filters will return the passed KeyValue unchanged.</code></a>,
+ client. Most filters will return the passed KeyValue unchanged.</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/filter/KeyOnlyFilter.html#transform(org.apache.hadoop.hbase.KeyValue)"><code>for an example of a transformation.</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.Operator.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.Operator.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.Operator.html
index 38c85a7..25f1bd6 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.Operator.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.Operator.html
@@ -249,7 +249,7 @@ for (FilterList.Operator c : FilterList.Operator.values())
 <pre>public static&nbsp;<a href="../../../../../org/apache/hadoop/hbase/filter/FilterList.Operator.html" title="enum in org.apache.hadoop.hbase.filter">FilterList.Operator</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/filter/FilterList.Operator.html#line.55">valueOf</a>(<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;name)</pre>
 <div class="block">Returns the enum constant of this type with the specified name.
 The string must match <i>exactly</i> an identifier used to declare an
-enum constant in this type.  (Extraneous whitespace characters are
+enum constant in this type.  (Extraneous whitespace characters are 
 not permitted.)</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>name</code> - the name of the enum constant to be returned.</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the enum constant with the specified name</dd>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.html
index 23f3173..a1e5154 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/FilterList.html
@@ -503,7 +503,7 @@ implements <a href="../../../../../org/apache/hadoop/hbase/filter/Filter.html" t
 <dd><code><a href="../../../../../org/apache/hadoop/hbase/filter/Filter.html#transform(org.apache.hadoop.hbase.KeyValue)">transform</a></code>&nbsp;in interface&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/filter/Filter.html" title="interface in org.apache.hadoop.hbase.filter">Filter</a></code></dd>
 <dt><span class="strong">Parameters:</span></dt><dd><code>v</code> - the KeyValue in question</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the changed KeyValue</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/KeyValue.html#shallowCopy()"><code>The transformed KeyValue is what is eventually returned to the
- client. Most filters will return the passed KeyValue unchanged.</code></a>,
+ client. Most filters will return the passed KeyValue unchanged.</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/filter/KeyOnlyFilter.html#transform(org.apache.hadoop.hbase.KeyValue)"><code>for an example of a transformation.</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/KeyOnlyFilter.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/KeyOnlyFilter.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/KeyOnlyFilter.html
index 2463e18..468c62b 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/KeyOnlyFilter.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/KeyOnlyFilter.html
@@ -253,7 +253,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html"
 <dd><code><a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html#transform(org.apache.hadoop.hbase.KeyValue)">transform</a></code>&nbsp;in class&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html" title="class in org.apache.hadoop.hbase.filter">FilterBase</a></code></dd>
 <dt><span class="strong">Parameters:</span></dt><dd><code>kv</code> - the KeyValue in question</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the changed KeyValue</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/KeyValue.html#shallowCopy()"><code>The transformed KeyValue is what is eventually returned to the
- client. Most filters will return the passed KeyValue unchanged.</code></a>,
+ client. Most filters will return the passed KeyValue unchanged.</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/filter/KeyOnlyFilter.html#transform(org.apache.hadoop.hbase.KeyValue)"><code>for an example of a transformation.</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/SkipFilter.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/SkipFilter.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/SkipFilter.html
index 1119298..1839670 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/SkipFilter.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/SkipFilter.html
@@ -344,7 +344,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html"
 <dd><code><a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html#transform(org.apache.hadoop.hbase.KeyValue)">transform</a></code>&nbsp;in class&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html" title="class in org.apache.hadoop.hbase.filter">FilterBase</a></code></dd>
 <dt><span class="strong">Parameters:</span></dt><dd><code>v</code> - the KeyValue in question</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the changed KeyValue</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/KeyValue.html#shallowCopy()"><code>The transformed KeyValue is what is eventually returned to the
- client. Most filters will return the passed KeyValue unchanged.</code></a>,
+ client. Most filters will return the passed KeyValue unchanged.</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/filter/KeyOnlyFilter.html#transform(org.apache.hadoop.hbase.KeyValue)"><code>for an example of a transformation.</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/WhileMatchFilter.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/WhileMatchFilter.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/WhileMatchFilter.html
index b59a8b7..c8b643c 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/WhileMatchFilter.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/WhileMatchFilter.html
@@ -389,7 +389,7 @@ extends <a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html"
 <dd><code><a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html#transform(org.apache.hadoop.hbase.KeyValue)">transform</a></code>&nbsp;in class&nbsp;<code><a href="../../../../../org/apache/hadoop/hbase/filter/FilterBase.html" title="class in org.apache.hadoop.hbase.filter">FilterBase</a></code></dd>
 <dt><span class="strong">Parameters:</span></dt><dd><code>v</code> - the KeyValue in question</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the changed KeyValue</dd><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../org/apache/hadoop/hbase/KeyValue.html#shallowCopy()"><code>The transformed KeyValue is what is eventually returned to the
- client. Most filters will return the passed KeyValue unchanged.</code></a>,
+ client. Most filters will return the passed KeyValue unchanged.</code></a>, 
 <a href="../../../../../org/apache/hadoop/hbase/filter/KeyOnlyFilter.html#transform(org.apache.hadoop.hbase.KeyValue)"><code>for an example of a transformation.</code></a></dd></dl>
 </li>
 </ul>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/CompareFilter.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/CompareFilter.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/CompareFilter.html
index f8c5e1f..be29b30 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/CompareFilter.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/CompareFilter.html
@@ -103,7 +103,7 @@
 <div class="block">A filter for adding inter-column timestamp matching
  Only cells with a correspondingly timestamped entry in
  the target column will be retained
- Not compatible with Scan.setBatch as operations need
+ Not compatible with Scan.setBatch as operations need 
  full rows for correct filtering</div>
 </td>
 </tr>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
index f74a23c..f9f59de 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
@@ -228,7 +228,7 @@ Input/OutputFormats, a table indexing MapReduce job, and utility</div>
 <div class="block">A filter for adding inter-column timestamp matching
  Only cells with a correspondingly timestamped entry in
  the target column will be retained
- Not compatible with Scan.setBatch as operations need
+ Not compatible with Scan.setBatch as operations need 
  full rows for correct filtering</div>
 </td>
 </tr>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/FilterBase.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/FilterBase.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/FilterBase.html
index d15ad40..e7bccdf 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/FilterBase.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/class-use/FilterBase.html
@@ -135,7 +135,7 @@
 <div class="block">A filter for adding inter-column timestamp matching
  Only cells with a correspondingly timestamped entry in
  the target column will be retained
- Not compatible with Scan.setBatch as operations need
+ Not compatible with Scan.setBatch as operations need 
  full rows for correct filtering</div>
 </td>
 </tr>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/filter/package-summary.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/filter/package-summary.html b/0.94/apidocs/org/apache/hadoop/hbase/filter/package-summary.html
index 6086ea7..bdb3b0c 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/filter/package-summary.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/filter/package-summary.html
@@ -155,7 +155,7 @@
 <div class="block">A filter for adding inter-column timestamp matching
  Only cells with a correspondingly timestamped entry in
  the target column will be retained
- Not compatible with Scan.setBatch as operations need
+ Not compatible with Scan.setBatch as operations need 
  full rows for correct filtering</div>
 </td>
 </tr>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/fs/HFileSystem.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/fs/HFileSystem.html b/0.94/apidocs/org/apache/hadoop/hbase/fs/HFileSystem.html
index c1b6dc9..cb6eb0b 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/fs/HFileSystem.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/fs/HFileSystem.html
@@ -117,7 +117,7 @@
 <pre>public class <a href="../../../../../src-html/org/apache/hadoop/hbase/fs/HFileSystem.html#line.44">HFileSystem</a>
 extends org.apache.hadoop.fs.FilterFileSystem</pre>
 <div class="block">An encapsulation for the FileSystem object that hbase uses to access
- data. This class allows the flexibility of using
+ data. This class allows the flexibility of using  
  separate filesystem objects for reading and writing hfiles and hlogs.
  In future, if we want to make hlogs be in a different filesystem,
  this is the place to make it happen.</div>
@@ -215,7 +215,7 @@ extends org.apache.hadoop.fs.FilterFileSystem</pre>
                                     short&nbsp;replication,
                                     long&nbsp;blockSize,
                                     org.apache.hadoop.util.Progressable&nbsp;progress)</code>
-<div class="block">The org.apache.hadoop.fs.FilterFileSystem does not yet support
+<div class="block">The org.apache.hadoop.fs.FilterFileSystem does not yet support 
  createNonRecursive.</div>
 </td>
 </tr>
@@ -240,7 +240,7 @@ extends org.apache.hadoop.fs.FilterFileSystem</pre>
 <tr class="rowColor">
 <td class="colFirst"><code>org.apache.hadoop.fs.FileSystem</code></td>
 <td class="colLast"><code><strong><a href="../../../../../org/apache/hadoop/hbase/fs/HFileSystem.html#getNoChecksumFs()">getNoChecksumFs</a></strong>()</code>
-<div class="block">Returns the filesystem that is specially setup for
+<div class="block">Returns the filesystem that is specially setup for 
  doing reads from storage.</div>
 </td>
 </tr>
@@ -318,7 +318,7 @@ extends org.apache.hadoop.fs.FilterFileSystem</pre>
 <h4>HFileSystem</h4>
 <pre>public&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/fs/HFileSystem.html#line.91">HFileSystem</a>(org.apache.hadoop.fs.FileSystem&nbsp;fs)</pre>
 <div class="block">Wrap a FileSystem object within a HFileSystem. The noChecksumFs and
- writefs are both set to be the same specified fs.
+ writefs are both set to be the same specified fs. 
  Do not verify hbase-checksums while reading data from filesystem.</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>fs</code> - Set the noChecksumFs and writeFs to this specified filesystem.</dd></dl>
 </li>
@@ -338,8 +338,8 @@ extends org.apache.hadoop.fs.FilterFileSystem</pre>
 <li class="blockList">
 <h4>getNoChecksumFs</h4>
 <pre>public&nbsp;org.apache.hadoop.fs.FileSystem&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/fs/HFileSystem.html#line.104">getNoChecksumFs</a>()</pre>
-<div class="block">Returns the filesystem that is specially setup for
- doing reads from storage. This object avoids doing
+<div class="block">Returns the filesystem that is specially setup for 
+ doing reads from storage. This object avoids doing 
  checksum verifications for reads.</div>
 <dl><dt><span class="strong">Returns:</span></dt><dd>The FileSystem object that can be used to read data
          from files.</dd></dl>
@@ -433,7 +433,7 @@ extends org.apache.hadoop.fs.FilterFileSystem</pre>
                                                          long&nbsp;blockSize,
                                                          org.apache.hadoop.util.Progressable&nbsp;progress)
                                                            throws <a href="http://docs.oracle.com/javase/6/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></pre>
-<div class="block">The org.apache.hadoop.fs.FilterFileSystem does not yet support
+<div class="block">The org.apache.hadoop.fs.FilterFileSystem does not yet support 
  createNonRecursive. This is a hadoop bug and when it is fixed in Hadoop,
  this definition will go away.</div>
 <dl>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/ecb8d8ba/0.94/apidocs/org/apache/hadoop/hbase/io/Reference.Range.html
----------------------------------------------------------------------
diff --git a/0.94/apidocs/org/apache/hadoop/hbase/io/Reference.Range.html b/0.94/apidocs/org/apache/hadoop/hbase/io/Reference.Range.html
index 10ba7a6..6f5b692 100644
--- a/0.94/apidocs/org/apache/hadoop/hbase/io/Reference.Range.html
+++ b/0.94/apidocs/org/apache/hadoop/hbase/io/Reference.Range.html
@@ -250,7 +250,7 @@ for (Reference.Range c : Reference.Range.values())
 <pre>public static&nbsp;<a href="../../../../../org/apache/hadoop/hbase/io/Reference.Range.html" title="enum in org.apache.hadoop.hbase.io">Reference.Range</a>&nbsp;<a href="../../../../../src-html/org/apache/hadoop/hbase/io/Reference.Range.html#line.42">valueOf</a>(<a href="http://docs.oracle.com/javase/6/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;name)</pre>
 <div class="block">Returns the enum constant of this type with the specified name.
 The string must match <i>exactly</i> an identifier used to declare an
-enum constant in this type.  (Extraneous whitespace characters are
+enum constant in this type.  (Extraneous whitespace characters are 
 not permitted.)</div>
 <dl><dt><span class="strong">Parameters:</span></dt><dd><code>name</code> - the name of the enum constant to be returned.</dd>
 <dt><span class="strong">Returns:</span></dt><dd>the enum constant with the specified name</dd>