Posted to commits@hbase.apache.org by st...@apache.org on 2010/12/08 22:08:09 UTC

svn commit: r1043676 - /hbase/trunk/src/docbkx/book.xml

Author: stack
Date: Wed Dec  8 21:08:08 2010
New Revision: 1043676

URL: http://svn.apache.org/viewvc?rev=1043676&view=rev
Log:
Added note on how low xceivers fails

Modified:
    hbase/trunk/src/docbkx/book.xml

Modified: hbase/trunk/src/docbkx/book.xml
URL: http://svn.apache.org/viewvc/hbase/trunk/src/docbkx/book.xml?rev=1043676&r1=1043675&r2=1043676&view=diff
==============================================================================
--- hbase/trunk/src/docbkx/book.xml (original)
+++ hbase/trunk/src/docbkx/book.xml Wed Dec  8 21:08:08 2010
@@ -399,6 +399,12 @@ be running to use Hadoop's scripts to ma
       </para>
       <para>Be sure to restart your HDFS after making the above
       configuration.</para>
+      <para>Not having this configuration in place makes for strange-looking
+          failures. Eventually you will see a complaint in the datanode logs
+          about the xcievers limit being exceeded, but on the run up to this,
+          one manifestation is complaints about missing blocks.  For example:
+          <code>10/12/08 20:10:31 INFO hdfs.DFSClient: Could not obtain block blk_XXXXXXXXXXXXXXXXXXXXXX_YYYYYYYY from any node: java.io.IOException: No live nodes contain current block. Will get new block locations from namenode and retry...</code>
+      </para>
       </section>
 
 <section xml:id="windows">
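For context, the configuration the new note refers to is the datanode xceiver limit in hdfs-site.xml. A minimal sketch of the relevant fragment (the value 4096 is illustrative; note that the property name uses Hadoop's historical misspelling "xcievers"):

```xml
<!-- hdfs-site.xml on each datanode; restart HDFS after changing. -->
<property>
  <name>dfs.datanode.max.xcievers</name>
  <!-- Upper bound on concurrent data-transfer threads per datanode.
       4096 is a commonly recommended value for HBase; the default is
       much lower and can be exhausted under HBase load. -->
  <value>4096</value>
</property>
```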