Posted to common-commits@hadoop.apache.org by cu...@apache.org on 2008/10/21 23:42:50 UTC

svn commit: r706781 - in /hadoop/core/trunk: CHANGES.txt docs/libhdfs.html docs/libhdfs.pdf src/docs/src/documentation/content/xdocs/libhdfs.xml src/docs/src/documentation/content/xdocs/site.xml

Author: cutting
Date: Tue Oct 21 14:42:50 2008
New Revision: 706781

URL: http://svn.apache.org/viewvc?rev=706781&view=rev
Log:
HADOOP-4105.  Add Forrest documentation for libhdfs.

Added:
    hadoop/core/trunk/docs/libhdfs.html
    hadoop/core/trunk/docs/libhdfs.pdf
    hadoop/core/trunk/src/docs/src/documentation/content/xdocs/libhdfs.xml
Modified:
    hadoop/core/trunk/CHANGES.txt
    hadoop/core/trunk/src/docs/src/documentation/content/xdocs/site.xml

Modified: hadoop/core/trunk/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/CHANGES.txt?rev=706781&r1=706780&r2=706781&view=diff
==============================================================================
--- hadoop/core/trunk/CHANGES.txt (original)
+++ hadoop/core/trunk/CHANGES.txt Tue Oct 21 14:42:50 2008
@@ -523,6 +523,9 @@
     HADOOP-4438. Update forrest documentation to include missing FsShell
     commands. (Suresh Srinivas via cdouglas)
 
+    HADOOP-4105.  Add forrest documentation for libhdfs.
+    (Pete Wyckoff via cutting)
+
   OPTIMIZATIONS
 
     HADOOP-3556. Removed lock contention in MD5Hash by changing the 

Added: hadoop/core/trunk/docs/libhdfs.html
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/docs/libhdfs.html?rev=706781&view=auto
==============================================================================
--- hadoop/core/trunk/docs/libhdfs.html (added)
+++ hadoop/core/trunk/docs/libhdfs.html Tue Oct 21 14:42:50 2008
@@ -0,0 +1,329 @@
+<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
+<html>
+<head>
+<META http-equiv="Content-Type" content="text/html; charset=UTF-8">
+<meta content="Apache Forrest" name="Generator">
+<meta name="Forrest-version" content="0.8">
+<meta name="Forrest-skin-name" content="pelt">
+<meta name="http-equiv" content="Content-Type">
+<meta name="content" content="text/html;">
+<meta name="charset" content="utf-8">
+<title>C API to HDFS: libhdfs</title>
+<link type="text/css" href="skin/basic.css" rel="stylesheet">
+<link media="screen" type="text/css" href="skin/screen.css" rel="stylesheet">
+<link media="print" type="text/css" href="skin/print.css" rel="stylesheet">
+<link type="text/css" href="skin/profile.css" rel="stylesheet">
+<script src="skin/getBlank.js" language="javascript" type="text/javascript"></script><script src="skin/getMenu.js" language="javascript" type="text/javascript"></script><script src="skin/fontsize.js" language="javascript" type="text/javascript"></script>
+<link rel="shortcut icon" href="images/favicon.ico">
+</head>
+<body onload="init()">
+<script type="text/javascript">ndeSetTextSize();</script>
+<div id="top">
+<!--+
+    |breadtrail
+    +-->
+<div class="breadtrail">
+<a href="http://www.apache.org/">Apache</a> &gt; <a href="http://hadoop.apache.org/">Hadoop</a> &gt; <a href="http://hadoop.apache.org/core/">Core</a><script src="skin/breadcrumbs.js" language="JavaScript" type="text/javascript"></script>
+</div>
+<!--+
+    |header
+    +-->
+<div class="header">
+<!--+
+    |start group logo
+    +-->
+<div class="grouplogo">
+<a href="http://hadoop.apache.org/"><img class="logoImage" alt="Hadoop" src="images/hadoop-logo.jpg" title="Apache Hadoop"></a>
+</div>
+<!--+
+    |end group logo
+    +-->
+<!--+
+    |start Project Logo
+    +-->
+<div class="projectlogo">
+<a href="http://hadoop.apache.org/core/"><img class="logoImage" alt="Hadoop" src="images/core-logo.gif" title="Scalable Computing Platform"></a>
+</div>
+<!--+
+    |end Project Logo
+    +-->
+<!--+
+    |start Search
+    +-->
+<div class="searchbox">
+<form action="http://www.google.com/search" method="get" class="roundtopsmall">
+<input value="hadoop.apache.org" name="sitesearch" type="hidden"><input onFocus="getBlank (this, 'Search the site with google');" size="25" name="q" id="query" type="text" value="Search the site with google">&nbsp; 
+                    <input name="Search" value="Search" type="submit">
+</form>
+</div>
+<!--+
+    |end search
+    +-->
+<!--+
+    |start Tabs
+    +-->
+<ul id="tabs">
+<li>
+<a class="unselected" href="http://hadoop.apache.org/core/">Project</a>
+</li>
+<li>
+<a class="unselected" href="http://wiki.apache.org/hadoop">Wiki</a>
+</li>
+<li class="current">
+<a class="selected" href="index.html">Hadoop 0.20 Documentation</a>
+</li>
+</ul>
+<!--+
+    |end Tabs
+    +-->
+</div>
+</div>
+<div id="main">
+<div id="publishedStrip">
+<!--+
+    |start Subtabs
+    +-->
+<div id="level2tabs"></div>
+<!--+
+    |end Endtabs
+    +-->
+<script type="text/javascript"><!--
+document.write("Last Published: " + document.lastModified);
+//  --></script>
+</div>
+<!--+
+    |breadtrail
+    +-->
+<div class="breadtrail">
+
+             &nbsp;
+           </div>
+<!--+
+    |start Menu, mainarea
+    +-->
+<!--+
+    |start Menu
+    +-->
+<div id="menu">
+<div onclick="SwitchMenu('menu_selected_1.1', 'skin/')" id="menu_selected_1.1Title" class="menutitle" style="background-image: url('skin/images/chapter_open.gif');">Documentation</div>
+<div id="menu_selected_1.1" class="selectedmenuitemgroup" style="display: block;">
+<div class="menuitem">
+<a href="index.html">Overview</a>
+</div>
+<div class="menuitem">
+<a href="quickstart.html">Hadoop Quick Start</a>
+</div>
+<div class="menuitem">
+<a href="cluster_setup.html">Hadoop Cluster Setup</a>
+</div>
+<div class="menuitem">
+<a href="mapred_tutorial.html">Hadoop Map/Reduce Tutorial</a>
+</div>
+<div class="menuitem">
+<a href="commands_manual.html">Hadoop Command Guide</a>
+</div>
+<div class="menuitem">
+<a href="hdfs_shell.html">Hadoop FS Shell Guide</a>
+</div>
+<div class="menuitem">
+<a href="distcp.html">Hadoop DistCp Guide</a>
+</div>
+<div class="menuitem">
+<a href="native_libraries.html">Hadoop Native Libraries</a>
+</div>
+<div class="menuitem">
+<a href="streaming.html">Hadoop Streaming</a>
+</div>
+<div class="menuitem">
+<a href="hadoop_archives.html">Hadoop Archives</a>
+</div>
+<div class="menuitem">
+<a href="hdfs_user_guide.html">HDFS User Guide</a>
+</div>
+<div class="menuitem">
+<a href="hdfs_design.html">HDFS Architecture</a>
+</div>
+<div class="menuitem">
+<a href="hdfs_permissions_guide.html">HDFS Admin Guide: Permissions</a>
+</div>
+<div class="menuitem">
+<a href="hdfs_quota_admin_guide.html">HDFS Admin Guide: Quotas</a>
+</div>
+<div class="menuitem">
+<a href="SLG_user_guide.html">HDFS Utilities</a>
+</div>
+<div class="menupage">
+<div class="menupagetitle">HDFS C API</div>
+</div>
+<div class="menuitem">
+<a href="hod_user_guide.html">HOD User Guide</a>
+</div>
+<div class="menuitem">
+<a href="hod_admin_guide.html">HOD Admin Guide</a>
+</div>
+<div class="menuitem">
+<a href="hod_config_guide.html">HOD Config Guide</a>
+</div>
+<div class="menuitem">
+<a href="capacity_scheduler.html">Capacity Scheduler</a>
+</div>
+<div class="menuitem">
+<a href="api/index.html">API Docs</a>
+</div>
+<div class="menuitem">
+<a href="jdiff/changes.html">API Changes</a>
+</div>
+<div class="menuitem">
+<a href="http://wiki.apache.org/hadoop/">Wiki</a>
+</div>
+<div class="menuitem">
+<a href="http://wiki.apache.org/hadoop/FAQ">FAQ</a>
+</div>
+<div class="menuitem">
+<a href="releasenotes.html">Release Notes</a>
+</div>
+<div class="menuitem">
+<a href="changes.html">Change Log</a>
+</div>
+</div>
+<div id="credit"></div>
+<div id="roundbottom">
+<img style="display: none" class="corner" height="15" width="15" alt="" src="skin/images/rc-b-l-15-1body-2menu-3menu.png"></div>
+<!--+
+  |alternative credits
+  +-->
+<div id="credit2"></div>
+</div>
+<!--+
+    |end Menu
+    +-->
+<!--+
+    |start content
+    +-->
+<div id="content">
+<div title="Portable Document Format" class="pdflink">
+<a class="dida" href="libhdfs.pdf"><img alt="PDF -icon" src="skin/images/pdfdoc.gif" class="skin"><br>
+        PDF</a>
+</div>
+<h1>C API to HDFS: libhdfs</h1>
+<div id="minitoc-area">
+<ul class="minitoc">
+<li>
+<a href="#C+API+to+HDFS%3A+libhdfs">C API to HDFS: libhdfs</a>
+</li>
+<li>
+<a href="#The+APIs">The APIs</a>
+</li>
+<li>
+<a href="#A+sample+program">A sample program</a>
+</li>
+<li>
+<a href="#How+to+link+with+the+library">How to link with the library</a>
+</li>
+<li>
+<a href="#Common+problems">Common problems</a>
+</li>
+<li>
+<a href="#libhdfs+is+thread+safe">libhdfs is thread safe</a>
+</li>
+</ul>
+</div>
+
+<a name="N10019"></a><a name="C+API+to+HDFS%3A+libhdfs"></a>
+<h2 class="h3">C API to HDFS: libhdfs</h2>
+<div class="section">
+<p>
+libhdfs is a JNI-based C API for Hadoop's Distributed File System (HDFS). It provides C wrappers for a subset of the HDFS APIs, which can be used to manipulate HDFS files and the filesystem. libhdfs is part of the Hadoop distribution and comes pre-compiled as ${HADOOP_HOME}/libhdfs/libhdfs.so.
+</p>
+</div>
+
+<a name="N10023"></a><a name="The+APIs"></a>
+<h2 class="h3">The APIs</h2>
+<div class="section">
+<p>
+The libhdfs APIs are a subset of the <a href="api/org/apache/hadoop/fs/FileSystem.html">Hadoop FileSystem APIs</a>.
+</p>
+<p>
+The header file for libhdfs describes each API in detail; it is available at ${HADOOP_HOME}/src/c++/libhdfs/hdfs.h.
+</p>
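+<p>
+For orientation, a few representative declarations are sketched below. They are indicative only, so consult hdfs.h for the authoritative signatures in your release.
+</p>
+<pre class="code">
+hdfsFS   hdfsConnect(const char* host, tPort port);
+hdfsFile hdfsOpenFile(hdfsFS fs, const char* path, int flags,
+                      int bufferSize, short replication, tSize blocksize);
+tSize    hdfsRead(hdfsFS fs, hdfsFile file, void* buffer, tSize length);
+tSize    hdfsWrite(hdfsFS fs, hdfsFile file, const void* buffer, tSize length);
+int      hdfsFlush(hdfsFS fs, hdfsFile file);
+int      hdfsCloseFile(hdfsFS fs, hdfsFile file);
+int      hdfsDisconnect(hdfsFS fs);
+</pre>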
+</div>
+
+<a name="N10034"></a><a name="A+sample+program"></a>
+<h2 class="h3">A sample program</h2>
+<div class="section">
+<pre class="code">
+#include "hdfs.h" 
+
+int main(int argc, char **argv) {
+
+    hdfsFS fs = hdfsConnect("default", 0);
+    const char* writePath = "/tmp/testfile.txt";
+    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
+    if(!writeFile) {
+          fprintf(stderr, "Failed to open %s for writing!\n", writePath);
+          exit(-1);
+    }
+    char* buffer = "Hello, World!";
+    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer)+1);
+    if (hdfsFlush(fs, writeFile)) {
+           fprintf(stderr, "Failed to 'flush' %s\n", writePath); 
+          exit(-1);
+    }
+   hdfsCloseFile(fs, writeFile);
+}
+
+</pre>
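+<p>
+Reading the file back is symmetric: open with O_RDONLY and call hdfsRead. The following is a minimal illustrative sketch (not part of the sample above); it assumes the file was written by that sample, including its trailing '\0':
+</p>
+<pre class="code">
+#include "hdfs.h"
+
+#include &lt;fcntl.h&gt;   /* O_RDONLY */
+#include &lt;stdio.h&gt;
+#include &lt;stdlib.h&gt;
+
+int main(int argc, char **argv) {
+
+    hdfsFS fs = hdfsConnect("default", 0);
+    const char* readPath = "/tmp/testfile.txt";
+    hdfsFile readFile = hdfsOpenFile(fs, readPath, O_RDONLY, 0, 0, 0);
+    if (!readFile) {
+        fprintf(stderr, "Failed to open %s for reading!\n", readPath);
+        exit(-1);
+    }
+    char buffer[32];
+    /* hdfsRead returns the number of bytes actually read, or -1 on error. */
+    tSize num_read_bytes = hdfsRead(fs, readFile, (void*)buffer, sizeof(buffer));
+    /* Printing as a string is safe only because the writer stored the '\0'. */
+    fprintf(stdout, "Read %d bytes: %s\n", (int)num_read_bytes, buffer);
+    hdfsCloseFile(fs, readFile);
+    hdfsDisconnect(fs);
+    return 0;
+}
+</pre>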
+</div>
+
+
+<a name="N1003F"></a><a name="How+to+link+with+the+library"></a>
+<h2 class="h3">How to link with the library</h2>
+<div class="section">
+<p>
+See the Makefile for hdfs_test.c in the libhdfs source directory (${HADOOP_HOME}/src/c++/libhdfs/Makefile), or use a command along the lines of:
+gcc above_sample.c -I${HADOOP_HOME}/src/c++/libhdfs -L${HADOOP_HOME}/libhdfs -lhdfs -o above_sample
+Because libhdfs is JNI based, the resulting binary must also be able to locate the JVM's shared library (libjvm) at run time, typically via LD_LIBRARY_PATH.
+</p>
+</div>
+
+<a name="N10049"></a><a name="Common+problems"></a>
+<h2 class="h3">Common problems</h2>
+<div class="section">
+<p>
+The most common problem is that the CLASSPATH is not set properly when calling a program that uses libhdfs. Make sure you set it to include all the Hadoop jars needed to run Hadoop itself. Currently, there is no way to programmatically generate the classpath, but a good bet is to include all the jar files in ${HADOOP_HOME} and ${HADOOP_HOME}/lib, as well as the right configuration directory (the one containing hadoop-site.xml).
+</p>
+</div>
+
+<a name="N10053"></a><a name="libhdfs+is+thread+safe"></a>
+<h2 class="h3">libhdfs is thread safe</h2>
+<div class="section">
+<p>Concurrency and Hadoop FS "handles" - the Hadoop FS implementation includes a handle cache keyed on the URI of the namenode and the user connecting. So, all calls to hdfsConnect will return the same handle, but calls to hdfsConnectAsUser with different users will return different handles. Since HDFS client handles are completely thread safe, this has no bearing on concurrency.
+</p>
+<p>Concurrency and libhdfs/JNI - the libhdfs calls to JNI should always create thread-local storage, so (in theory) libhdfs is as thread safe as the underlying calls to the Hadoop FS.
+</p>
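+<p>
+As a hypothetical illustration (not part of the original text; the paths are arbitrary), two threads can each call hdfsConnect, receive the same cached handle, and safely write different files:
+</p>
+<pre class="code">
+#include "hdfs.h"
+
+#include &lt;fcntl.h&gt;
+#include &lt;pthread.h&gt;
+#include &lt;string.h&gt;
+
+/* Each thread writes its own file through the shared, cached FS handle. */
+static void* write_one(void* arg) {
+    const char* path = (const char*) arg;
+    hdfsFS fs = hdfsConnect("default", 0);  /* both threads get the same cached handle */
+    hdfsFile f = hdfsOpenFile(fs, path, O_WRONLY|O_CREAT, 0, 0, 0);
+    if (f) {
+        char msg[] = "hello from a thread\n";
+        hdfsWrite(fs, f, msg, strlen(msg));
+        hdfsCloseFile(fs, f);
+    }
+    return NULL;
+}
+
+int main(void) {
+    pthread_t t1, t2;
+    pthread_create(&amp;t1, NULL, write_one, "/tmp/thread-a.txt");
+    pthread_create(&amp;t2, NULL, write_one, "/tmp/thread-b.txt");
+    pthread_join(t1, NULL);
+    pthread_join(t2, NULL);
+    return 0;
+}
+</pre>
+<p>
+Compile as in the linking section above, adding -lpthread.
+</p>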
+</div>
+
+</div>
+<!--+
+    |end content
+    +-->
+<div class="clearboth">&nbsp;</div>
+</div>
+<div id="footer">
+<!--+
+    |start bottomstrip
+    +-->
+<div class="lastmodified">
+<script type="text/javascript"><!--
+document.write("Last Published: " + document.lastModified);
+//  --></script>
+</div>
+<div class="copyright">
+        Copyright &copy;
+         2008 <a href="http://www.apache.org/licenses/">The Apache Software Foundation.</a>
+</div>
+<!--+
+    |end bottomstrip
+    +-->
+</div>
+</body>
+</html>

Added: hadoop/core/trunk/docs/libhdfs.pdf
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/docs/libhdfs.pdf?rev=706781&view=auto
==============================================================================
--- hadoop/core/trunk/docs/libhdfs.pdf (added)
+++ hadoop/core/trunk/docs/libhdfs.pdf Tue Oct 21 14:42:50 2008
@@ -0,0 +1,335 @@
+[binary PDF data omitted: a 3-page PDF 1.3 file produced by FOP 0.20.5, rendering the libhdfs documentation with the outline: C API to HDFS: libhdfs; The APIs; A sample program; How to link with the library; Common problems; libhdfs is thread safe]

Added: hadoop/core/trunk/src/docs/src/documentation/content/xdocs/libhdfs.xml
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/docs/src/documentation/content/xdocs/libhdfs.xml?rev=706781&view=auto
==============================================================================
--- hadoop/core/trunk/src/docs/src/documentation/content/xdocs/libhdfs.xml (added)
+++ hadoop/core/trunk/src/docs/src/documentation/content/xdocs/libhdfs.xml Tue Oct 21 14:42:50 2008
@@ -0,0 +1,96 @@
+<?xml version="1.0"?>
+<!--
+  Copyright 2002-2004 The Apache Software Foundation
+
+  Licensed under the Apache License, Version 2.0 (the "License");
+  you may not use this file except in compliance with the License.
+  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+
+<!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V2.0//EN"
+          "http://forrest.apache.org/dtd/document-v20.dtd">
+
+
+<document>
+<header>
+<title>C API to HDFS: libhdfs</title>
+<meta name="http-equiv">Content-Type</meta>
+<meta name="content">text/html;</meta>
+<meta name="charset">utf-8</meta>
+</header>
+<body>
+<section>
+<title>C API to HDFS: libhdfs</title>
+
+<p>
+libhdfs is a JNI-based C API for Hadoop's Distributed File System (HDFS). It provides C wrappers for a subset of the HDFS APIs, which can be used to manipulate HDFS files and the filesystem. libhdfs is part of the Hadoop distribution and comes pre-compiled as ${HADOOP_HOME}/libhdfs/libhdfs.so.
+</p>
+
+</section>
+<section>
+<title>The APIs</title>
+
+<p>
+The libhdfs APIs are a subset of the <a href="api/org/apache/hadoop/fs/FileSystem.html">Hadoop FileSystem APIs</a>.
+</p>
+<p>
+The header file for libhdfs describes each API in detail; it is available at ${HADOOP_HOME}/src/c++/libhdfs/hdfs.h.
+</p>
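+<p>
+For orientation, a few representative declarations are sketched below. They are indicative only, so consult hdfs.h for the authoritative signatures in your release.
+</p>
+<source>
+hdfsFS   hdfsConnect(const char* host, tPort port);
+hdfsFile hdfsOpenFile(hdfsFS fs, const char* path, int flags,
+                      int bufferSize, short replication, tSize blocksize);
+tSize    hdfsRead(hdfsFS fs, hdfsFile file, void* buffer, tSize length);
+tSize    hdfsWrite(hdfsFS fs, hdfsFile file, const void* buffer, tSize length);
+int      hdfsFlush(hdfsFS fs, hdfsFile file);
+int      hdfsCloseFile(hdfsFS fs, hdfsFile file);
+int      hdfsDisconnect(hdfsFS fs);
+</source>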
+</section>
+<section>
+<title>A sample program</title>
+
+<source>
+#include "hdfs.h"
+
+#include &lt;fcntl.h&gt;   /* O_WRONLY, O_CREAT */
+#include &lt;stdio.h&gt;   /* fprintf */
+#include &lt;stdlib.h&gt;  /* exit */
+#include &lt;string.h&gt;  /* strlen */
+
+int main(int argc, char **argv) {
+
+    /* Connect to the "default" filesystem named in the Hadoop configuration. */
+    hdfsFS fs = hdfsConnect("default", 0);
+    if (!fs) {
+        fprintf(stderr, "Failed to connect to hdfs!\n");
+        exit(-1);
+    }
+    const char* writePath = "/tmp/testfile.txt";
+    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
+    if (!writeFile) {
+        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
+        exit(-1);
+    }
+    char* buffer = "Hello, World!";
+    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer)+1);
+    if (num_written_bytes == -1) {
+        fprintf(stderr, "Failed to write %s!\n", writePath);
+        exit(-1);
+    }
+    if (hdfsFlush(fs, writeFile)) {
+        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
+        exit(-1);
+    }
+    hdfsCloseFile(fs, writeFile);
+    hdfsDisconnect(fs);
+    return 0;
+}
+</source>
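+<p>
+Reading the file back is symmetric: open with O_RDONLY and call hdfsRead. The following is a minimal illustrative sketch (not part of the sample above); it assumes the file was written by that sample, including its trailing '\0':
+</p>
+<source>
+#include "hdfs.h"
+
+#include &lt;fcntl.h&gt;   /* O_RDONLY */
+#include &lt;stdio.h&gt;
+#include &lt;stdlib.h&gt;
+
+int main(int argc, char **argv) {
+
+    hdfsFS fs = hdfsConnect("default", 0);
+    const char* readPath = "/tmp/testfile.txt";
+    hdfsFile readFile = hdfsOpenFile(fs, readPath, O_RDONLY, 0, 0, 0);
+    if (!readFile) {
+        fprintf(stderr, "Failed to open %s for reading!\n", readPath);
+        exit(-1);
+    }
+    char buffer[32];
+    /* hdfsRead returns the number of bytes actually read, or -1 on error. */
+    tSize num_read_bytes = hdfsRead(fs, readFile, (void*)buffer, sizeof(buffer));
+    /* Printing as a string is safe only because the writer stored the '\0'. */
+    fprintf(stdout, "Read %d bytes: %s\n", (int)num_read_bytes, buffer);
+    hdfsCloseFile(fs, readFile);
+    hdfsDisconnect(fs);
+    return 0;
+}
+</source>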
+</section>
+
+<section>
+<title>How to link with the library</title>
+<p>
+See the Makefile for hdfs_test.c in the libhdfs source directory (${HADOOP_HOME}/src/c++/libhdfs/Makefile), or use a command along the lines of:
+gcc above_sample.c -I${HADOOP_HOME}/src/c++/libhdfs -L${HADOOP_HOME}/libhdfs -lhdfs -o above_sample
+Because libhdfs is JNI based, the resulting binary must also be able to locate the JVM's shared library (libjvm) at run time, typically via LD_LIBRARY_PATH.
+</p>
+</section>
+<section>
+<title>Common problems</title>
+<p>
+The most common problem is that the CLASSPATH is not set properly when calling a program that uses libhdfs. Make sure you set it to include all the Hadoop jars needed to run Hadoop itself. Currently, there is no way to programmatically generate the classpath, but a good bet is to include all the jar files in ${HADOOP_HOME} and ${HADOOP_HOME}/lib, as well as the right configuration directory (the one containing hadoop-site.xml).
+</p>
+</section>
+<section>
+<title>libhdfs is thread safe</title>
+<p>Concurrency and Hadoop FS "handles" - the Hadoop FS implementation includes a handle cache keyed on the URI of the namenode and the user connecting. So, all calls to hdfsConnect will return the same handle, but calls to hdfsConnectAsUser with different users will return different handles. Since HDFS client handles are completely thread safe, this has no bearing on concurrency.
+</p>
+<p>Concurrency and libhdfs/JNI - the libhdfs calls to JNI should always create thread-local storage, so (in theory) libhdfs is as thread safe as the underlying calls to the Hadoop FS.
+</p>
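+<p>
+As a hypothetical illustration (not part of the original text; the paths are arbitrary), two threads can each call hdfsConnect, receive the same cached handle, and safely write different files:
+</p>
+<source>
+#include "hdfs.h"
+
+#include &lt;fcntl.h&gt;
+#include &lt;pthread.h&gt;
+#include &lt;string.h&gt;
+
+/* Each thread writes its own file through the shared, cached FS handle. */
+static void* write_one(void* arg) {
+    const char* path = (const char*) arg;
+    hdfsFS fs = hdfsConnect("default", 0);  /* both threads get the same cached handle */
+    hdfsFile f = hdfsOpenFile(fs, path, O_WRONLY|O_CREAT, 0, 0, 0);
+    if (f) {
+        char msg[] = "hello from a thread\n";
+        hdfsWrite(fs, f, msg, strlen(msg));
+        hdfsCloseFile(fs, f);
+    }
+    return NULL;
+}
+
+int main(void) {
+    pthread_t t1, t2;
+    pthread_create(&amp;t1, NULL, write_one, "/tmp/thread-a.txt");
+    pthread_create(&amp;t2, NULL, write_one, "/tmp/thread-b.txt");
+    pthread_join(t1, NULL);
+    pthread_join(t2, NULL);
+    return 0;
+}
+</source>
+<p>
+Compile as in the linking section above, adding -lpthread.
+</p>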
+</section>
+</body>
+</document>

Modified: hadoop/core/trunk/src/docs/src/documentation/content/xdocs/site.xml
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/docs/src/documentation/content/xdocs/site.xml?rev=706781&r1=706780&r2=706781&view=diff
==============================================================================
--- hadoop/core/trunk/src/docs/src/documentation/content/xdocs/site.xml (original)
+++ hadoop/core/trunk/src/docs/src/documentation/content/xdocs/site.xml Tue Oct 21 14:42:50 2008
@@ -47,6 +47,7 @@
     <hdfs      label="HDFS Admin Guide: Permissions"    href="hdfs_permissions_guide.html" />
     <hdfs      label="HDFS Admin Guide: Quotas" href="hdfs_quota_admin_guide.html" />
     <fs        label="HDFS Utilities"  href="SLG_user_guide.html" />
+    <libhdfs   label="HDFS C API"         href="libhdfs.html" />
     <hod-user-guide label="HOD User Guide" href="hod_user_guide.html"/>
     <hod-admin-guide label="HOD Admin Guide" href="hod_admin_guide.html"/>
     <hod-config-guide label="HOD Config Guide" href="hod_config_guide.html"/>