Posted to hdfs-commits@hadoop.apache.org by cn...@apache.org on 2014/08/23 07:30:21 UTC

svn commit: r1619967 - in /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs: CHANGES.txt src/site/apt/LibHdfs.apt.vm

Author: cnauroth
Date: Sat Aug 23 05:30:21 2014
New Revision: 1619967

URL: http://svn.apache.org/r1619967
Log:
HDFS-4852. libhdfs documentation is out of date. Contributed by Chris Nauroth.

Modified:
    hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
    hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/site/apt/LibHdfs.apt.vm

Modified: hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt?rev=1619967&r1=1619966&r2=1619967&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt Sat Aug 23 05:30:21 2014
@@ -633,6 +633,8 @@ Release 2.6.0 - UNRELEASED
     HDFS-6829. DFSAdmin refreshSuperUserGroupsConfiguration failed in
     security cluster (zhaoyunjiong via Arpit Agarwal)
 
+    HDFS-4852. libhdfs documentation is out of date. (cnauroth)
+
 Release 2.5.0 - 2014-08-11
 
   INCOMPATIBLE CHANGES

Modified: hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/site/apt/LibHdfs.apt.vm
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/site/apt/LibHdfs.apt.vm?rev=1619967&r1=1619966&r2=1619967&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/site/apt/LibHdfs.apt.vm (original)
+++ hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/site/apt/LibHdfs.apt.vm Sat Aug 23 05:30:21 2014
@@ -26,14 +26,17 @@ C API libhdfs
    (HDFS). It provides C APIs to a subset of the HDFS APIs to manipulate
    HDFS files and the filesystem. libhdfs is part of the Hadoop
    distribution and comes pre-compiled in
-   <<<${HADOOP_PREFIX}/libhdfs/libhdfs.so>>> .
+   <<<${HADOOP_HDFS_HOME}/lib/native/libhdfs.so>>> .  libhdfs is compatible with
+   Windows and can be built on Windows by running <<<mvn compile>>> within the
+   <<<hadoop-hdfs-project/hadoop-hdfs>>> directory of the source tree.
 
 * The APIs
 
-   The libhdfs APIs are a subset of: {{{hadoop fs APIs}}}.
+   The libhdfs APIs are a subset of the
+   {{{../../api/org/apache/hadoop/fs/FileSystem.html}Hadoop FileSystem APIs}}.
 
    The header file for libhdfs describes each API in detail and is
-   available in <<<${HADOOP_PREFIX}/src/c++/libhdfs/hdfs.h>>>
+   available in <<<${HADOOP_HDFS_HOME}/include/hdfs.h>>>.
 
 * A Sample Program
 
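The hunk below touches only the tail of this sample program (re-indenting the
hdfsCloseFile call). For context, here is a self-contained sketch along the same
lines, using the standard libhdfs calls declared in hdfs.h; the "default"
connection string, the /tmp/testfile.txt path, and the buffer contents are
illustrative:

----
#include "hdfs.h"   /* installed under ${HADOOP_HDFS_HOME}/include */

#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv) {
    /* "default" resolves fs.defaultFS from the configuration on the CLASSPATH. */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) {
        fprintf(stderr, "Failed to connect to HDFS\n");
        exit(-1);
    }

    const char* writePath = "/tmp/testfile.txt";
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
    if (!writeFile) {
        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
        exit(-1);
    }

    const char* buffer = "Hello, World!";
    tSize num_written = hdfsWrite(fs, writeFile, (void*)buffer,
                                  (tSize)strlen(buffer) + 1);
    if (num_written < 0) {
        fprintf(stderr, "Failed to write %s\n", writePath);
        exit(-1);
    }
    if (hdfsFlush(fs, writeFile)) {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }
    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    return 0;
}
----

Compile and link it as described in the "How To Link With The Library" section
further down in this file.
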
@@ -55,24 +58,28 @@ C API libhdfs
                fprintf(stderr, "Failed to 'flush' %s\n", writePath);
               exit(-1);
         }
-       hdfsCloseFile(fs, writeFile);
+        hdfsCloseFile(fs, writeFile);
     }
 ----
 
 * How To Link With The Library
 
-   See the Makefile for <<<hdfs_test.c>>> in the libhdfs source directory
-   (<<<${HADOOP_PREFIX}/src/c++/libhdfs/Makefile>>>) or something like:
-   <<<gcc above_sample.c -I${HADOOP_PREFIX}/src/c++/libhdfs -L${HADOOP_PREFIX}/libhdfs -lhdfs -o above_sample>>>
+   See the CMake file for <<<test_libhdfs_ops.c>>> in the libhdfs source
+   directory (<<<hadoop-hdfs-project/hadoop-hdfs/src/CMakeLists.txt>>>) or
+   something like:
+   <<<gcc above_sample.c -I${HADOOP_HDFS_HOME}/include -L${HADOOP_HDFS_HOME}/lib/native -lhdfs -o above_sample>>>
 
 * Common Problems
 
    The most common problem is the <<<CLASSPATH>>> is not set properly when
    calling a program that uses libhdfs. Make sure you set it to all the
-   Hadoop jars needed to run Hadoop itself. Currently, there is no way to
-   programmatically generate the classpath, but a good bet is to include
-   all the jar files in <<<${HADOOP_PREFIX}>>> and <<<${HADOOP_PREFIX}/lib>>> as well
-   as the right configuration directory containing <<<hdfs-site.xml>>>
+   Hadoop jars needed to run Hadoop itself as well as the right configuration
+   directory containing <<<hdfs-site.xml>>>.  It is not valid to use wildcard
+   syntax for specifying multiple jars.  It may be useful to run
+   <<<hadoop classpath --glob>>> or <<<hadoop classpath --jar <path>>>> to
+   generate the correct classpath for your deployment.  See
+   {{{../hadoop-common/CommandsManual.html#classpath}Hadoop Commands Reference}}
+   for more information on this command.
 
 * Thread Safe
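As an aside on the "Common Problems" paragraph above: when the CLASSPATH is
missing or incomplete, the failure typically shows up as hdfsConnect returning
NULL, since libhdfs cannot load the Hadoop classes it needs. A minimal sketch
(the error message and exit code are illustrative) that surfaces this early:

----
#include "hdfs.h"

#include <stdio.h>

int main(void) {
    /* Typically fails and returns NULL if the JVM cannot be started or the
       Hadoop classes and configuration are not on the CLASSPATH. */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) {
        fprintf(stderr, "hdfsConnect failed; check CLASSPATH\n");
        return 1;
    }
    hdfsDisconnect(fs);
    return 0;
}
----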