Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2011/10/06 15:11:50 UTC

[Hadoop Wiki] Update of "HadoopDfsReadWriteExample" by SteveLoughran

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HadoopDfsReadWriteExample" page has been changed by SteveLoughran:
http://wiki.apache.org/hadoop/HadoopDfsReadWriteExample?action=diff&rev1=6&rev2=7

Comment:
some preamble on classpaths

  Reading from and writing to Hadoop DFS is no different from reading from and writing to any other file system. The example [[attachment:HadoopDFSFileReadWrite.java]] reads a file from HDFS and writes it to another file on HDFS (in effect, a copy command). 
  
  The Hadoop [[http://hadoop.apache.org/core/api/org/apache/hadoop/fs/FileSystem.html|FileSystem]] API describes the methods available to the user. Let us walk through the code to understand how it is done.
+ 
+ == Before we Begin ==
+ 
+ Any Java program can talk to HDFS, provided that the program is set up correctly (a minimal sanity check follows this list):
+  1. The classpath contains the Hadoop JAR files and their client-side dependencies. (We are deliberately vague here, because those dependencies vary from version to version.)
+  1. The Hadoop configuration files are on the classpath.
+  1. Log4J is on the classpath, along with a '''log4j.properties''' resource, or commons-logging is preconfigured to use a different logging framework.
+ 
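+ A quick way to confirm the first two points is a tiny program that loads the default Configuration and prints what it resolved. This is only a sketch: the class name '''ClasspathCheck''' is purely illustrative, and the property key differs between Hadoop versions ('''fs.default.name''' in older releases, '''fs.defaultFS''' in newer ones), so both are printed:
+ 
+ {{{
+ import org.apache.hadoop.conf.Configuration;
+ import org.apache.hadoop.fs.FileSystem;
+ 
+ // Classpath sanity check: if the Hadoop JARs and configuration files are on
+ // the classpath, this prints the configured default filesystem (for example
+ // hdfs://namenode:8020/) rather than the local file:/// fallback.
+ public class ClasspathCheck {
+   public static void main(String[] args) throws Exception {
+     Configuration conf = new Configuration();
+     System.out.println("fs.default.name = " + conf.get("fs.default.name"));
+     System.out.println("fs.defaultFS    = " + conf.get("fs.defaultFS"));
+     System.out.println("FileSystem class: " + FileSystem.get(conf).getClass().getName());
+   }
+ }
+ }}}
+ 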
+ == Step By Step ==
  
  Create a [[http://hadoop.apache.org/core/api/org/apache/hadoop/fs/FileSystem.html|FileSystem]] instance by passing a new Configuration object. Please note that the following example code assumes that the Configuration object will automatically load the '''hadoop-default.xml''' and '''hadoop-site.xml''' configuration files. You may need to explicitly add these configuration files as resources, as in the sketch below, if you are not running inside the Hadoop runtime environment.
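  A minimal sketch of this step, assuming the configuration files have to be added explicitly; the class name '''DfsConnect''' and the '''/path/to/conf''' locations are placeholders, not part of the attached example:
  
  {{{
  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;
  
  public class DfsConnect {
    public static void main(String[] args) throws Exception {
      Configuration conf = new Configuration();
      // Only needed when running outside the Hadoop runtime environment and the
      // configuration files are not already on the classpath; replace the
      // placeholder paths with wherever your installation keeps them.
      conf.addResource(new Path("/path/to/conf/hadoop-default.xml"));
      conf.addResource(new Path("/path/to/conf/hadoop-site.xml"));
  
      // FileSystem.get() returns the filesystem named by the configuration,
      // e.g. an HDFS client when the default filesystem points at a NameNode.
      FileSystem fs = FileSystem.get(conf);
      System.out.println("Connected to: " + fs.getUri());
    }
  }
  }}}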