Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2012/02/03 04:48:37 UTC

[Hadoop Wiki] Update of "QwertyManiac/BuildingHadoopTrunk" by QwertyManiac

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "QwertyManiac/BuildingHadoopTrunk" page has been changed by QwertyManiac:
http://wiki.apache.org/hadoop/QwertyManiac/BuildingHadoopTrunk?action=diff&rev1=4&rev2=5

Comment:
Eclipse changes

    5. Do ensure the version is right by running {{{protoc --version}}}
  3. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''', '''automake''', '''make''', '''zlib''', etc. for various native-code components you may want to hack on.
  4. Enter the top level checkout directory ({{{hadoop}}}) and issue {{{mvn install -DskipTests}}} to kick off the compile.
+ 5. If you want to generate Eclipse project files, run: {{{mvn eclipse:eclipse}}}.
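 Put together, a trunk build session could look roughly like the sketch below (a minimal consolidation of the steps above; paths assume the {{{hadoop}}} checkout directory):
 {{{
 # Verify the protobuf compiler is the expected version
 protoc --version

 # Compile all modules from the top-level checkout, skipping tests
 cd hadoop
 mvn install -DskipTests

 # Optionally generate Eclipse project files
 mvn eclipse:eclipse
 }}}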
  
  = Building branch-0.23 =
  
@@ -34, +35 @@

  
  1. Checkout the sources (Use any method below):
    * Using GitHub mirror: {{{git clone git@github.com:apache/hadoop-common.git hadoop}}}
-     * Checkout the branch-0.22 branch once this is done: {{{cd hadoop; git checkout branch-0.22}}}
+     * Checkout the branch-0.23 branch once this is done: {{{cd hadoop; git checkout branch-0.23}}}
    * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
-     * Checkout the branch-0.22 branch once this is done: {{{cd hadoop; git checkout branch-0.22}}}
+     * Checkout the branch-0.23 branch once this is done: {{{cd hadoop; git checkout branch-0.23}}}
    * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.23 hadoop}}}
+ 2. If you want to generate Eclipse project files, run: {{{mvn eclipse:eclipse}}}.
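 As a rough end-to-end sketch of the above, using the Apache Git mirror (substitute the GitHub mirror or the Subversion URL if you prefer):
 {{{
 # Clone the sources and switch to branch-0.23
 git clone git://git.apache.org/hadoop-common.git hadoop
 cd hadoop
 git checkout branch-0.23

 # Optionally generate Eclipse project files
 mvn eclipse:eclipse
 }}}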
  
  = Building branch-0.22 =
  
@@ -54, +56 @@

   1. For instance, to build the "mapred" project, begin by entering its directory: {{{cd hadoop/mapreduce}}}.
    2. To then compile the whole project, run: {{{ant compile}}}.
    3. The above instructions can be repeated for {{{hadoop/common}}} and {{{hadoop/hdfs}}} project directories.
+ 3. If you want to generate Eclipse project files, run {{{ant eclipse}}} under each project's root directory.
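 For example, the per-project builds could be run roughly as below (a minimal sketch; the {{{mapreduce}}} directory name for the "mapred" project is an assumption, adjust it to your tree):
 {{{
 # Build each split project with Ant
 cd hadoop/common
 ant compile      # compile the project
 ant eclipse      # optionally generate Eclipse project files

 cd ../hdfs
 ant compile
 ant eclipse

 cd ../mapreduce  # directory name assumed for the mapred project
 ant compile
 ant eclipse
 }}}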
  
  = Building branch-0.21 =
  
@@ -75, +78 @@

    * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-1 hadoop}}}
  2. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''', '''automake''', '''make''', '''zlib''', etc. for various native-code components you may want to hack on.
  3. The source code all lies under the same project directory, so you just need to issue an Ant build: {{{cd hadoop; ant compile}}}
+ 4. If you want to generate Eclipse project files, run {{{ant eclipse}}} from the same top-level directory.
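 Put together, a branch-1 build could look roughly like this (a minimal sketch of the steps above, using the Subversion checkout):
 {{{
 # Check out the branch-1 sources
 svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-1 hadoop

 # Single project tree: one Ant build from the top level
 cd hadoop
 ant compile

 # Optionally generate Eclipse project files
 ant eclipse
 }}}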