Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2012/01/04 11:50:30 UTC

[Hadoop Wiki] Update of "QwertyManiac/BuildingHadoopTrunk" by QwertyManiac

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "QwertyManiac/BuildingHadoopTrunk" page has been changed by QwertyManiac:
http://wiki.apache.org/hadoop/QwertyManiac/BuildingHadoopTrunk

Comment:
Initial draft.

New page:
= Prerequisites for build =

1. A Java JDK - To compile and use Apache Hadoop.
  * Most of us use Oracle's JDK or OpenJDK.
    * OpenJDK - [[http://openjdk.org]]
    * Oracle JDK - [[http://java.com]]
2. Apache Maven (3+) - To build and manage the Apache Hadoop projects and their dependencies.
  * The latest release of Apache Maven ({{{mvn}}}) can be downloaded from [[http://maven.apache.org]].
3. Git or Apache Subversion - To fetch Apache Hadoop sources and manage patches.
  * Git is available via [[http://git-scm.com]]
  * Subversion can be obtained from [[http://subversion.apache.org]]
4. Some spirit is always good to have.
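
Before proceeding, it may help to confirm each of these tools is installed and on your {{{PATH}}}. A quick sanity check (exact version output will vary with your setup) could look like:

{{{
# Verify the build prerequisites are reachable from the shell
java -version    # any recent Oracle JDK or OpenJDK
mvn -version     # should report Apache Maven 3 or newer
git --version    # or: svn --version, if you use Subversion instead
}}}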

= Version 0.24 and upwards =

1. Check out the sources (use any one of the methods below):
  * Using GitHub mirror: {{{git clone git@github.com:apache/hadoop-common.git hadoop}}}
  * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
  * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk hadoop}}}
2. Download and install Google Protobuf 2.4+ on your OS/distribution.
  1. On OS X, you can install Homebrew and run {{{brew install protobuf}}}
  2. On RHEL/CentOS/Fedora, do {{{yum install protobuf-compiler}}}
  3. On Ubuntu, do {{{apt-get install protobuf-compiler}}}
  4. (The list can go on, but you get the idea, and you have access to a web search engine…)
  5. Do ensure the installed version is 2.4 or newer with a {{{protoc --version}}}
3. '''Optional''': Install all the usual build essentials like '''gcc''', '''autoconf''', '''make''', '''zlib''', etc. for various native-code components you may want to hack on.
4. Enter the top-level checkout directory ({{{hadoop}}}) and issue {{{mvn install -DskipTests}}} to kick off the compile; a consolidated example of the whole sequence follows below.
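
Putting the steps together, an end-to-end session (using the Apache Git mirror here; substitute the GitHub or Subversion URL from step 1 if you prefer) might look like:

{{{
# Fetch the trunk sources via the Apache Git mirror
git clone git://git.apache.org/hadoop-common.git hadoop

# Confirm the Protocol Buffers compiler is 2.4 or newer
protoc --version

# Build all modules, skipping the test run
cd hadoop
mvn install -DskipTests
}}}

If the build succeeds, Maven installs the compiled Hadoop module artifacts into your local repository (usually {{{~/.m2/repository}}}), where dependent builds can pick them up.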