Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2013/10/25 11:14:38 UTC

[Hadoop Wiki] Update of "HowToContribute" by SteveLoughran

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToContribute" page has been changed by SteveLoughran:
https://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=87&rev2=88

Comment:
minor updates related to JVM versions

  
  ==== Build Tools ====
  
- To build the code, install (as well as the programs needed to run Hadoop on Windows, if that is your development platform)
+ To build the code, install the following (in addition to the programs needed to build Hadoop on Windows, if that is your development platform):
   * [[http://maven.apache.org/|Apache Maven]]
   * [[http://java.com/|Oracle Java 6 or 7]], or [[http://openjdk.java.net/|OpenJDK]]
  These should also be on your PATH; test by executing {{{mvn}}} and {{{javac}}} respectively.
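  For example, a quick sanity check that both tools are installed and on the PATH (a minimal sketch; the exact version output depends on your installation):
  {{{
  mvn -version
  javac -version
  }}}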
@@ -43, +43 @@

  
  On Linux, you need the tools to create the native libraries.
  
- For CentOS and redhat:
+ For RHEL (and hence also CentOS):
  {{{
  yum -y install lzo-devel zlib-devel gcc autoconf automake libtool
  }}}
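  With those packages in place, the native libraries are typically built by enabling the {{{native}}} Maven profile (a hedged sketch; the exact profile and flags can vary between Hadoop versions):
  {{{
  mvn package -Pnative -DskipTests
  }}}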
@@ -268, +268 @@

  hadoop-common$ mvn clean install -DskipTests
  }}}
   . A word of caution: `mvn install` pushes the artifacts into your local Maven repository, which is shared by all your projects.
-  * Switch to the dependent project and make any changes there (e.g., that rely on a new API you introduced in common).
+  * Switch to the dependent project and make any changes there (e.g., changes that rely on a new API you introduced in hadoop-common).
   * Finally, create separate patches for your common and hdfs/mapred changes, and file them as separate JIRA issues associated with the appropriate projects.
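  As an illustrative sketch of that workflow (assuming a git checkout; the directory names and the NNNN issue numbers below are placeholders):
  {{{
  # rebuild hadoop-common and publish it to your local Maven repository
  hadoop-common$ mvn clean install -DskipTests
  # rebuild the dependent project against the freshly installed artifacts
  hadoop-hdfs$ mvn clean package -DskipTests
  # one patch per project; NNNN is a placeholder for the real JIRA issue number
  hadoop-common$ git diff -- . > HADOOP-NNNN.patch
  hadoop-hdfs$ git diff -- . > HDFS-NNNN.patch
  }}}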
  
  === Contributing your work ===