Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2011/08/02 19:54:30 UTC

[Hadoop Wiki] Update of "HowToContribute" by TomWhite

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToContribute" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=55&rev2=56

Comment:
Post HADOOP-6671 changes

    * Place your class in the {{{src/test}}} tree.
    * {{{TestFileSystem.java}}} and {{{TestMapRed.java}}} are examples of standalone MapReduce-based tests.
    * {{{TestPath.java}}} is an example of a non MapReduce-based test.
+   * You can run all the Common unit tests with {{{mvn test}}}, or a specific unit test with {{{mvn -Dtest=<class name without package prefix> test}}}. Run these commands from the {{{hadoop-trunk}}} directory.
-   * You can run all the unit tests with the command {{{ant test}}}, or you can run a specific unit test with the command {{{ant -Dtestcase=<class name without package prefix> test}}} (for example {{{ant -Dtestcase=TestFileSystem test}}})
+   * For HDFS and MapReduce, you can run all the unit tests with the command {{{ant test}}}, or you can run a specific unit test with the command {{{ant -Dtestcase=<class name without package prefix> test}}} (for example {{{ant -Dtestcase=TestFileSystem test}}})
-    * '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is committed]''' You can run all the Common unit tests with {{{mvn test}}}, or a specific unit test with {{{mvn -Dtest=<class name without package prefix> test}}}.
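  For example, to run the {{{TestPath}}} test mentioned above from the top-level checkout:
  {{{
  cd hadoop-trunk
  mvn -Dtest=TestPath test
  }}}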
  
  ==== Using Ant ====
- Hadoop is built by Ant, a Java building tool.  This section will eventually describe how Ant is used within Hadoop.  To start, simply read a good Ant tutorial.  The following is a good tutorial, though keep in mind that Hadoop isn't structured according to the ways outlined in the tutorial.  Use the tutorial to get a basic understand of Ant but not to understand how Ant is used for Hadoop:
+ Hadoop HDFS and MapReduce are built by Ant, a Java build tool. (Common is built using Maven; see below.) This section will eventually describe how Ant is used within Hadoop.  To start, read a good Ant tutorial.  The following is a solid introduction, though keep in mind that Hadoop isn't structured the way the tutorial outlines.  Use the tutorial to get a basic understanding of Ant, but not to understand how Ant is used for Hadoop:
  
   * Good Ant tutorial: http://i-proving.ca/space/Technologies/Ant+Tutorial
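  As a rough sketch of how this looks in practice (the checkout directory name here is illustrative), the Ant targets used elsewhere on this page are run from the subproject directory:
  {{{
  cd hadoop-hdfs
  ant clean                           # remove previous build output
  ant test                            # run all unit tests
  ant -Dtestcase=TestFileSystem test  # run a single unit test
  }}}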
  
@@ -61, +61 @@

  }}}
  
  ==== Using Maven ====
- '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is committed]'''
  Hadoop Common is built using Maven. You need version 3 or later.
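  You can check the version on your path with:
  {{{
  mvn -version
  }}}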
  
  === Generating a patch ===
@@ -86, +85 @@

  
  Unit test development guidelines: HowToDevelopUnitTests
  
- '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is committed]'''
  For building Hadoop Common with Maven, use the following to run all unit tests and build a distribution. The {{{-Ptest-patch}}} profile will check that no new compiler warnings have been introduced by your patch.
  
  {{{
@@ -104, +102 @@

  }}}
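  A minimal sketch of this kind of invocation, assuming only the {{{-Ptest-patch}}} profile described above (the full command may carry additional flags):
  {{{
  mvn clean install -Ptest-patch
  }}}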
  Examine all public classes you've changed to see that documentation is complete, informative, and properly formatted.  Your patch must not generate any javadoc warnings.
  
+ For Common, use Maven to build the javadoc as follows:
- '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is committed]'''
- Build the javadoc with Maven:
  {{{
  mvn javadoc:javadoc
  firefox hadoop-common/target/site/api/index.html
@@ -177, +174 @@

   * the {{{patch}}} command must support the -E flag
   * you may need to explicitly set ANT_HOME.  Running {{{ant -diagnostics}}} will tell you the default value on your system.
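  For example (the ANT_HOME path is illustrative; use the value {{{ant -diagnostics}}} reports on your system):
  {{{
  ant -diagnostics | grep ant.home
  export ANT_HOME=/usr/share/ant
  }}}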
  
- '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is committed]'''
  For testing a patch in Hadoop Common, use a command like this one, run from the top-level ({{{hadoop-trunk}}}) checkout:
  {{{
  dev-support/test-patch.sh DEVELOPER \
@@ -208, +204 @@

  You may find that you need to modify both the common project and MapReduce or HDFS. Or perhaps you have changed something in common and need to verify that these changes do not break the existing unit tests for HDFS and MapReduce. Hadoop's build system integrates with a local Maven repository to support cross-project development. Use this general workflow for your development:
  
   * Make your changes in common
-  * Run any unit tests there (e.g. 'ant test')
+  * Run any unit tests there (e.g. 'mvn test')
   * ''Publish'' your new common jar to your local mvn repository:<<BR>>
   {{{
- common$ ant clean jar mvn-install
+ hadoop-common$ mvn clean install
  }}}
  . A word of caution: `mvn install` pushes the artifacts into your local Maven repository, which is shared by all your projects.
-  . '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is committed]'''<<BR>>
-  {{{
- hadoop-common$ mvn clean install
- }}}
   * Switch to the dependent project and make any changes there (e.g., that rely on a new API you introduced in common).
   * When you are ready, recompile and test this -- using the local mvn repository instead of the public Hadoop repository:<<BR>>
   {{{
@@ -233, +225 @@

  
  When you believe that your patch is ready to be committed, select the '''Submit Patch''' link on the issue's Jira.  Submitted patches will be automatically tested against "trunk" by [[http://hudson.zones.apache.org/hudson/view/Hadoop/|Hudson]], the project's continuous integration engine.  Upon test completion, Hudson will add a success ("+1") or failure ("-1") message to your issue report in Jira.  If your issue contains multiple patch versions, Hudson tests the last patch uploaded.
  
- Folks should run {{{ant clean test javadoc checkstyle}}} (or {{{mvn clean install javadoc:javadoc checkstyle:checkstyle}}}) before selecting '''Submit Patch'''.  Tests should all pass.  Javadoc should report '''no''' warnings or errors. Checkstyle's error count should not exceed that listed at [[http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/lastSuccessfulBuild/artifact/trunk/build/test/checkstyle-errors.html|Checkstyle Errors]]  Hudson's tests are meant to double-check things, and not be used as a primary patch tester, which would create too much noise on the mailing list and in Jira.  Submitting patches that fail Hudson testing is frowned on, (unless the failure is not actually due to the patch).
+ Folks should run {{{ant clean test javadoc checkstyle}}} (or {{{mvn clean install javadoc:javadoc checkstyle:checkstyle}}} in the case of Common) before selecting '''Submit Patch'''.  Tests should all pass.  Javadoc should report '''no''' warnings or errors. Checkstyle's error count should not exceed that listed at [[http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/lastSuccessfulBuild/artifact/trunk/build/test/checkstyle-errors.html|Checkstyle Errors]].  Hudson's tests are meant to double-check things, not to serve as a primary patch tester, which would create too much noise on the mailing list and in Jira.  Submitting patches that fail Hudson testing is frowned on (unless the failure is not actually due to the patch).
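  For Common, the pre-submission check above is a single Maven invocation from the top-level {{{hadoop-trunk}}} checkout:
  {{{
  cd hadoop-trunk
  mvn clean install javadoc:javadoc checkstyle:checkstyle
  }}}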
  
  If your patch involves performance optimizations, they should be validated by benchmarks that demonstrate an improvement.