Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2013/10/25 11:39:11 UTC

[Hadoop Wiki] Update of "HowToSetupYourDevelopmentEnvironment" by SteveLoughran

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToSetupYourDevelopmentEnvironment" page has been changed by SteveLoughran:
https://wiki.apache.org/hadoop/HowToSetupYourDevelopmentEnvironment?action=diff&rev1=31&rev2=32

Comment:
make sure the docs are in sync w/ mvn build and current structure

  
  This page describes how to get your environment set up; it is IDE agnostic.
  
- ''this article is out of date -it covers Hadoop 1.x, not the restructured and maven-based Hadoop 2.x build''
- 
  = Requirements =
-  * Java 6
+  * Java 6 or 7
-  * Ant
+  * Maven
   * Your favorite IDE
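  
  A quick way to confirm the toolchain is in place (a minimal sketch; the exact version strings will vary with your installation):
  
  {{{
  # verify that a suitable JDK and Maven are on the PATH
  java -version    # expect 1.6 or 1.7
  mvn -version     # the Hadoop 2.x build needs Maven 3.x
  }}}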
  
  = Setup Your Development Environment in Linux =
  
  The instructions below describe how to get an environment set up using the command line to build, manage source control, and test.  These instructions are therefore IDE independent.  Take a look at EclipseEnvironment for instructions on how to configure Eclipse to build, manage source control, and test.  If you prefer IntelliJ IDEA, then take a look [[HadoopUnderIDEA|here]].
  
-  * Choose a good place to put your code.  You will eventually use your source code to run Hadoop, so choose wisely.  I chose /code/hadoop.
+  * Choose a good place to put your code.  You will eventually use your source code to run Hadoop, so choose wisely.  For example, ~/code/hadoop.
-  * Get the source.  This is documented here: HowToContribute.  Put the source in /code/hadoop (or whatever you chose) so that you have /code/hadoop/hadoop-core-trunk
+  * Get the source.  This is documented in HowToContribute.  Put the source in ~/code/hadoop (or whatever you chose) so that you have ~/code/hadoop/hadoop-common.
-  * cd into ''hadoop-core-trunk'', or whatever you named the directory
+  * cd into ''hadoop-common'', or whatever you named the directory
-  * attempt to run ''ant test''
+  * attempt to run ''mvn install'' (a command-line sketch of these steps follows this list)
    *  If you get any strange errors (other than JUnit test failures and errors), then consult the ''Build Errors'' section below.
+  * follow GettingStartedWithHadoop to learn how to run Hadoop.
-  * run ''ant'' to compile (this may not be necessary if you've already run ''ant test'')
-  * follow GettingStartedWithHadoop or the instructions below to learn how to run Hadoop (use this guide if you use Ubuntu: http://wiki.apache.org/hadoop/Running_Hadoop_On_Ubuntu_Linux_%28Single-Node_Cluster%29)
-   *  Use the hadoop-core-trunk folder just as you would a downloaded version of Hadoop (symlink hadoop-core-trunk to hadoop)
   *  If you run into any problems, refer to the ''Runtime Errors'' section below, along with the troubleshooting document at TroubleShooting.
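  
  As a rough end-to-end sketch of the steps above (the Git mirror URL here is an assumption; treat HowToContribute as the authoritative source for checkout instructions):
  
  {{{
  mkdir -p ~/code/hadoop
  cd ~/code/hadoop
  # clone the source; verify the URL against HowToContribute
  git clone git://git.apache.org/hadoop-common.git
  cd hadoop-common
  # build everything and run the tests; expect this to take a while
  mvn install
  }}}
  
  Running {{{mvn install -DskipTests}}} first is a common way to get a faster initial build before committing to the full test suite.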
  
  = Run HDFS in pseudo-distributed mode from the dev tree =
@@ -123, +119 @@

  
  {{{Exception in thread "main" java.lang.AssertionError: Missing tools.jar at: /Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home/Classes/classes.jar. Expression: file.exists()}}}
  
- This happens because one of the modules used in the Hadoop build expects {{classes.jar}} to be in a location it no longer is on Oracle Java 7+ on OS/X. See [https://issues.apache.org/jira/browse/HADOOP-9350|HADOOP-9350]
+ This happens because one of the modules used in the Hadoop build expects {{{classes.jar}}} to be in a location where it no longer exists with Oracle Java 7+ on OS X. See [[https://issues.apache.org/jira/browse/HADOOP-9350|HADOOP-9350]].
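  
  A commonly reported workaround (an unofficial sketch; substitute your installed JDK version for {{{jdk1.7.0_45.jdk}}}) is to symlink the real {{{tools.jar}}} into the location the build expects:
  
  {{{
  JDK_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home
  # create the Classes directory the build is looking for
  sudo mkdir -p "$JDK_HOME/Classes"
  # point classes.jar at the JDK's tools.jar
  sudo ln -s "$JDK_HOME/lib/tools.jar" "$JDK_HOME/Classes/classes.jar"
  }}}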
  
  = Runtime Errors =