Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2011/08/03 00:59:10 UTC
[Hadoop Wiki] Update of "EclipseEnvironment" by TomWhite
Dear Wiki user,
You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.
The "EclipseEnvironment" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/EclipseEnvironment?action=diff&rev1=42&rev2=43
Comment:
Updated following Mavenization of Hadoop Common (HADOOP-6671)
This document (currently) assumes you already have Eclipse downloaded, installed, and configured to your liking.
== Quick Start ==
- We will begin by downloading the Hadoop source. The hadoop-common source tree has three subfolders(subprojects) underneath it that you will see after you pull down the source code. hdfs, common, and mapreduce.
+ We will begin by downloading the Hadoop source. The hadoop-common source tree has three subprojects underneath it that you will see after you pull down the source code: hadoop-common, hdfs, and mapreduce.
Let's begin by getting the latest source from GitHub (please note there is a time delay in replicating changes from the Apache svn repository to GitHub).
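As a concrete sketch of this step, the clone might look like the following. The mirror URL is an assumption based on the github.com/apache naming of the era; check the current mirror list (or HowToContribute) before using it.

```shell
# Clone the Hadoop Common source tree from the GitHub mirror.
# NOTE: the URL below is an assumption; verify against the
# current Apache/GitHub mirror list before cloning.
git clone git://github.com/apache/hadoop-common.git hadoop-common
cd hadoop-common
```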
@@ -18, +18 @@
From a Hadoop checkout (see HowToContribute) in your Eclipse base directory, type the following (assuming you're in the hadoop-common top-level directory):
{{{
- cd common; ant compile eclipse
+ mvn test -DskipTests
+ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true
- cd ../; cd hdfs; ant compile eclipse
+ cd hdfs; ant compile eclipse
cd ../; cd mapreduce; ant compile eclipse
}}}
*Note: If the mapreduce compile fails, try compiling just the core: "ant compile-core eclipse"
- Then in Eclipse:
+ Then in Eclipse
+ For Common
+ * File -> Import...
+ * Choose "Existing Projects into Workspace"
+ * Select the top-level Hadoop directory as the root directory
+ * Select the hadoop-annotations, hadoop-assemblies, and hadoop-common projects
+ * Click "Finish"
+ * To get the projects to build cleanly:
+ ** Add target/src/generated-src/test/java as a source directory for hadoop-common
+ ** You may have to [[http://stackoverflow.com/questions/860187/access-restriction-on-class-due-to-restriction-on-required-library-rt-jar|add then remove]] the JRE System Library to avoid errors due to access restrictions
+
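The generated-source step above amounts to adding one source entry to the project's Eclipse classpath. A minimal sketch of what that entry looks like in hadoop-common/.classpath (the path is taken from the step above; the entries mvn eclipse:eclipse generates around it will differ in your checkout):

```xml
<!-- Hypothetical fragment of hadoop-common/.classpath: registers the
     generated test sources as an additional source folder. Adjust the
     path if your build writes generated sources elsewhere. -->
<classpath>
  <classpathentry kind="src" path="target/src/generated-src/test/java"/>
</classpath>
```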
+ For HDFS and MapReduce
* File -> New Project...
* Choose the "Java Project" from the wizard
* Enter the project name corresponding to the checkout directory, e.g. `hadoop-common-trunk`