Posted to common-dev@hadoop.apache.org by 朱韬 <ry...@163.com> on 2011/02/25 02:41:13 UTC

Problem with compiling and deploying customized Hadoop

Hi guys,
      To meet the needs of my current project, I have to modify the scheduler policy. So I checked out the source code from http://svn.apache.org/repos/asf/hadoop/mapreduce/trunk/. Then I modified some code and compiled it using this script:
      #!/bin/bash
      # Build environment for the ant build
      export JAVA_HOME=/usr/share/jdk1.6.0_14
      export CFLAGS=-m64
      export CXXFLAGS=-m64
      export ANT_HOME=/opt/apache-ant-1.8.2
      export PATH=$PATH:$ANT_HOME/bin
      # Build the release tarball, including native code and Forrest-generated docs
      ant -Dversion=0.21.0 -Dcompile.native=true -Dforrest.home=/home/hadoop/apache-forrest-0.9 clean tar
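      (For reference, a minimal sketch of the checkout step that precedes this script, assuming the trunk URL mentioned above; the local working-copy name is illustrative:)

      svn checkout http://svn.apache.org/repos/asf/hadoop/mapreduce/trunk/ mapreduce
      cd mapreduce
      # ... then run the build script above from this directory ...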
      Everything was fine up to this point. Then I replaced hadoop-mapred-0.21.0.jar, hadoop-mapred-0.21.0-sources.jar, hadoop-mapred-examples-0.21.0.jar, hadoop-mapred-test-0.21.0.jar, and hadoop-mapred-tools-0.21.0.jar in the Release 0.21.0 install with the jar files compiled in the step above. I also added my scheduler jar to lib. When starting the customized Hadoop, I encountered the problems below:
      starting namenode, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-namenode-hdt1.hypercloud.ict.out
10.61.0.143: starting datanode, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-datanode-hdt1.hypercloud.ict.out
10.61.0.7: starting datanode, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-datanode-hdt2.hypercloud.ict.out
10.61.0.6: starting datanode, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-datanode-hdt0.hypercloud.ict.out
10.61.0.143: starting secondarynamenode, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-secondarynamenode-hdt1.hypercloud.ict.out
starting jobtracker, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-jobtracker-hdt1.hypercloud.ict.out
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/security/RefreshUserMappingsProtocol
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:56)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
10.61.0.143: starting tasktracker, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-tasktracker-hdt1.hypercloud.ict.out
10.61.0.6: starting tasktracker, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-tasktracker-hdt0.hypercloud.ict.out
10.61.0.7: starting tasktracker, logging to /home/hadoop/hadoop-0.21.0/logs/hadoop-hadoop-tasktracker-hdt2.hypercloud.ict.out
10.61.0.143: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/io/SecureIOUtils$AlreadyExistsException
10.61.0.143: Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SecureIOUtils$AlreadyExistsException
10.61.0.143:    at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
10.61.0.143:    at java.security.AccessController.doPrivileged(Native Method)
10.61.0.143:    at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
10.61.0.143:    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
10.61.0.143:    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
10.61.0.143:    at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
10.61.0.143:    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
10.61.0.143: Could not find the main class: org.apache.hadoop.mapred.TaskTracker.  Program will exit.
10.61.0.7: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/io/SecureIOUtils$AlreadyExistsException
10.61.0.7: Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SecureIOUtils$AlreadyExistsException
10.61.0.7:      at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
10.61.0.7:      at java.security.AccessController.doPrivileged(Native Method)
10.61.0.7:      at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
10.61.0.7:      at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
10.61.0.7:      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
10.61.0.7:      at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
10.61.0.7:      at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
10.61.0.7: Could not find the main class: org.apache.hadoop.mapred.TaskTracker.  Program will exit.
10.61.0.6: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/io/SecureIOUtils$AlreadyExistsException
10.61.0.6: Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SecureIOUtils$AlreadyExistsException
10.61.0.6:      at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
10.61.0.6:      at java.security.AccessController.doPrivileged(Native Method)
10.61.0.6:      at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
10.61.0.6:      at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
10.61.0.6:      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
10.61.0.6:      at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
10.61.0.6:      at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
10.61.0.6: Could not find the main class: org.apache.hadoop.mapred.TaskTracker.  Program will exit.
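      (One way to narrow this down, sketched under the assumption that the 0.21.0 release ships its common classes in hadoop-common-0.21.0.jar at the top of the install directory, is to check whether the missing classes are actually on the classpath:)

      cd /home/hadoop/hadoop-0.21.0
      # Look for the classes named in the NoClassDefFoundError messages.
      # (hadoop-common-0.21.0.jar is an assumption about the jar name/location.)
      jar tf hadoop-common-0.21.0.jar | grep RefreshUserMappingsProtocol
      jar tf hadoop-common-0.21.0.jar | grep 'SecureIOUtils\$AlreadyExistsException'
      # No output suggests the rebuilt hadoop-mapred jars were compiled
      # against a newer hadoop-common than the one in this release.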
         Thank you.
                                                                                                             zhutao


Re: Problem with compiling and deploying customized Hadoop

Posted by Allen Wittenauer <aw...@linkedin.com>.
On Feb 24, 2011, at 5:41 PM, 朱韬 wrote:

> Hi guys,
>      To meet the needs of my current project, I have to modify the scheduler policy. So I checked out the source code from http://svn.apache.org/repos/asf/hadoop/mapreduce/trunk/. Then I modified some code and compiled it using this script:

	Up here you say trunk...

>     ant -Dversion=0.21.0

	Down here you say 0.21.

	Did you grab the wrong branch?
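
	(If the branch mismatch is the problem, one possible fix is to rebuild from the 0.21 branch rather than trunk. A sketch, assuming the branch lives at branches/branch-0.21 under the same repository; the exact branch path is an assumption worth verifying with "svn ls":)

      # Check out the 0.21 branch instead of trunk, then rerun the build.
      svn checkout http://svn.apache.org/repos/asf/hadoop/mapreduce/branches/branch-0.21/ mapreduce-0.21
      cd mapreduce-0.21
      # Same ant invocation as in the original post.
      ant -Dversion=0.21.0 -Dcompile.native=true -Dforrest.home=/home/hadoop/apache-forrest-0.9 clean tar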