Posted to user@hadoop.apache.org by 闫昆 <ya...@gmail.com> on 2013/06/20 10:23:11 UTC

Error running the pi example

Hi All,
I installed CDH 4.3 and hit an error when running the pi example.
Thank you for your help.
The error is as follows:

[hadoop@hydra0001 mapreduce]$ hadoop jar
hadoop-mapreduce-examples-2.0.0-cdh4.3.0.jar pi 10 10
Number of Maps  = 10
Samples per Map = 10
13/06/20 16:06:38 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Wrote input for Map #5
Wrote input for Map #6
Wrote input for Map #7
Wrote input for Map #8
Wrote input for Map #9
Starting Job
13/06/20 16:06:39 INFO service.AbstractService:
Service:org.apache.hadoop.yarn.client.YarnClientImpl is inited.
13/06/20 16:06:39 INFO service.AbstractService:
Service:org.apache.hadoop.yarn.client.YarnClientImpl is started.
13/06/20 16:06:40 INFO input.FileInputFormat: Total input paths to process
: 10
13/06/20 16:06:40 INFO mapreduce.JobSubmitter: number of splits:10
13/06/20 16:06:40 WARN conf.Configuration: mapred.jar is deprecated.
Instead, use mapreduce.job.jar
13/06/20 16:06:40 WARN conf.Configuration:
mapred.map.tasks.speculative.execution is deprecated. Instead, use
mapreduce.map.speculative
13/06/20 16:06:40 WARN conf.Configuration: mapred.reduce.tasks is
deprecated. Instead, use mapreduce.job.reduces
13/06/20 16:06:40 WARN conf.Configuration: mapred.output.value.class is
deprecated. Instead, use mapreduce.job.output.value.class
13/06/20 16:06:40 WARN conf.Configuration:
mapred.reduce.tasks.speculative.execution is deprecated. Instead, use
mapreduce.reduce.speculative
13/06/20 16:06:40 WARN conf.Configuration: mapreduce.map.class is
deprecated. Instead, use mapreduce.job.map.class
13/06/20 16:06:40 WARN conf.Configuration: mapred.job.name is deprecated.
Instead, use mapreduce.job.name
13/06/20 16:06:40 WARN conf.Configuration: mapreduce.reduce.class is
deprecated. Instead, use mapreduce.job.reduce.class
13/06/20 16:06:40 WARN conf.Configuration: mapreduce.inputformat.class is
deprecated. Instead, use mapreduce.job.inputformat.class
13/06/20 16:06:40 WARN conf.Configuration: mapred.input.dir is deprecated.
Instead, use mapreduce.input.fileinputformat.inputdir
13/06/20 16:06:40 WARN conf.Configuration: mapred.output.dir is deprecated.
Instead, use mapreduce.output.fileoutputformat.outputdir
13/06/20 16:06:40 WARN conf.Configuration: mapreduce.outputformat.class is
deprecated. Instead, use mapreduce.job.outputformat.class
13/06/20 16:06:40 WARN conf.Configuration: mapred.map.tasks is deprecated.
Instead, use mapreduce.job.maps
13/06/20 16:06:40 WARN conf.Configuration: mapred.output.key.class is
deprecated. Instead, use mapreduce.job.output.key.class
13/06/20 16:06:40 WARN conf.Configuration: mapred.working.dir is
deprecated. Instead, use mapreduce.job.working.dir
13/06/20 16:06:40 INFO mapreduce.JobSubmitter: Submitting tokens for job:
job_1371710258290_0003
13/06/20 16:06:40 INFO client.YarnClientImpl: Submitted application
application_1371710258290_0003 to ResourceManager at hydra0001/
192.5.1.50:8032
13/06/20 16:06:40 INFO mapreduce.Job: The url to track the job:
http://hydra0001:8088/proxy/application_1371710258290_0003/
13/06/20 16:06:40 INFO mapreduce.Job: Running job: job_1371710258290_0003
13/06/20 16:06:42 INFO mapreduce.Job: Job job_1371710258290_0003 running in
uber mode : false
13/06/20 16:06:42 INFO mapreduce.Job:  map 0% reduce 0%
13/06/20 16:06:42 INFO mapreduce.Job: Job job_1371710258290_0003 failed
with state FAILED due to: Application application_1371710258290_0003 failed
1 times due to AM Container for appattempt_1371710258290_0003_000001 exited
with  exitCode: 1 due to:
.Failing this attempt.. Failing the application.
13/06/20 16:06:42 INFO mapreduce.Job: Counters: 0
Job Finished in 3.185 seconds
java.io.FileNotFoundException: File does not exist:
hdfs://hydra0001/user/hadoop/QuasiMonteCarlo_TMP_3_141592654/out/reduce-out
        at
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:824)
        at
org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1704)
        at
org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1728)
        at
org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:314)
        at
org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:351)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at
org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:360)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at
org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
        at
org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
        at
org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:68)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
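
The FileNotFoundException above looks like a secondary symptom: the AM container exited with code 1 before the job produced any output (note "Counters: 0"), so the reduce-out file was never written. A minimal way to confirm that, using the path from the trace above:

# list the job's temporary output directory; with the failed attempt it should be missing or empty
hadoop fs -ls hdfs://hydra0001/user/hadoop/QuasiMonteCarlo_TMP_3_141592654/out/

The more useful diagnostic is the container log itself, shown next.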


The container error log is as follows:

Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/yarn/service/CompositeService

	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.yarn.service.CompositeService
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
	... 13 more
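
In the Hadoop 2.0.x line (CDH4), org.apache.hadoop.yarn.service.CompositeService should be provided by the hadoop-yarn-common jar, so this NoClassDefFoundError usually means that jar is not on the classpath the YARN processes see. A minimal check, assuming the install path used in the env files below (the exact jar location may differ in a CDH layout):

# show what the hadoop launcher itself puts on the classpath
hadoop classpath | tr ':' '\n' | grep -i yarn

# locate the yarn-common jar under the install directory
find /opt/module/hadoop-2.0.0-cdh4.3.0 -name 'hadoop-yarn-common-*.jar'

If the jar does not show up in the hadoop classpath output, yarn.application.classpath in yarn-site.xml (or the YARN_* variables below) is the usual place to correct it.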


yarn-env.sh

export HADOOP_FREFIX=/opt/module/hadoop-2.0.0-cdh4.3.0
export HADOOP_COMMON_HOME=${HADOOP_FREFIX}
export HADOOP_HDFS_HOME=${HADOOP_FREFIX}
export PATH=$PATH:$HADOOP_FREFIX/bin
export PATH=$PATH:$HADOOP_FREFIX/sbin
export HADOOP_MAPRED_HOME=${HADOOP_FREFIX}
export YARN_HOME=${HADOOP_FREFIX}
export HADOOP_CONF_HOME=${HADOOP_FREFIX}/etc/hadoop
export YARN_CONF_DIR=${HADOOP_FREFIX}/etc/hadoop
export JAVA_HOME=/opt/module/jdk1.7.0_15

hadoop-env.sh

export HADOOP_FREFIX=/opt/module/hadoop-2.0.0-cdh4.3.0
export HADOOP_COMMON_HOME=${HADOOP_FREFIX}
export HADOOP_HDFS_HOME=${HADOOP_FREFIX}
export PATH=$PATH:$HADOOP_FREFIX/bin
export PATH=$PATH:$HADOOP_FREFIX/sbin
export HADOOP_MAPRED_HOME=${HADOOP_FREFIX}
export YARN_HOME=${HADOOP_FREFIX}
export HADOOP_CONF_HOME=${HADOOP_FREFIX}/etc/hadoop
export YARN_CONF_DIR=${HADOOP_FREFIX}/etc/hadoop
export JAVA_HOME=/opt/module/jdk1.7.0_15

The Hadoop environment variables are as follows:

export HADOOP_FREFIX=/opt/module/hadoop-2.0.0-cdh4.3.0
export HADOOP_COMMON_HOME=${HADOOP_FREFIX}
export HADOOP_HDFS_HOME=${HADOOP_FREFIX}
export PATH=$PATH:$HADOOP_FREFIX/bin
export PATH=$PATH:$HADOOP_FREFIX/sbin
export HADOOP_MAPRED_HOME=${HADOOP_FREFIX}
export YARN_HOME=${HADOOP_FREFIX}
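
Note that every file above exports HADOOP_FREFIX rather than HADOOP_PREFIX, and HADOOP_CONF_HOME rather than HADOOP_CONF_DIR. The blocks are self-consistent, but HADOOP_PREFIX and HADOOP_CONF_DIR are the names the stock Hadoop 2.x scripts read. If those were the intended variables, a corrected yarn-env.sh might look like this sketch (same values as above, only the variable names changed):

# sketch of yarn-env.sh assuming HADOOP_PREFIX / HADOOP_CONF_DIR were intended
export HADOOP_PREFIX=/opt/module/hadoop-2.0.0-cdh4.3.0
export HADOOP_COMMON_HOME=${HADOOP_PREFIX}
export HADOOP_HDFS_HOME=${HADOOP_PREFIX}
export HADOOP_MAPRED_HOME=${HADOOP_PREFIX}
export YARN_HOME=${HADOOP_PREFIX}
export HADOOP_CONF_DIR=${HADOOP_PREFIX}/etc/hadoop
export YARN_CONF_DIR=${HADOOP_PREFIX}/etc/hadoop
export JAVA_HOME=/opt/module/jdk1.7.0_15
export PATH=$PATH:${HADOOP_PREFIX}/bin:${HADOOP_PREFIX}/sbin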