Posted to common-user@hadoop.apache.org by George Pang <p0...@gmail.com> on 2009/05/08 09:40:13 UTC

ClassNotFoundException

Dear users,
I got a "ClassNotFoundException" when running the WordCount example on Hadoop
using Eclipse.  Does anyone know where the problem is?

Thank you!

George

Re: ClassNotFoundException

Posted by jason hadoop <ja...@gmail.com>.
<rel> is short for the Hadoop version you are using: 0.18.x, 0.19.x, 0.20.x,
etc.
You must make all of the required jars available to all of your tasks. You
can either install them on all the tasktracker machines and set up the
tasktracker classpath to include them, or distribute them via the
distributed cache.

Chapter 5 of my book goes into this in some detail, and is available now as
a download: http://www.apress.com/book/view/9781430219422
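A sketch of the second option (jar names and paths here are illustrative, not from the thread): the generic -libjars option ships the listed jars with the job via the distributed cache, provided the driver parses its arguments with GenericOptionsParser (for example via ToolRunner).

```shell
# Option A: install the jars on every tasktracker and extend the
# classpath there (e.g. HADOOP_CLASSPATH in conf/hadoop-env.sh).
# Option B (shown): ship them per-job via the distributed cache.
# Hypothetical jar names; substitute your actual <rel> and paths.
hadoop jar wordcount.jar mapreduce.WordCount \
    -libjars /opt/lib/hadoop-<rel>-examples.jar,/opt/lib/support.jar \
    input output
```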

On Fri, May 8, 2009 at 4:22 PM, georgep <p0...@gmail.com> wrote:

>
> Sorry, I misspelled your name, Jason
>
> George
>
> georgep wrote:
> >
> > Hi Joe,
> >
> > Thank you for the reply, but do I need to add every supporting jar
> > file to the application path? What is the -<rel>-?
> >
> > George
> >
> >
> > jason hadoop wrote:
> >>
> >> 1) When running under Windows, include the Cygwin bin directory in your
> >> Windows PATH environment variable.
> >> 2) Eclipse is not so good at submitting supporting jar files; in your
> >> application launch path, add -libjars path/hadoop-<rel>-examples.jar.
> >>
> >>
> >> --
> >> Alpha Chapters of my book on Hadoop are available
> >> http://www.apress.com/book/view/9781430219422
> >> www.prohadoopbook.com a community for Hadoop Professionals
> >>
> >>
> >
> >
>
> --
> View this message in context:
> http://www.nabble.com/ClassNotFoundException-tp23441528p23455206.html
> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>
>


-- 
Alpha Chapters of my book on Hadoop are available
http://www.apress.com/book/view/9781430219422
www.prohadoopbook.com a community for Hadoop Professionals

Re: ClassNotFoundException

Posted by georgep <p0...@gmail.com>.
Sorry, I misspelled your name, Jason

George

georgep wrote:
> 
> Hi Joe,
> 
> Thank you for the reply, but do I need to add every supporting jar
> file to the application path? What is the -<rel>-?
> 
> George
> 
> 
> jason hadoop wrote:
>> 
>> 1) When running under Windows, include the Cygwin bin directory in your
>> Windows PATH environment variable.
>> 2) Eclipse is not so good at submitting supporting jar files; in your
>> application launch path, add -libjars path/hadoop-<rel>-examples.jar.
>> 
>> 
>> -- 
>> Alpha Chapters of my book on Hadoop are available
>> http://www.apress.com/book/view/9781430219422
>> www.prohadoopbook.com a community for Hadoop Professionals
>> 
>> 
> 
> 

-- 
View this message in context: http://www.nabble.com/ClassNotFoundException-tp23441528p23455206.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.


Re: ClassNotFoundException

Posted by georgep <p0...@gmail.com>.
Hi Joe,

Thank you for the reply, but do I need to add every supporting jar file
to the application path? What is the -<rel>-?

George


jason hadoop wrote:
> 
> 1) When running under Windows, include the Cygwin bin directory in your
> Windows PATH environment variable.
> 2) Eclipse is not so good at submitting supporting jar files; in your
> application launch path, add -libjars path/hadoop-<rel>-examples.jar.
> 
> 
> -- 
> Alpha Chapters of my book on Hadoop are available
> http://www.apress.com/book/view/9781430219422
> www.prohadoopbook.com a community for Hadoop Professionals
> 
> 

-- 
View this message in context: http://www.nabble.com/ClassNotFoundException-tp23441528p23455180.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.


Re: ClassNotFoundException

Posted by jason hadoop <ja...@gmail.com>.
1) When running under Windows, include the Cygwin bin directory in your
Windows PATH environment variable.
2) Eclipse is not so good at submitting supporting jar files; in your
application launch path, add -libjars path/hadoop-<rel>-examples.jar.
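One caveat, not spelled out above: -libjars is handled by GenericOptionsParser, so the job's main class must be run through ToolRunner for the option to take effect. A minimal sketch of such a driver, with a hypothetical class name and the mapper/reducer setup elided:

```java
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// ToolRunner strips generic options such as -libjars and -D
// before handing the remaining arguments to run().
public class WordCountDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        JobConf conf = new JobConf(getConf(), WordCountDriver.class);
        conf.setJobName("wordcount");
        // ... set mapper, reducer, and input/output paths as usual ...
        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new WordCountDriver(), args));
    }
}
```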

On Fri, May 8, 2009 at 10:13 AM, georgep <p0...@gmail.com> wrote:

>
> When run as a Java application, the trace:
>
> 09/05/08 10:08:49 WARN fs.FileSystem: uri=file:///
> javax.security.auth.login.LoginException: Login failed: Cannot run program
> "whoami": CreateProcess error=2, the system cannot find the file specified
>        at
>
> org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
>        at
>
> org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
>        at
>
> org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:257)
>        at
>
> org.apache.hadoop.security.UserGroupInformation.login(UserGroupInformation.java:67)
>        at
> org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1410)
>        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1348)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:213)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:118)
>        at
> org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:354)
>        at
>
> org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:377)
>        at mapreduce.WordCount.main(WordCount.java:44)
> 09/05/08 10:08:49 WARN fs.FileSystem: uri=file:///
> javax.security.auth.login.LoginException: Login failed: Cannot run program
> "whoami": CreateProcess error=2, the system cannot find the file specified
>        at
>
> org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
>        at
>
> org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
>        at
>
> org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:257)
>        at
>
> org.apache.hadoop.security.UserGroupInformation.login(UserGroupInformation.java:67)
>        at
> org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1410)
>        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1348)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:213)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:118)
>        at
> org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:311)
>        at org.apache.hadoop.mapred.JobClient.init(JobClient.java:390)
>        at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:361)
>        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1011)
>        at mapreduce.WordCount.main(WordCount.java:46)
> 09/05/08 10:08:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with
> processName=JobTracker, sessionId=
> Problem
> 09/05/08 10:08:50 WARN mapred.JobClient: Use GenericOptionsParser for
> parsing the arguments. Applications should implement Tool for the same.
> java.io.IOException: Failed to get the current user's information.
>        at
>
> org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:559)
>        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:729)
>        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1026)
>        at mapreduce.WordCount.main(WordCount.java:46)
> Caused by: javax.security.auth.login.LoginException: Login failed: Cannot
> run program "whoami": CreateProcess error=2, the system cannot find the file specified
>        at
>
> org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
>        at
>
> org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
>        at
>
> org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:557)
>        ... 3 more
>
> When run on the Hadoop server:
>
> Trace:
>
> Exception in thread "main" java.lang.ClassNotFoundException:
> mapreduce.test.WordCount
>        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
>        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
>        at java.lang.Class.forName0(Native Method)
>        at java.lang.Class.forName(Class.java:247)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
>
> Thanks!:-):-)
>
>
>
> georgep wrote:
> >
> > Trace:
> >
> > Exception in thread "main" java.lang.ClassNotFoundException:
> > mapreduce.test.WordCount
> >       at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> >       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >       at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> >       at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> >       at java.lang.Class.forName0(Native Method)
> >       at java.lang.Class.forName(Class.java:247)
> >       at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
> >
> > Code:
> > public class WordCount {
> >
> >     public static void main(String[] args) throws Exception {
> >         try {
> >             JobConf conf = new JobConf(WordCount.class);
> >             conf.setJobName("wordcount");
> >
> >             conf.setOutputKeyClass(Text.class);
> >             conf.setOutputValueClass(IntWritable.class);
> >
> >             conf.setMapperClass(Map.class);
> >             conf.setCombinerClass(Reduce.class);
> >             conf.setReducerClass(Reduce.class);
> >
> >             conf.setInputFormat(TextInputFormat.class);
> >             conf.setOutputFormat(TextOutputFormat.class);
> >
> >             FileInputFormat.setInputPaths(conf, new Path("input"));
> >             FileOutputFormat.setOutputPath(conf, new Path("output"));
> >
> >             JobClient.runJob(conf);
> >         } catch (Exception t) {
> >             t.printStackTrace();
> >             System.out.println("Problem");
> >         }
> >     }
> > }
> >
> >
> > Thank you!
> >
> >
> >
> > TimRobertson100 wrote:
> >>
> >> Can you post the entire error trace please?
> >>
> >> On Fri, May 8, 2009 at 9:40 AM, George Pang <p0...@gmail.com> wrote:
> >>> Dear users,
> >>> I got a "ClassNotFoundException" when running the WordCount example on
> >>> Hadoop using Eclipse.  Does anyone know where the problem is?
> >>>
> >>> Thank you!
> >>>
> >>> George
> >>>
> >>
> >>
> >
> >
>
> --
> View this message in context:
> http://www.nabble.com/ClassNotFoundException-tp23441528p23449910.html
> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>
>


-- 
Alpha Chapters of my book on Hadoop are available
http://www.apress.com/book/view/9781430219422
www.prohadoopbook.com a community for Hadoop Professionals

Re: ClassNotFoundException

Posted by georgep <p0...@gmail.com>.
When run as a Java application, the trace:

09/05/08 10:08:49 WARN fs.FileSystem: uri=file:///
javax.security.auth.login.LoginException: Login failed: Cannot run program
"whoami": CreateProcess error=2, the system cannot find the file specified
	at
org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
	at
org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
	at
org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:257)
	at
org.apache.hadoop.security.UserGroupInformation.login(UserGroupInformation.java:67)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1410)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1348)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:213)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:118)
	at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:354)
	at
org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:377)
	at mapreduce.WordCount.main(WordCount.java:44)
09/05/08 10:08:49 WARN fs.FileSystem: uri=file:///
javax.security.auth.login.LoginException: Login failed: Cannot run program
"whoami": CreateProcess error=2, the system cannot find the file specified
	at
org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
	at
org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
	at
org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:257)
	at
org.apache.hadoop.security.UserGroupInformation.login(UserGroupInformation.java:67)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1410)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1348)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:213)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:118)
	at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:311)
	at org.apache.hadoop.mapred.JobClient.init(JobClient.java:390)
	at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:361)
	at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1011)
	at mapreduce.WordCount.main(WordCount.java:46)
09/05/08 10:08:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with
processName=JobTracker, sessionId=
Problem
09/05/08 10:08:50 WARN mapred.JobClient: Use GenericOptionsParser for
parsing the arguments. Applications should implement Tool for the same.
java.io.IOException: Failed to get the current user's information.
	at
org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:559)
	at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:729)
	at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1026)
	at mapreduce.WordCount.main(WordCount.java:46)
Caused by: javax.security.auth.login.LoginException: Login failed: Cannot
run program "whoami": CreateProcess error=2, the system cannot find the file specified
	at
org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
	at
org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
	at
org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:557)
	... 3 more

When run on the Hadoop server:

Trace:

Exception in thread "main" java.lang.ClassNotFoundException:
mapreduce.test.WordCount
	at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
	at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:247)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:148)

Thanks!:-):-)



georgep wrote:
> 
> Trace:
> 
> Exception in thread "main" java.lang.ClassNotFoundException:
> mapreduce.test.WordCount
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> 	at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:247)
> 	at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
> 
> Code:
> public class WordCount {
>
>     public static void main(String[] args) throws Exception {
>         try {
>             JobConf conf = new JobConf(WordCount.class);
>             conf.setJobName("wordcount");
>
>             conf.setOutputKeyClass(Text.class);
>             conf.setOutputValueClass(IntWritable.class);
>
>             conf.setMapperClass(Map.class);
>             conf.setCombinerClass(Reduce.class);
>             conf.setReducerClass(Reduce.class);
>
>             conf.setInputFormat(TextInputFormat.class);
>             conf.setOutputFormat(TextOutputFormat.class);
>
>             FileInputFormat.setInputPaths(conf, new Path("input"));
>             FileOutputFormat.setOutputPath(conf, new Path("output"));
>
>             JobClient.runJob(conf);
>         } catch (Exception t) {
>             t.printStackTrace();
>             System.out.println("Problem");
>         }
>     }
> }
> 
> 
> Thank you!
> 
> 
> 
> TimRobertson100 wrote:
>> 
>> Can you post the entire error trace please?
>> 
>> On Fri, May 8, 2009 at 9:40 AM, George Pang <p0...@gmail.com> wrote:
>>> Dear users,
>>> I got a "ClassNotFoundException" when running the WordCount example on
>>> Hadoop using Eclipse.  Does anyone know where the problem is?
>>>
>>> Thank you!
>>>
>>> George
>>>
>> 
>> 
> 
> 

-- 
View this message in context: http://www.nabble.com/ClassNotFoundException-tp23441528p23449910.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.


Re: ClassNotFoundException

Posted by georgep <p0...@gmail.com>.
Trace:

Exception in thread "main" java.lang.ClassNotFoundException:
mapreduce.test.WordCount
	at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
	at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:247)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:148)

Code:
public class WordCount {

    public static void main(String[] args) throws Exception {
        try {
            JobConf conf = new JobConf(WordCount.class);
            conf.setJobName("wordcount");

            conf.setOutputKeyClass(Text.class);
            conf.setOutputValueClass(IntWritable.class);

            conf.setMapperClass(Map.class);
            conf.setCombinerClass(Reduce.class);
            conf.setReducerClass(Reduce.class);

            conf.setInputFormat(TextInputFormat.class);
            conf.setOutputFormat(TextOutputFormat.class);

            FileInputFormat.setInputPaths(conf, new Path("input"));
            FileOutputFormat.setOutputPath(conf, new Path("output"));

            JobClient.runJob(conf);
        } catch (Exception t) {
            t.printStackTrace();
            System.out.println("Problem");
        }
    }
}


Thank you!
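One thing worth checking against the trace above: the loader is asked for mapreduce.test.WordCount, so the fully qualified name passed on the command line must match the package declared in WordCount.java and the directory layout inside the jar. A quick way to verify (jar name illustrative):

```shell
# List the jar contents to see the class's actual package path,
# then launch with the matching fully qualified name.
jar tf wordcount.jar | grep WordCount
hadoop jar wordcount.jar mapreduce.WordCount input output
```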



TimRobertson100 wrote:
> 
> Can you post the entire error trace please?
> 
> On Fri, May 8, 2009 at 9:40 AM, George Pang <p0...@gmail.com> wrote:
>> Dear users,
>> I got a "ClassNotFoundException" when running the WordCount example on
>> Hadoop using Eclipse.  Does anyone know where the problem is?
>>
>> Thank you!
>>
>> George
>>
> 
> 

-- 
View this message in context: http://www.nabble.com/ClassNotFoundException-tp23441528p23449216.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.


Re: ClassNotFoundException

Posted by tim robertson <ti...@gmail.com>.
Can you post the entire error trace please?

On Fri, May 8, 2009 at 9:40 AM, George Pang <p0...@gmail.com> wrote:
> Dear users,
> I got a "ClassNotFoundException" when running the WordCount example on
> Hadoop using Eclipse.  Does anyone know where the problem is?
>
> Thank you!
>
> George
>