Posted to common-user@hadoop.apache.org by Hemanth Yamijala <yh...@thoughtworks.com> on 2013/02/01 05:12:26 UTC

Re: Issue with running hadoop program using eclipse

Previously, I have resolved this error by building a jar and then using the
API job.setJarByClass(<driver-class>.class). Can you please try that once?
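
For reference, here is a rough driver sketch along those lines. It is untested,
and the nested class names, key/value types and the mapper/reducer bodies are
only placeholders, since I have not seen your actual WordCount code:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Placeholder mapper: splits each line on whitespace and emits (word, 1).
        public static class MapClass
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) {
                        word.set(token);
                        context.write(word, ONE);
                    }
                }
            }
        }

        // Placeholder reducer: sums the counts for each word.
        public static class ReduceClass
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values,
                    Context context) throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "word count");
            // The important bit: tells the framework which jar to ship to the
            // cluster, by locating the jar that contains the driver class.
            job.setJarByClass(WordCount.class);
            job.setMapperClass(MapClass.class);
            job.setReducerClass(ReduceClass.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

As far as I understand, setJarByClass can only locate a jar if the class was
actually loaded from one, which is why building the jar first (rather than
running the classes straight out of the Eclipse bin/ directory) matters here.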


On Thu, Jan 31, 2013 at 6:40 PM, Vikas Jadhav <vi...@gmail.com> wrote:

> Hi, I know it is a class not found error,
> but I have the Map and Reduce classes as part of the Driver class,
> so what is the problem?
>
> I want to ask whether it is compulsory to have a jar for any Hadoop program,
> because it says "No job jar file set. See JobConf(Class) or
> JobConf#setJar(String)."
>
> I am running my program using Eclipse (Windows machine) against Hadoop (Linux
> machine) remotely.
>
>
>
> On Thu, Jan 31, 2013 at 12:40 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>> Hello Vikas,
>>
>> It clearly shows that the class cannot be found. For debugging, you can
>> write your MR job as a standalone Java program and debug it; that works.
>> And if you want to debug just your mapper/reducer logic, you should look
>> into using MRUnit. There is a good write-up
>> <http://blog.cloudera.com/blog/2009/07/debugging-mapreduce-programs-with-mrunit/>
>> on Cloudera's blog that discusses it in detail.
>>
>> HTH
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Thu, Jan 31, 2013 at 11:56 AM, Vikas Jadhav <vi...@gmail.com> wrote:
>>
>>> Hi,
>>> I have one Windows machine and one Linux machine.
>>> My Eclipse is on the Windows machine,
>>> and Hadoop is running on a single Linux machine.
>>> I am trying to run the wordcount program from Eclipse (on the Windows machine)
>>> against Hadoop (on the Linux machine).
>>> I am getting the following error:
>>>
>>> 13/01/31 11:48:14 WARN mapred.JobClient: Use GenericOptionsParser for
>>> parsing the arguments. Applications should implement Tool for the same.
>>> 13/01/31 11:48:15 WARN mapred.JobClient: No job jar file set.  User
>>> classes may not be found. See JobConf(Class) or JobConf#setJar(String).
>>> 13/01/31 11:48:16 INFO input.FileInputFormat: Total input paths to
>>> process : 1
>>> 13/01/31 11:48:16 WARN util.NativeCodeLoader: Unable to load
>>> native-hadoop library for your platform... using builtin-java classes where
>>> applicable
>>> 13/01/31 11:48:16 WARN snappy.LoadSnappy: Snappy native library not
>>> loaded
>>> 13/01/31 11:48:26 INFO mapred.JobClient: Running job:
>>> job_201301300613_0029
>>> 13/01/31 11:48:27 INFO mapred.JobClient:  map 0% reduce 0%
>>> 13/01/31 11:48:40 INFO mapred.JobClient: Task Id :
>>> attempt_201301300613_0029_m_000000_0, Status : FAILED
>>> java.lang.RuntimeException: java.lang.ClassNotFoundException:
>>> WordCount$MapClass
>>>  at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:867)
>>>  at
>>> org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:199)
>>>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:719)
>>>  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>  at javax.security.auth.Subject.doAs(Subject.java:396)
>>>  at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>> Caused by: java.lang.ClassNotFoundException: WordCount$MapClass
>>>  at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>  at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>  at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>  at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>  at java.lang.Class.forName0(Native Method)
>>>  at java.lang.Class.forName(Class.java:247)
>>>  at
>>> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
>>>  at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:865)
>>>  ... 8 more
>>>
>>>
>>>
>>>
>>> Also, I want to know how to debug a Hadoop program using Eclipse.
>>>
>>>
>>>
>>>
>>> Thank You.
>>>
>>>
>>
>>
>
>
> --
> Thanx and Regards
> Vikas Jadhav
>

Re: Issue with running hadoop program using eclipse

Posted by Mohammad Tariq <do...@gmail.com>.
Hello Vikas,

     Sorry for the late response. You don't have to create the jar separately.
If you have added "job.setJarByClass" as specified by Hemanth sir, it
should work.
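
If it still logs "No job jar file set" when launched from Eclipse, one fallback
is to name the job jar explicitly. This is an untested fragment for the start of
the driver's main(), and the jar path is only a placeholder for a jar you export
from Eclipse:

    // Untested fragment for the start of the driver's main().
    Configuration conf = new Configuration();
    // "C:/work/wordcount.jar" is a placeholder path to a jar exported from
    // Eclipse. Setting "mapred.jar" is what JobConf#setJar(String), mentioned
    // in the warning, writes under the hood.
    conf.set("mapred.jar", "C:/work/wordcount.jar");
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class);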

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Fri, Feb 1, 2013 at 9:42 AM, Hemanth Yamijala
<yh...@thoughtworks.com> wrote:

> Previously, I have resolved this error by building a jar and then using
> the API job.setJarByClass(<driver-class>.class). Can you please try that
> once?
>
>
>> On Thu, Jan 31, 2013 at 6:40 PM, Vikas Jadhav <vi...@gmail.com> wrote:
>
>> Hi, I know it is a class not found error,
>> but I have the Map and Reduce classes as part of the Driver class,
>> so what is the problem?
>>
>> I want to ask whether it is compulsory to have a jar for any Hadoop program,
>> because it says "No job jar file set. See JobConf(Class) or
>> JobConf#setJar(String)."
>>
>> I am running my program using Eclipse (Windows machine) against Hadoop (Linux
>> machine) remotely.
>>
>>
>>
>> On Thu, Jan 31, 2013 at 12:40 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>>> Hello Vikas,
>>>
>>> It clearly shows that the class cannot be found. For debugging, you can
>>> write your MR job as a standalone Java program and debug it; that works.
>>> And if you want to debug just your mapper/reducer logic, you should look
>>> into using MRUnit. There is a good write-up
>>> <http://blog.cloudera.com/blog/2009/07/debugging-mapreduce-programs-with-mrunit/>
>>> on Cloudera's blog that discusses it in detail.
>>>
>>> HTH
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Thu, Jan 31, 2013 at 11:56 AM, Vikas Jadhav <vikascjadhav87@gmail.com> wrote:
>>>
>>>> Hi,
>>>> I have one Windows machine and one Linux machine.
>>>> My Eclipse is on the Windows machine,
>>>> and Hadoop is running on a single Linux machine.
>>>> I am trying to run the wordcount program from Eclipse (on the Windows machine)
>>>> against Hadoop (on the Linux machine).
>>>> I am getting the following error:
>>>>
>>>> 13/01/31 11:48:14 WARN mapred.JobClient: Use GenericOptionsParser for
>>>> parsing the arguments. Applications should implement Tool for the same.
>>>> 13/01/31 11:48:15 WARN mapred.JobClient: No job jar file set.  User
>>>> classes may not be found. See JobConf(Class) or JobConf#setJar(String).
>>>> 13/01/31 11:48:16 INFO input.FileInputFormat: Total input paths to
>>>> process : 1
>>>> 13/01/31 11:48:16 WARN util.NativeCodeLoader: Unable to load
>>>> native-hadoop library for your platform... using builtin-java classes where
>>>> applicable
>>>> 13/01/31 11:48:16 WARN snappy.LoadSnappy: Snappy native library not
>>>> loaded
>>>> 13/01/31 11:48:26 INFO mapred.JobClient: Running job:
>>>> job_201301300613_0029
>>>> 13/01/31 11:48:27 INFO mapred.JobClient:  map 0% reduce 0%
>>>> 13/01/31 11:48:40 INFO mapred.JobClient: Task Id :
>>>> attempt_201301300613_0029_m_000000_0, Status : FAILED
>>>> java.lang.RuntimeException: java.lang.ClassNotFoundException:
>>>> WordCount$MapClass
>>>>  at
>>>> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:867)
>>>>  at
>>>> org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:199)
>>>>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:719)
>>>>  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>  at javax.security.auth.Subject.doAs(Subject.java:396)
>>>>  at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>>> Caused by: java.lang.ClassNotFoundException: WordCount$MapClass
>>>>  at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>  at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>  at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>  at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>  at java.lang.Class.forName0(Native Method)
>>>>  at java.lang.Class.forName(Class.java:247)
>>>>  at
>>>> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
>>>>  at
>>>> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:865)
>>>>  ... 8 more
>>>>
>>>>
>>>>
>>>>
>>>> Also, I want to know how to debug a Hadoop program using Eclipse.
>>>>
>>>>
>>>>
>>>>
>>>> Thank You.
>>>>
>>>>
>>>
>>>
>>
>>
>> --
>> Thanx and Regards
>> Vikas Jadhav
>>
>
>
