Posted to common-dev@hadoop.apache.org by Ranjan Banerjee <rb...@wisc.edu> on 2012/04/17 02:56:32 UTC
Warning in running mapreduce jobs
Hello,
I am running the default MapReduce job (word count) in Hadoop. The job runs fine and I am able to delve into the statistics. However, I get the following warning:
"No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String)."
I also set the following parameter in core-site.xml:
<property>
  <name>mapred.map.tasks</name>
  <value>2</value>
  <description>The default number of map tasks per job. Typically set
  to a prime several times greater than number of available hosts.
  Ignored when mapred.job.tracker is "local".
  </description>
</property>
However, I still get only one map task, not two. Can someone suggest a solution to this problem?
Thanking you
Yours faithfully
Ranjan Banerjee
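For context on the property above: mapred.* settings are normally read from mapred-site.xml rather than core-site.xml, and mapred.map.tasks is only a hint; the actual number of map tasks follows the number of input splits the InputFormat produces, and, as the description itself notes, the value is ignored when mapred.job.tracker is "local". A minimal fragment in the standard location might look like:

```xml
<!-- mapred-site.xml (not core-site.xml) -->
<property>
  <name>mapred.map.tasks</name>
  <value>2</value>
  <!-- Only a hint: the real map count follows the input splits,
       and the value is ignored when mapred.job.tracker is "local". -->
</property>
```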
Re: Warning in running mapreduce jobs
Posted by Ranjan Banerjee <rb...@wisc.edu>.
My input is a common text file taken off the web.
Regards
Ranjan
On 04/16/12, Niels Basjes wrote:
> What does your input look like?
Re: Warning in running mapreduce jobs
Posted by Niels Basjes <ni...@basj.es>.
What does your input look like?
--
Kind regards,
Niels Basjes
(Sent from mobile)
On 17 Apr 2012 02:57, "Ranjan Banerjee" <rb...@wisc.edu> wrote:
Re: Warning in running mapreduce jobs
Posted by Harsh J <ha...@cloudera.com>.
Ensure you have a Job/JobConf#setJar (or setJarByClass) call in your
driver before launching the job.
On Tue, Apr 17, 2012 at 6:26 AM, Ranjan Banerjee <rb...@wisc.edu> wrote:
--
Harsh J
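Harsh's suggestion can be sketched as a minimal driver using the old JobConf API that the warning refers to. Class and path names here are illustrative, and the mapper/reducer wiring is elided; the point is that passing the driver class to the JobConf constructor (or calling setJarByClass) lets Hadoop locate the job jar, which silences the "No job jar file set" warning:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class WordCountDriver {
  public static void main(String[] args) throws Exception {
    // Passing the driver class tells Hadoop which jar to ship to the
    // cluster; equivalently, call conf.setJarByClass(WordCountDriver.class)
    // or conf.setJar("path/to/wordcount.jar").
    JobConf conf = new JobConf(WordCountDriver.class);
    conf.setJobName("wordcount");

    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(IntWritable.class);
    // conf.setMapperClass(...) / conf.setReducerClass(...) omitted here.

    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    JobClient.runJob(conf);
  }
}
```

Submitted as, for example, `hadoop jar wordcount.jar WordCountDriver input output`, the job jar is then set and user classes resolve on the task nodes.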