Posted to common-user@hadoop.apache.org by Steve Lewis <lo...@gmail.com> on 2010/10/29 20:10:56 UTC

Is there any way to change the number of reduce tasks on a local hadoop job

I frequently run small versions of my Hadoop jobs as local, single-machine
jobs to debug my logic.
When I do this I always get only a single reduce task, even when I call
setNumReduceTasks with, say, 10.

I want to debug some issues with the number of reducers, and I wonder if
there is a way to force local Hadoop jobs to emulate multiple reduce tasks.
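One way to sanity-check partitioning logic without multiple real reducers is to replicate the default partitioner by hand. A minimal sketch, assuming the default HashPartitioner (which computes (hash & Integer.MAX_VALUE) % numReduceTasks); PartitionSketch and partitionFor are illustrative names, and String.hashCode stands in for the actual key type's hash:

```java
// Sketch: replicate Hadoop's default HashPartitioner assignment locally,
// so you can see which reduce partition each key *would* land in, even
// when the local runner collapses everything into one reduce task.
// (PartitionSketch/partitionFor are illustrative names, not Hadoop API.)
public class PartitionSketch {

    // Mirrors HashPartitioner.getPartition: mask off the sign bit,
    // then take the remainder modulo the configured reducer count.
    static int partitionFor(String key, int numReduceTasks) {
        return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
    }

    public static void main(String[] args) {
        int numReduceTasks = 10; // the value passed to setNumReduceTasks
        String[] keys = {"alpha", "beta", "gamma", "delta"};
        for (String key : keys) {
            System.out.println(key + " -> reducer "
                    + partitionFor(key, numReduceTasks));
        }
    }
}
```

Printing these assignments for a sample of keys shows whether the keys spread evenly across reducers or pile up in one partition, which is often the bug being chased.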

-- 
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA

Re: Is there any way to change the number of reduce tasks on a local hadoop job

Posted by "Gangl, Michael E (388K)" <Mi...@jpl.nasa.gov>.
I believe you can only do this if you run in pseudo-distributed mode:

http://hadoop.apache.org/common/docs/r0.20.0/quickstart.html#PseudoDistributed

-Mike
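A sketch of the relevant setting from that quickstart, assuming the 0.20-era property names (host/port are the quickstart defaults and may need adjusting):

```xml
<?xml version="1.0"?>
<!-- conf/mapred-site.xml: sketch of a pseudo-distributed setup
     (property names as of Hadoop 0.20; values are quickstart defaults) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <!-- default number of reduce tasks per job; a driver's call to
         setNumReduceTasks overrides this -->
    <name>mapred.reduce.tasks</name>
    <value>10</value>
  </property>
</configuration>
```

With the daemons running (and core-site.xml pointing fs.default.name at the local HDFS, per the quickstart), jobs go through the JobTracker rather than the local runner, so setNumReduceTasks should no longer be collapsed to a single reduce.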

