Posted to common-user@hadoop.apache.org by Farhan Husain <fa...@csebuet.org> on 2010/01/04 18:52:47 UTC

Cannot pass dynamic values by Configuration.Set()

Hello all,

I am using hadoop-0.20.1. I need to know the input file name in my map
processes and pass an integer and a string to my reducer processes. I used
the following method calls for that:

config.set("tag1", "string_value");
config.setInt("tag2", int_value);

In the setup() method of the mapper:
String filename = context.getConfiguration().get("map.input.file");   // returns null

In the setup() method of the reducer:
String val = context.getConfiguration().get("tag1");                  // returns null
int n = context.getConfiguration().getInt("tag2", def_val);           // returns def_val
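
For context, the set() calls above are made in my driver, which looks roughly like this (class names are placeholders, not my exact code):

Configuration conf = new Configuration();
Job job = new Job(conf, "my job");                 // the Job is created first
job.setJarByClass(MyDriver.class);                 // MyDriver is a placeholder
// ... mapper/reducer classes, input/output paths ...
conf.set("tag1", "string_value");                  // set() is called after the Job exists
conf.setInt("tag2", int_value);
job.waitForCompletion(true);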

Can anyone please tell me what may be wrong with this code or anything
related to it? Is it a bug in this version of Hadoop? Is there any
alternative way to accomplish the same objective? I have been stuck on this
problem for about a week, and I would appreciate it if someone could shed
some light on it.

Thanks,
Farhan

Re: Cannot pass dynamic values by Configuration.Set()

Posted by Farhan Husain <fa...@csebuet.org>.
Thanks Steve. I was able to solve the problem by moving the set() calls before
the job creation, as Amogh suggested. However, I will also try your solution.
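
For reference, this is roughly the ordering that works for me (placeholder names again; the Job takes a copy of the Configuration when it is constructed, so values set on conf afterwards never reach the tasks):

Configuration conf = new Configuration();
conf.set("tag1", "string_value");                  // set everything before creating the Job
conf.setInt("tag2", 42);                           // 42 stands in for int_value

Job job = new Job(conf, "my job");                 // placeholder job name
job.setJarByClass(MyDriver.class);                 // placeholder driver class
// ... mapper/reducer classes, input/output paths ...
job.waitForCompletion(true);

// In the reducer's setup(), context.getConfiguration().get("tag1") and
// getInt("tag2", def_val) now return the values set above.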

Re: Cannot pass dynamic values by Configuration.Set()

Posted by Steve Kuo <ku...@gmail.com>.
There seems to have been a change between the 0.19 and 0.20 APIs in that 0.20
no longer sets "map.input.file". As far as I can tell, config.set() should
work. However, I use the following to pass the parameters:

String[] params = new String[] { "-D", "tag1=string_value", ... };

ToolRunner.run(new Configuration(), new SomeJob(), params);   // SomeJob implements Tool
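
To be concrete, the wiring I mean looks roughly like this (SomeJob is a placeholder class; ToolRunner.run() feeds the -D options through GenericOptionsParser, so they end up in the Configuration returned by getConf()):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class SomeJob extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() already contains tag1/tag2 at this point, because
        // ToolRunner parsed the -D arguments before calling run().
        Job job = new Job(getConf(), "some job");
        job.setJarByClass(SomeJob.class);
        // ... mapper/reducer classes, input/output paths ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new SomeJob(), args));
    }
}

As for the input file name: since the new API no longer sets "map.input.file", one alternative (assuming your input format hands out FileSplits, as FileInputFormat does) is to read it off the split in the mapper's setup():

import org.apache.hadoop.mapreduce.lib.input.FileSplit;

@Override
protected void setup(Context context) {
    // Works only when the InputSplit really is a FileSplit.
    String filename = ((FileSplit) context.getInputSplit()).getPath().getName();
}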

