Posted to common-user@hadoop.apache.org by Abhinav M Kulkarni <ab...@gmail.com> on 2013/04/02 05:55:43 UTC

Provide context to map function

Hi,

I have the following scenario:

  * Two mappers (acting on two different files) and one reducer
  * The mapper code for the two files is the same, except for a minor
    change that depends on which file is being read
  * Essentially, assume there is an if statement: if the first file is
    being read, do one thing; otherwise, do the other

So how do I provide this context to the map function, i.e. the file name
or, say, a boolean flag indicating which file is being read?

Thanks,
Abhinav

Re: Provide context to map function

Posted by Abhinav M Kulkarni <ab...@gmail.com>.
Thanks all who replied.

I was accidentally using the old API, hence I could not find the Context
argument to the map function.

This is solved.
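
For reference, here is a minimal sketch of the two mapper signatures (the
key/value types are arbitrary). The old org.apache.hadoop.mapred API has no
Context argument, while the new org.apache.hadoop.mapreduce API passes one
in:

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

// Old API: output goes through OutputCollector, progress through Reporter.
class OldApiMapper extends MapReduceBase
    implements org.apache.hadoop.mapred.Mapper<LongWritable, Text, LongWritable, Text> {
  public void map(LongWritable key, Text value,
                  OutputCollector<LongWritable, Text> output, Reporter reporter)
      throws IOException {
    output.collect(key, value);
  }
}

// New API: a single Context object replaces OutputCollector and Reporter,
// and also exposes the input split and the job configuration.
class NewApiMapper
    extends org.apache.hadoop.mapreduce.Mapper<LongWritable, Text, LongWritable, Text> {
  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    context.write(key, value);
  }
}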

On 04/02/2013 01:20 AM, Dino Kečo wrote:
>
> You should check multiple input format class which enables you to have 
> more input formats for same mapper.
>
> Regards,
> Dino
>
> On Apr 2, 2013 9:49 AM, "Yanbo Liang" <yanbohappy@gmail.com 
> <ma...@gmail.com>> wrote:
>
>     protected void map(KEYIN key, VALUEIN value,
>                          Context context) throws IOException,
>     InterruptedException {
>         context.write((KEYOUT) key, (VALUEOUT) value);
>       }
>
>     Context is a parameter that the execute environment will pass to
>     the map() function.
>     You can just use it in the map() function.
>
>
>     2013/4/2 Abhinav M Kulkarni <abhinavkulkarni@gmail.com
>     <ma...@gmail.com>>
>
>         To be precise, I am using Hadoop 1.0.4.
>
>         There is no local variable or argument named context in the
>         map function.
>
>         Thanks,
>         Abhinav
>
>
>         On 04/01/2013 09:06 PM, Azuryy Yu wrote:
>>         I supposed your input splits are FileSplit, if not, you need to:
>>
>>         InputSplit split = context.getInputSplit();
>>
>>         if (split instanceof FileSplit){
>>           Path path = ((FileSplit)split).getPath();
>>         }
>>
>>
>>
>>
>>         On Tue, Apr 2, 2013 at 12:02 PM, Azuryy Yu
>>         <azuryyyu@gmail.com <ma...@gmail.com>> wrote:
>>
>>             In your map function add following:
>>
>>             Path currentInput =
>>             ((FileSplit)context.getInputSplit()).getPath();
>>
>>             then:
>>
>>             if (currentInput is first ){
>>             ................
>>             }
>>             else{
>>             ..................
>>             }
>>
>>
>>
>>
>>             On Tue, Apr 2, 2013 at 11:55 AM, Abhinav M Kulkarni
>>             <abhinavkulkarni@gmail.com
>>             <ma...@gmail.com>> wrote:
>>
>>                 Hi,
>>
>>                 I have a following scenario:
>>
>>                   * Two mappers (acting on two different files) and
>>                     one reducer
>>                   * The mapper code for two different files is the
>>                     same, except for minor change which depends on
>>                     which file is being read
>>                   * Essentially assume there is an if statement - if
>>                     first file is being read do this else do this
>>
>>                 So how do I provide this context to map function i.e.
>>                 file name or say a boolean flag variable indicating
>>                 the file being read?
>>
>>                 Thanks,
>>                 Abhinav
>>
>>
>>
>
>


Re: Provide context to map function

Posted by Dino Kečo <di...@gmail.com>.
You should check the MultipleInputs class, which enables you to have more
than one input format for the same mapper.

Regards,
Dino
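
For illustration, a minimal driver sketch using MultipleInputs; the path
arguments and the FirstFileMapper/SecondFileMapper/SumReducer classes are
hypothetical, and the new-API class shown here
(org.apache.hadoop.mapreduce.lib.input.MultipleInputs) may not ship with
every 1.x release, in which case the org.apache.hadoop.mapred.lib variant
is the one to use:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class TwoInputDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "two-input-job");
    job.setJarByClass(TwoInputDriver.class);

    // Each input path gets its own mapper class, so there is no need to
    // branch on the file name inside a single mapper.
    MultipleInputs.addInputPath(job, new Path(args[0]),
        TextInputFormat.class, FirstFileMapper.class);
    MultipleInputs.addInputPath(job, new Path(args[1]),
        TextInputFormat.class, SecondFileMapper.class);

    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileOutputFormat.setOutputPath(job, new Path(args[2]));

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}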
On Apr 2, 2013 9:49 AM, "Yanbo Liang" <ya...@gmail.com> wrote:

> protected void map(KEYIN key, VALUEIN value,
>                      Context context) throws IOException,
> InterruptedException {
>     context.write((KEYOUT) key, (VALUEOUT) value);
>   }
>
> Context is a parameter that the execute environment will pass to the map()
> function.
> You can just use it in the map() function.
>
>
> 2013/4/2 Abhinav M Kulkarni <ab...@gmail.com>
>
>>  To be precise, I am using Hadoop 1.0.4.
>>
>> There is no local variable or argument named context in the map function.
>>
>> Thanks,
>> Abhinav
>>
>>
>> On 04/01/2013 09:06 PM, Azuryy Yu wrote:
>>
>>   I supposed your input splits are FileSplit, if not, you need to:
>>
>>  InputSplit split = context.getInputSplit();
>>
>>  if (split instanceof FileSplit){
>>    Path path = ((FileSplit)split).getPath();
>> }
>>
>>
>>
>>
>> On Tue, Apr 2, 2013 at 12:02 PM, Azuryy Yu <az...@gmail.com> wrote:
>>
>>>  In your map function add following:
>>>
>>> Path currentInput = ((FileSplit)context.getInputSplit()).getPath();
>>>
>>>  then:
>>>
>>>  if (currentInput is first ){
>>> ................
>>> }
>>>  else{
>>> ..................
>>> }
>>>
>>>
>>>
>>>
>>> On Tue, Apr 2, 2013 at 11:55 AM, Abhinav M Kulkarni <
>>> abhinavkulkarni@gmail.com> wrote:
>>>
>>>>  Hi,
>>>>
>>>> I have a following scenario:
>>>>
>>>>
>>>>    - Two mappers (acting on two different files) and one reducer
>>>>    - The mapper code for two different files is the same, except for
>>>>    minor change which depends on which file is being read
>>>>    - Essentially assume there is an if statement - if first file is
>>>>    being read do this else do this
>>>>
>>>> So how do I provide this context to map function i.e. file name or say
>>>> a boolean flag variable indicating the file being read?
>>>>
>>>> Thanks,
>>>> Abhinav
>>>>
>>>
>>>
>>
>>
>

Re: Provide context to map function

Posted by Yanbo Liang <ya...@gmail.com>.
protected void map(KEYIN key, VALUEIN value, Context context)
    throws IOException, InterruptedException {
  context.write((KEYOUT) key, (VALUEOUT) value);
}

Context is a parameter that the execution environment passes to the map()
function; the snippet above is the default identity implementation from
org.apache.hadoop.mapreduce.Mapper. You can just use it inside your own
map() method.
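
A complete (if trivial) sketch of such a mapper; the "token.separator"
property name is made up for the example. The framework builds the Context
and hands it to map(), so the body can read the job configuration and emit
output through it:

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

  private static final IntWritable ONE = new IntWritable(1);

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // Context gives access to the configuration, counters, and output.
    String separator = context.getConfiguration().get("token.separator", "\\s+");
    for (String token : value.toString().split(separator)) {
      context.write(new Text(token), ONE);
    }
  }
}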


2013/4/2 Abhinav M Kulkarni <ab...@gmail.com>

>  To be precise, I am using Hadoop 1.0.4.
>
> There is no local variable or argument named context in the map function.
>
> Thanks,
> Abhinav
>
>
> On 04/01/2013 09:06 PM, Azuryy Yu wrote:
>
>   I supposed your input splits are FileSplit, if not, you need to:
>
>  InputSplit split = context.getInputSplit();
>
>  if (split instanceof FileSplit){
>    Path path = ((FileSplit)split).getPath();
> }
>
>
>
>
> On Tue, Apr 2, 2013 at 12:02 PM, Azuryy Yu <az...@gmail.com> wrote:
>
>>  In your map function add following:
>>
>> Path currentInput = ((FileSplit)context.getInputSplit()).getPath();
>>
>>  then:
>>
>>  if (currentInput is first ){
>> ................
>> }
>>  else{
>> ..................
>> }
>>
>>
>>
>>
>> On Tue, Apr 2, 2013 at 11:55 AM, Abhinav M Kulkarni <
>> abhinavkulkarni@gmail.com> wrote:
>>
>>>  Hi,
>>>
>>> I have a following scenario:
>>>
>>>
>>>    - Two mappers (acting on two different files) and one reducer
>>>    - The mapper code for two different files is the same, except for
>>>    minor change which depends on which file is being read
>>>    - Essentially assume there is an if statement - if first file is
>>>    being read do this else do this
>>>
>>> So how do I provide this context to map function i.e. file name or say a
>>> boolean flag variable indicating the file being read?
>>>
>>> Thanks,
>>> Abhinav
>>>
>>
>>
>
>

Re: Provide context to map function

Posted by Abhinav M Kulkarni <ab...@gmail.com>.
To be precise, I am using Hadoop 1.0.4.

There is no local variable or argument named context in the map function.

Thanks,
Abhinav

On 04/01/2013 09:06 PM, Azuryy Yu wrote:
> I supposed your input splits are FileSplit, if not, you need to:
>
> InputSplit split = context.getInputSplit();
>
> if (split instanceof FileSplit){
>   Path path = ((FileSplit)split).getPath();
> }
>
>
>
>
> On Tue, Apr 2, 2013 at 12:02 PM, Azuryy Yu <azuryyyu@gmail.com 
> <ma...@gmail.com>> wrote:
>
>     In your map function add following:
>
>     Path currentInput = ((FileSplit)context.getInputSplit()).getPath();
>
>     then:
>
>     if (currentInput is first ){
>     ................
>     }
>     else{
>     ..................
>     }
>
>
>
>
>     On Tue, Apr 2, 2013 at 11:55 AM, Abhinav M Kulkarni
>     <abhinavkulkarni@gmail.com <ma...@gmail.com>> wrote:
>
>         Hi,
>
>         I have a following scenario:
>
>           * Two mappers (acting on two different files) and one reducer
>           * The mapper code for two different files is the same,
>             except for minor change which depends on which file is
>             being read
>           * Essentially assume there is an if statement - if first
>             file is being read do this else do this
>
>         So how do I provide this context to map function i.e. file
>         name or say a boolean flag variable indicating the file being
>         read?
>
>         Thanks,
>         Abhinav
>
>
>


Re: Provide context to map function

Posted by Azuryy Yu <az...@gmail.com>.
I assumed your input splits are FileSplits; if they might not be, you need to check:

InputSplit split = context.getInputSplit();

if (split instanceof FileSplit){
  Path path = ((FileSplit)split).getPath();
}
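
As a sketch of the same idea in a complete mapper (the file name "first.txt"
is just a placeholder), the cast and the instanceof check can be done once
in setup(), with map() branching on the remembered flag:

import java.io.IOException;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

public class FileAwareMapper extends Mapper<LongWritable, Text, LongWritable, Text> {

  private boolean isFirstFile;   // flag that map() branches on

  @Override
  protected void setup(Context context)
      throws IOException, InterruptedException {
    InputSplit split = context.getInputSplit();
    if (split instanceof FileSplit) {
      Path path = ((FileSplit) split).getPath();
      // "first.txt" stands in for whatever the first input file is called.
      isFirstFile = "first.txt".equals(path.getName());
    }
  }

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    if (isFirstFile) {
      // ... handling for the first file ...
    } else {
      // ... handling for the second file ...
    }
    context.write(key, value);
  }
}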




On Tue, Apr 2, 2013 at 12:02 PM, Azuryy Yu <az...@gmail.com> wrote:

> In your map function add following:
>
> Path currentInput = ((FileSplit)context.getInputSplit()).getPath();
>
> then:
>
> if (currentInput is first ){
> ................
> }
> else{
> ..................
> }
>
>
>
>
> On Tue, Apr 2, 2013 at 11:55 AM, Abhinav M Kulkarni <
> abhinavkulkarni@gmail.com> wrote:
>
>>  Hi,
>>
>> I have a following scenario:
>>
>>
>>    - Two mappers (acting on two different files) and one reducer
>>    - The mapper code for two different files is the same, except for
>>    minor change which depends on which file is being read
>>    - Essentially assume there is an if statement - if first file is
>>    being read do this else do this
>>
>> So how do I provide this context to map function i.e. file name or say a
>> boolean flag variable indicating the file being read?
>>
>> Thanks,
>> Abhinav
>>
>
>

Re: Provide context to map function

Posted by Azuryy Yu <az...@gmail.com>.
In your map function, add the following:

Path currentInput = ((FileSplit) context.getInputSplit()).getPath();

then:

// "first.txt" is a placeholder for whatever your first input file is named
if (currentInput.getName().equals("first.txt")) {
    // ... handling for the first file ...
} else {
    // ... handling for the second file ...
}




On Tue, Apr 2, 2013 at 11:55 AM, Abhinav M Kulkarni <
abhinavkulkarni@gmail.com> wrote:

>  Hi,
>
> I have a following scenario:
>
>
>    - Two mappers (acting on two different files) and one reducer
>    - The mapper code for two different files is the same, except for
>    minor change which depends on which file is being read
>    - Essentially assume there is an if statement - if first file is being
>    read do this else do this
>
> So how do I provide this context to map function i.e. file name or say a
> boolean flag variable indicating the file being read?
>
> Thanks,
> Abhinav
>
