Posted to user@hadoop.apache.org by "bit1129@163.com" <bi...@163.com> on 2015/02/15 11:11:13 UTC

Question about map Task and reducer Task

Hi, Hadoopers,

I am pretty new to Hadoop and I have a question: when a job runs, will each mapper or reducer task take up a JVM process, or only a thread?
I hear that the answer is a process. That is, say a job contains 5 mappers and 2 reducers; will there then be 7 JVM processes?
Thanks.



bit1129@163.com

Re: Question about map Task and reducer Task

Posted by 杨浩 <ya...@gmail.com>.
I think so

2015-02-15 18:11 GMT+08:00 bit1129@163.com <bi...@163.com>:

> Hi, Hadoopers,
>
> I am pretty new to Hadoop and I have a question: when a job runs, will
> each mapper or reducer task take up a JVM process, or only a thread?
> I hear that the answer is a process. That is, say a job contains 5
> mappers and 2 reducers; will there then be 7 JVM processes?
> Thanks.
>
> ------------------------------
> bit1129@163.com
>

Re: Question about map Task and reducer Task

Posted by Ulul <ha...@ulul.org>.
Hi

As a general rule you should follow your distro's recommendations, especially
if you have paid support from Hortonworks.

I have a non-critical, unsupported production cluster on which I'll run
small jobs, and I intend to test the feature there, but I haven't tried it
yet, so I can't give you any feedback right now, sorry.

Ulul

On 16/02/2015 00:47, 杨浩 wrote:
> Hi Ulul,
>  thank you for the explanation. I have googled the feature, and
> Hortonworks said:
>
> This feature is a technical preview and considered under development.
> Do not use this feature in your production systems.
>
>  Can we use it in a production environment?
>
>
> 2015-02-15 20:15 GMT+08:00 Ulul <hadoop@ulul.org>:
>
>     Hi
>
>     Actually it depends: in MR1 each mapper or reducer is executed in
>     its own JVM; in MR2 you can activate uber jobs, which let the
>     framework serialize small jobs' mappers and reducers in the
>     ApplicationMaster JVM.
>
>     Look for mapreduce.job.ubertask.* properties
>
>     Ulul
>
>     On 15/02/2015 11:11, bit1129@163.com wrote:
>>     Hi, Hadoopers,
>>
>>     I am pretty new to Hadoop and I have a question: when a job runs,
>>     will each mapper or reducer task take up a JVM process, or only a
>>     thread?
>>     I hear that the answer is a process. That is, say a job contains
>>     5 mappers and 2 reducers; will there then be 7 JVM processes?
>>     Thanks.
>>
>>     ------------------------------------------------------------------------
>>     bit1129@163.com
>
>
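
If you want to try uber mode on a handful of jobs without changing mapred-site.xml cluster-wide, the switch can also be set per job in the driver. The sketch below is illustrative only: the class name and job name are made up, the usual mapper/reducer/input/output wiring is elided, and ToolRunner is used, so the property could alternatively be supplied as -Dmapreduce.job.ubertask.enable=true on the command line instead of being hard-coded.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical driver that opts a single small job into uber mode.
public class SmallUberJobDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // Ask the ApplicationMaster to run this job's map and reduce tasks
        // inside its own JVM instead of launching separate task containers.
        conf.setBoolean("mapreduce.job.ubertask.enable", true);
        Job job = Job.getInstance(conf, "small-uber-test");
        // ... set jar, mapper, reducer, input and output paths as usual ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new SmallUberJobDriver(), args));
    }
}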


Re: Question about map Task and reducer Task

Posted by 杨浩 <ya...@gmail.com>.
Hi Ulul,
 thank you for the explanation. I have googled the feature, and Hortonworks
said:

This feature is a technical preview and considered under development. Do
not use this feature in your production systems.

 Can we use it in a production environment?


2015-02-15 20:15 GMT+08:00 Ulul <ha...@ulul.org>:

>  Hi
>
> Actually it depends: in MR1 each mapper or reducer is executed in its own
> JVM; in MR2 you can activate uber jobs, which let the framework serialize
> small jobs' mappers and reducers in the ApplicationMaster JVM.
>
> Look for mapreduce.job.ubertask.* properties
>
> Ulul
>
> On 15/02/2015 11:11, bit1129@163.com wrote:
>
> Hi, Hadoopers,
>
>  I am pretty new to Hadoop and I have a question: when a job runs, will
> each mapper or reducer task take up a JVM process, or only a thread?
> I hear that the answer is a process. That is, say a job contains 5
> mappers and 2 reducers; will there then be 7 JVM processes?
> Thanks.
>
>  ------------------------------
>  bit1129@163.com
>
>
>

Re: Question about map Task and reducer Task

Posted by Ulul <ha...@ulul.org>.
Hi

Actually it depends: in MR1 each mapper or reducer is executed in its own
JVM; in MR2 you can activate uber jobs, which let the framework serialize
small jobs' mappers and reducers in the ApplicationMaster JVM.

Look for mapreduce.job.ubertask.* properties

Ulul

On 15/02/2015 11:11, bit1129@163.com wrote:
> Hi, Hadoopers,
>
> I am pretty new to Hadoop and I have a question: when a job runs, will
> each mapper or reducer task take up a JVM process, or only a thread?
> I hear that the answer is a process. That is, say a job contains
> 5 mappers and 2 reducers; will there then be 7 JVM processes?
> Thanks.
>
> ------------------------------------------------------------------------
> bit1129@163.com
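
For completeness, the mapreduce.job.ubertask.* properties Ulul points to are the enable switch plus the thresholds a job must stay under to qualify: number of map tasks, number of reduce tasks, and total input size. A minimal sketch follows; the property names are the standard MR2 ones, but the values are examples rather than the shipped defaults.

import org.apache.hadoop.conf.Configuration;

// Illustrative uber-task settings; the values below are examples only.
public class UberTaskSettings {
    public static Configuration uberEnabled() {
        Configuration conf = new Configuration();
        conf.setBoolean("mapreduce.job.ubertask.enable", true);      // feature is off by default
        conf.setInt("mapreduce.job.ubertask.maxmaps", 5);            // qualify only with at most 5 map tasks
        conf.setInt("mapreduce.job.ubertask.maxreduces", 1);         // and at most 1 reduce task
        conf.setLong("mapreduce.job.ubertask.maxbytes", 128L << 20); // and at most 128 MB of total input
        return conf;
    }
}

With these limits, the 5-mapper, 2-reducer job from the original question would still get one JVM per task, since it exceeds the reduce threshold.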

