Posted to common-user@hadoop.apache.org by Haruyasu Ueda <ha...@jp.fujitsu.com> on 2011/04/07 02:40:43 UTC

How to abort a job in a map task

Hi all,

I'm writing an M/R Java program.

I want to abort the job itself from within a map task, when the map task
finds irregular data.

I have two ideas for doing so:
 1. Execute "bin/hadoop job -kill <jobID>" from within the map task, on the slave machine (sketched below).
 2. Raise an IOException to abort.

I want to know which is the better way,
or whether there is a better/recommended programming idiom.
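
For option 1, what I have in mind is roughly the following (only a sketch,
using the old org.apache.hadoop.mapred API; I am assuming the task can look
up its own job ID through the "mapred.job.id" property, and the empty-value
test is just a stand-in for the real irregular-data check):

  import java.io.IOException;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapred.JobClient;
  import org.apache.hadoop.mapred.JobConf;
  import org.apache.hadoop.mapred.JobID;
  import org.apache.hadoop.mapred.MapReduceBase;
  import org.apache.hadoop.mapred.Mapper;
  import org.apache.hadoop.mapred.OutputCollector;
  import org.apache.hadoop.mapred.Reporter;
  import org.apache.hadoop.mapred.RunningJob;

  public class KillingMapper extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, LongWritable> {

    private JobConf conf;

    @Override
    public void configure(JobConf conf) {
      this.conf = conf;                        // keep the task's conf so we can find our own job ID
    }

    public void map(LongWritable key, Text value,
                    OutputCollector<Text, LongWritable> output, Reporter reporter)
        throws IOException {
      if (value.getLength() == 0) {            // stand-in for the real irregular-data check
        JobClient client = new JobClient(conf);
        RunningJob self = client.getJob(JobID.forName(conf.get("mapred.job.id")));
        if (self != null) {
          self.killJob();                      // asks the JobTracker to kill the whole job
        }
        return;
      }
      output.collect(value, new LongWritable(1));
    }
  }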

If you have any experience with this, please share your case.

 --HAL
========================================================================
Haruyasu Ueda, Senior Researcher
  Research Center for Cloud Computing
  FUJITSU LABORATORIES LTD.
E-mail: hal_ueda@jp.fujitsu.com
Tel: +81 44 754 2575
Ken-S602, 4-1-1, Kamikodanaka, Nakahara-ku, Kawasaki, 211-8588 Japan
========================================================================



Re: How to abort a job in a map task

Posted by Mehmet Tepedelenlioglu <me...@gmail.com>.
It might be better to keep a counter of bad records and let the job terminate
normally; I would be hesitant to shoot down the mothership.
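
Something along these lines is what I mean (a rough, untested sketch with the
new org.apache.hadoop.mapreduce API; the empty-value test is just a
placeholder for your own validation):

  import java.io.IOException;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;

  public class TolerantMapper
      extends Mapper<LongWritable, Text, Text, LongWritable> {

    // Hadoop aggregates enum counters across all tasks of the job.
    public enum BadRecords { MALFORMED }

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      if (value.getLength() == 0) {            // stand-in for your own check
        context.getCounter(BadRecords.MALFORMED).increment(1);
        return;                                // skip the record, keep the job running
      }
      context.write(value, new LongWritable(1));
    }
  }

Then in the driver, once the job finishes, look at the count and decide what
to do with the run (job is the org.apache.hadoop.mapreduce.Job you submitted):

  boolean ok = job.waitForCompletion(true);
  long bad = job.getCounters()
                .findCounter(TolerantMapper.BadRecords.MALFORMED).getValue();
  if (!ok || bad > 0) {
    System.err.println(bad + " irregular records were skipped");
    // fail the run, rerun, throw the output away, etc.
  }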

Mehmet

On Apr 6, 2011, at 5:40 PM, Haruyasu Ueda wrote:

> Hi all,
> 
> I'm writing an M/R Java program.
> 
> I want to abort the job itself from within a map task, when the map task
> finds irregular data.
> 
> I have two ideas for doing so:
> 1. Execute "bin/hadoop job -kill <jobID>" from within the map task, on the slave machine.
> 2. Raise an IOException to abort.
> 
> I want to know which is the better way,
> or whether there is a better/recommended programming idiom.
> 
> If you have any experience with this, please share your case.
> 
> --HAL


Re: How to abort a job in a map task

Posted by David Rosenstrauch <da...@darose.net>.
On 04/06/2011 08:40 PM, Haruyasu Ueda wrote:
> Hi all,
>
> I'm writing an M/R Java program.
>
> I want to abort the job itself from within a map task, when the map task
> finds irregular data.
>
> I have two ideas for doing so:
>   1. Execute "bin/hadoop job -kill <jobID>" from within the map task, on the slave machine.
>   2. Raise an IOException to abort.
>
> I want to know which is the better way,
> or whether there is a better/recommended programming idiom.
>
> If you have any experience with this, please share your case.
>
>   --HAL

I'd go with throwing the exception.  That way the cause of the job crash 
will get displayed right in the Hadoop GUI.
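
For example, something like this (only a sketch; the empty-value check stands
in for whatever "irregular" means in your data):

  import java.io.IOException;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;

  public class StrictMapper
      extends Mapper<LongWritable, Text, Text, LongWritable> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      if (value.getLength() == 0) {            // stand-in for your own irregular-data test
        // The task attempt fails; after the configured number of attempts
        // (mapred.map.max.attempts, 4 by default) the whole job fails, and
        // this message is what shows up in the web UI and task logs.
        throw new IOException("Irregular record at offset " + key + ": " + value);
      }
      context.write(value, new LongWritable(1));
    }
  }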

DR