Posted to dev@pig.apache.org by "Alan Gates (JIRA)" <ji...@apache.org> on 2011/09/14 02:10:09 UTC

[jira] [Updated] (PIG-604) Kill the Pig job should kill all associated Hadoop Jobs

     [ https://issues.apache.org/jira/browse/PIG-604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alan Gates updated PIG-604:
---------------------------

    Release Note: Prior to this fix, Pig would not start any new MapReduce jobs once the Pig controlling process was killed, but it would not stop any MapReduce jobs that were already running.  Now it will also attempt to kill running MapReduce jobs.  Note that under certain conditions this will still fail, such as when Pig is killed with a KILL signal and does not get a chance to run its shutdown hooks.
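
    Below is a minimal sketch of the kill-on-shutdown approach the release note describes, using the classic org.apache.hadoop.mapred API. The class name HadoopJobKiller and the track() method are illustrative only, not Pig's actual implementation; the point is that a JVM shutdown hook can call RunningJob.killJob() on tracked jobs, and that a SIGKILL bypasses shutdown hooks, which is why the caveat above still applies.

        import java.util.List;
        import java.util.concurrent.CopyOnWriteArrayList;

        import org.apache.hadoop.mapred.RunningJob;

        // Hypothetical helper: keeps handles to launched MapReduce jobs and
        // kills any that are still running when the JVM shuts down.
        public class HadoopJobKiller {
            private final List<RunningJob> runningJobs = new CopyOnWriteArrayList<RunningJob>();

            public HadoopJobKiller() {
                // Shutdown hooks run on normal exit and on SIGTERM/SIGINT,
                // but not when the process receives SIGKILL (kill -9).
                Runtime.getRuntime().addShutdownHook(new Thread() {
                    @Override
                    public void run() {
                        for (RunningJob job : runningJobs) {
                            try {
                                if (!job.isComplete()) {
                                    job.killJob();  // ask the JobTracker to kill the MR job
                                }
                            } catch (Exception e) {
                                // best effort: continue killing the remaining jobs
                            }
                        }
                    }
                });
            }

            // Called once for each MapReduce job the client launches.
            public void track(RunningJob job) {
                runningJobs.add(job);
            }
        }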

> Kill the Pig job should kill all associated Hadoop Jobs
> -------------------------------------------------------
>
>                 Key: PIG-604
>                 URL: https://issues.apache.org/jira/browse/PIG-604
>             Project: Pig
>          Issue Type: Improvement
>          Components: grunt
>            Reporter: Yiping Han
>            Assignee: Daniel Dai
>            Priority: Minor
>             Fix For: 0.10
>
>         Attachments: PIG-604-1.patch
>
>
> Currently, if we kill the Pig job on the client machine, the Hadoop jobs it has already launched keep running. We have to kill these jobs manually.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira