Posted to common-dev@hadoop.apache.org by "Olga Natkovich (JIRA)" <ji...@apache.org> on 2009/03/02 19:58:56 UTC

[jira] Commented: (HADOOP-4996) JobControl does not report killed jobs

    [ https://issues.apache.org/jira/browse/HADOOP-4996?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12678060#action_12678060 ] 

Olga Natkovich commented on HADOOP-4996:
----------------------------------------

Pig uses the getFailedJobs API to retrieve any job that did not succeed. As long as this API returns both killed and failed jobs, our code will work fine.
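
For context, here is a minimal sketch (not part of the original comment) of how a client such as Pig might drive JobControl and rely on getFailedJobs() to catch every unsuccessful job. It assumes the old org.apache.hadoop.mapred.jobcontrol API; the class name PigPlanRunner and the one-second poll interval are illustrative only.

    import org.apache.hadoop.mapred.jobcontrol.Job;
    import org.apache.hadoop.mapred.jobcontrol.JobControl;

    public class PigPlanRunner {                       // hypothetical driver, for illustration
        public static void runPlan(JobControl jc) throws InterruptedException {
            // JobControl implements Runnable, so it is typically run in its own thread.
            new Thread(jc).start();
            while (!jc.allFinished()) {
                Thread.sleep(1000);                    // poll until every job reaches a final state
            }
            jc.stop();

            // The expectation stated above: killed jobs must appear here as well,
            // otherwise the caller has no way to notice them.
            for (Job bad : jc.getFailedJobs()) {
                System.err.println("did not succeed: " + bad.getJobName()
                        + " - " + bad.getMessage());
            }
            for (Job ok : jc.getSuccessfulJobs()) {
                System.out.println("succeeded: " + ok.getJobName());
            }
        }
    }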

> JobControl does not report killed jobs
> --------------------------------------
>
>                 Key: HADOOP-4996
>                 URL: https://issues.apache.org/jira/browse/HADOOP-4996
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: mapred
>    Affects Versions: 0.19.0
>            Reporter: Olga Natkovich
>            Assignee: Amareshwari Sriramadasu
>            Priority: Blocker
>             Fix For: 0.20.0
>
>         Attachments: patch-4996-testcase.txt
>
>
> After speaking with Arun and Owen, my understanding of the situation is that separate killed-job tracking was added in Hadoop 0.18: http://issues.apache.org/jira/browse/HADOOP-3924.
> However, it does not look like this change was integrated into the JobControl class. While I have not verified this yet, it appears that applications using JobControl would have no way of knowing if one of the jobs was killed.
> This would be a blocker for Pig moving to Hadoop 0.19.
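
As a rough illustration of the gap described in the issue above (and not the actual fix): the killed state is already visible at the JobClient level via JobStatus.KILLED, introduced by HADOOP-3924, so JobControl would need to fold that state into the job lists it reports. The sketch below assumes the old org.apache.hadoop.mapred API; the helper class and method names are hypothetical.

    import java.io.IOException;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobID;
    import org.apache.hadoop.mapred.JobStatus;
    import org.apache.hadoop.mapred.RunningJob;

    public class KilledJobCheck {                        // hypothetical helper, for illustration
        public static boolean wasKilled(JobClient client, JobID jobId) throws IOException {
            RunningJob rj = client.getJob(jobId);
            if (rj == null || !rj.isComplete()) {
                return false;                            // unknown job, or not yet finished
            }
            // JobStatus.KILLED distinguishes a killed job from a failed one;
            // this is the state JobControl would have to surface to callers.
            return rj.getJobState() == JobStatus.KILLED;
        }
    }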

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.