Posted to issues@spark.apache.org by "Prashant Sharma (JIRA)" <ji...@apache.org> on 2018/11/29 07:34:00 UTC

[jira] [Comment Edited] (SPARK-26059) Spark standalone mode does not correctly record a failed Spark Job.

    [ https://issues.apache.org/jira/browse/SPARK-26059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16702809#comment-16702809 ] 

Prashant Sharma edited comment on SPARK-26059 at 11/29/18 7:33 AM:
-------------------------------------------------------------------

This will be a won't fix, because fixing it would require:

1) Making a REST call to the master on job failure, i.e. on catching the exception. This bug applies only to client mode of the standalone deploy mode. Since the outcome of that REST call is itself not guaranteed, it can never be assured that the correct state of a failed application will always be reported.
2) A lot of other changes. For example, the standalone master currently does not track the progress of a job submitted in client mode, so a new REST call that allows marking the status of the job in client mode would have to be introduced, etc. A rough sketch of such a call follows below.
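
For illustration only, a hedged Scala sketch of what such a best-effort report might look like. Everything here is an assumption: the /v1/submissions/markStatus endpoint, the ReportFailureSketch object, and its helpers do not exist in Spark; the standalone master exposes no such API today, which is exactly why the fix is non-trivial.

    import java.net.{HttpURLConnection, URL}
    import java.nio.charset.StandardCharsets

    object ReportFailureSketch {
      // Best-effort status report from a client-mode driver to the master.
      // The endpoint below is hypothetical, standing in for the new REST
      // call described in point 2.
      def markFailed(masterRestUrl: String, appId: String): Unit = {
        val url = new URL(s"$masterRestUrl/v1/submissions/markStatus/$appId")
        val conn = url.openConnection().asInstanceOf[HttpURLConnection]
        conn.setRequestMethod("POST")
        conn.setDoOutput(true)
        val out = conn.getOutputStream
        out.write("""{"status":"FAILED"}""".getBytes(StandardCharsets.UTF_8))
        out.close()
        // The call itself may fail (master unreachable, network error), so
        // a correct FAILED state can never be guaranteed -- point 1 above.
        if (conn.getResponseCode != 200) {
          System.err.println(s"could not report FAILED state for $appId")
        }
      }

      def main(args: Array[String]): Unit = {
        try {
          runUserJob() // stands in for the user's driver code in client mode
        } catch {
          case e: Throwable =>
            markFailed(masterRestUrl = args(0), appId = args(1))
            throw e // rethrow so the driver still exits non-zero
        }
      }

      private def runUserJob(): Unit =
        throw new RuntimeException("intentional failure")
    }

Even with such an endpoint, the driver process can die before the call completes, so the master's view of the application would remain best-effort.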



was (Author: prashant_):
This will be a won't fix, because fixing it would require:

1) Making a REST call on failure of the job, when in client mode. Since the outcome of that REST call is itself not guaranteed, it can never be assured that the correct state of a failed application will always be reported.
2) A lot of other changes. For example, the standalone master currently does not track the progress of a job submitted in client mode, so a new REST call that allows marking the status of the job in client mode would have to be introduced, etc.


> Spark standalone mode does not correctly record a failed Spark Job.
> -------------------------------------------------------------------
>
>                 Key: SPARK-26059
>                 URL: https://issues.apache.org/jira/browse/SPARK-26059
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Prashant Sharma
>            Priority: Major
>
> To reproduce, submit a failing job to the Spark standalone master. The status for the job is shown as FINISHED irrespective of whether it failed or succeeded; a minimal reproduction sketch follows after this description.
> EDIT: This happens only when deploy-mode is client; when deploy-mode is cluster, it works as expected.
> - Reported by: Surbhi Bakhtiyar.
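
For illustration, a minimal Scala sketch that reproduces the report above. The class name AlwaysFails, the jar name, and the master host are hypothetical placeholders; the spark-submit flags are the standard ones:

    import org.apache.spark.sql.SparkSession

    // Submit in client mode against a standalone master, e.g.:
    //   spark-submit --master spark://<master-host>:7077 --deploy-mode client \
    //     --class AlwaysFails always-fails.jar
    object AlwaysFails {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("always-fails").getOrCreate()
        try {
          // Force a failure in the driver; the driver exits non-zero.
          throw new RuntimeException("intentional failure")
        } finally {
          spark.stop()
        }
      }
    }

If the report is accurate, the master's UI records this application as FINISHED when submitted in client mode, while in cluster mode it is correctly recorded as FAILED.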


