Posted to issues@livy.apache.org by "Michal Sankot (Jira)" <ji...@apache.org> on 2019/11/12 16:46:00 UTC

[jira] [Updated] (LIVY-712) EMR 5.23/5.27 - Livy does not recognise that Spark job failed

     [ https://issues.apache.org/jira/browse/LIVY-712?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michal Sankot updated LIVY-712:
-------------------------------
    Labels: EMR api spark  (was: )

> EMR 5.23/5.27 - Livy does not recognise that Spark job failed
> -------------------------------------------------------------
>
>                 Key: LIVY-712
>                 URL: https://issues.apache.org/jira/browse/LIVY-712
>             Project: Livy
>          Issue Type: Bug
>          Components: API
>    Affects Versions: 0.5.0, 0.6.0
>         Environment: AWS EMR 5.23/5.27, Scala
>            Reporter: Michal Sankot
>            Priority: Major
>              Labels: EMR, api, spark
>
> We've upgraded from AWS EMR 5.13 to 5.23 (Livy 0.4.0 -> 0.5.0, Spark 2.3.0 -> 2.4.0), and now when an exception is thrown during Spark job execution, Spark shuts down as if nothing went wrong and the job appears as Completed in EMR. So we're not notified when the system crashes. The same problem appears in EMR 5.27 (Livy 0.6.0, Spark 2.4.4).
> Is this something with Spark? Or a known issue with Livy?
> In the Livy logs I see that spark-submit exits with error code 1:
> {quote}{{05:34:59 WARN BatchSession$: spark-submit exited with code 1}}
> {quote}
>  And then the Livy API reports the batch state as
> {quote}{{"state": "success"}}
> {quote}
> How can this be made to work again?
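For context, the mismatch described above is what a monitoring client would see when polling Livy's batch REST API (GET /batches/{id}/state). Below is a minimal sketch, not taken from the report, of how such a client typically decides whether a batch failed; the state names ("success", "dead", "error", "killed") come from the Livy REST documentation. Under this bug, the server returns "success" even though spark-submit exited with code 1, so a check like this is silently defeated.

```python
import json

# Terminal batch states per the Livy REST API; these indicate failure.
FAILURE_STATES = {"dead", "error", "killed"}

def batch_failed(state_response: str) -> bool:
    """Return True if a GET /batches/{id}/state JSON response reports failure."""
    state = json.loads(state_response)["state"]
    return state in FAILURE_STATES

# Sample response bodies in the shape returned by GET /batches/{id}/state.
# With LIVY-712, the first response is what the server actually returns
# for a job whose spark-submit exited with code 1.
print(batch_failed('{"id": 1, "state": "success"}'))  # False -> failure goes unnoticed
print(batch_failed('{"id": 2, "state": "dead"}'))     # True
```

A workaround some users apply is to cross-check the YARN application status rather than trusting the Livy batch state alone.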



--
This message was sent by Atlassian Jira
(v8.3.4#803005)