Posted to user@spark.apache.org by Brett Meyer <Br...@crowdstrike.com> on 2014/11/11 20:47:15 UTC

Failed jobs showing as SUCCEEDED on web UI

I'm running a Python script using spark-submit on YARN in an EMR cluster,
and if I have a job that fails due to ExecutorLostFailure or if I kill the
job, it still shows up on the web UI with a FinalStatus of SUCCEEDED.  Is
this due to PySpark, or is there potentially some other issue with the job
failure status not propagating to the logs?
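One pattern that often helps in this situation: YARN's FinalStatus can report SUCCEEDED whenever the driver process exits with code 0, regardless of whether the Spark work itself failed. A minimal sketch of a driver that makes failures propagate explicitly (the `run_job` function is a hypothetical placeholder for the real Spark work, used here only to illustrate the exit-code handling):

```python
import sys


def run_job():
    # Placeholder for the actual PySpark work (hypothetical; a real
    # driver would create a SparkContext/SparkSession and run actions).
    raise RuntimeError("simulated executor failure")


def main():
    try:
        run_job()
    except Exception as exc:
        # Log the error and return non-zero so the driver process exits
        # with a failure code instead of silently exiting 0.
        print("job failed: {}".format(exc), file=sys.stderr)
        return 1
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

This doesn't rule out a PySpark or YARN status-reporting bug, but it is worth checking that the driver script itself isn't swallowing exceptions and exiting cleanly, since that alone is enough to produce a SUCCEEDED final status.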