Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/08/09 05:49:00 UTC

[jira] [Resolved] (SPARK-21503) Spark UI shows incorrect task status for a killed Executor Process

     [ https://issues.apache.org/jira/browse/SPARK-21503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-21503.
---------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0
                   2.2.1

Issue resolved by pull request 18707
[https://github.com/apache/spark/pull/18707]

> Spark UI shows incorrect task status for a killed Executor Process
> ------------------------------------------------------------------
>
>                 Key: SPARK-21503
>                 URL: https://issues.apache.org/jira/browse/SPARK-21503
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Parth Gandhi
>            Assignee: Parth Gandhi
>            Priority: Minor
>             Fix For: 2.2.1, 2.3.0
>
>
> The Executors tab of the Spark UI shows a task as completed when the executor process that is running that task is killed with the kill command, even though the task never actually finished.
> Steps:
> 1. Run a Spark job large enough that its executors stay busy for a while. As an example, I ran a PySpark job (pi.py; a sketch of a comparable script follows these steps) with the following command:
> $SPARK_HOME/bin/spark-submit --master yarn --deploy-mode cluster --queue default --num-executors 10 --driver-memory 2G --conf spark.pyspark.driver.python=./Python3/bin/python --conf spark.pyspark.python=./Python3/bin/python --archives hdfs:///user/USERNAME/Python3.zip#Python3 ~/pi.py
> 2. Go to the UI to see which executors are running (the same per-executor task counts are also exposed by the REST API; see the second sketch after these steps).
> 3. SSH to each of the executor hosts and kill the Java executor process identified by the host and port shown in the UI, using one of the following commands:
> kill <pid> OR kill -9 <pid>
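
For reference, the issue does not attach the reporter's pi.py, but a minimal Monte Carlo estimator along the lines of the stock Spark example reproduces the setup; the script below is a hypothetical stand-in, with the partition count raised so the job runs long enough to kill executors mid-flight.

    # pi.py -- hypothetical stand-in for the reporter's script, modeled on
    # the stock Spark Monte Carlo pi example.
    import random
    from operator import add
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("Pi").getOrCreate()
    partitions = 1000  # large enough that executors stay busy for a while
    n = 100000 * partitions

    def inside(_):
        # Sample a point in the unit square; count it if it falls inside
        # the quarter circle of radius 1.
        x, y = random.random(), random.random()
        return 1 if x * x + y * y <= 1 else 0

    count = spark.sparkContext.parallelize(range(n), partitions) \
                 .map(inside).reduce(add)
    print("Pi is roughly %f" % (4.0 * count / n))
    spark.stop()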
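
The incorrect status can also be observed programmatically rather than in the browser, via Spark's monitoring REST API under /api/v1. This is a minimal sketch, assuming a driver UI reachable at localhost:4040; in YARN cluster mode the same endpoints are served through the ResourceManager proxy or the history server, so the base URL would differ.

    # check_executors.py -- print per-executor task counts from the REST API.
    import json
    from urllib.request import urlopen

    BASE = "http://localhost:4040/api/v1"  # assumption: locally reachable driver UI
    app_id = json.load(urlopen(BASE + "/applications"))[0]["id"]
    executors = json.load(urlopen("%s/applications/%s/executors" % (BASE, app_id)))
    for e in executors:
        # Tasks on a killed executor should be counted as failed, not
        # completed; the bug reported here is that they showed up as completed.
        print(e["id"], "completed:", e["completedTasks"], "failed:", e["failedTasks"])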


