Posted to issues@spark.apache.org by "sichun zhai (Jira)" <ji...@apache.org> on 2022/04/18 08:51:00 UTC

[jira] [Updated] (SPARK-38930) spark Executors status always is KILLED

     [ https://issues.apache.org/jira/browse/SPARK-38930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

sichun zhai updated SPARK-38930:
--------------------------------
    Attachment: spark-default.conf

> spark  Executors  status always is KILLED
> -----------------------------------------
>
>                 Key: SPARK-38930
>                 URL: https://issues.apache.org/jira/browse/SPARK-38930
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.2, 3.1.3
>            Reporter: sichun zhai
>            Priority: Major
>         Attachments: spark-default.conf, spark-env.sh, spark-ui.png
>
>
> In standalone deploy mode, running org.apache.spark.examples.SparkPi (or any other Spark program) always leaves the executor status shown in the UI as KILLED.
> A patch for this was proposed in [https://github.com/apache/spark/pull/12012]
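> For reference, a minimal reproduction on a standalone cluster might look like the following (the master URL and the example-jar path are assumptions; adjust them to your installation):
>
> ```shell
> # Submit the bundled SparkPi example to a standalone master.
> # spark://localhost:7077 and the jar path below are placeholders
> # for this sketch, not values taken from the report.
> ./bin/spark-submit \
>   --class org.apache.spark.examples.SparkPi \
>   --master spark://localhost:7077 \
>   examples/jars/spark-examples_2.12-3.1.2.jar 100
> ```
>
> After the job finishes, the Executors tab of the application UI reports the executors as KILLED rather than a normal exit.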
>  



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org