Posted to issues@spark.apache.org by "sichun zhai (Jira)" <ji...@apache.org> on 2022/04/18 09:04:00 UTC
[jira] [Updated] (SPARK-38930) Spark Executors status always is KILLED
[ https://issues.apache.org/jira/browse/SPARK-38930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
sichun zhai updated SPARK-38930:
--------------------------------
Summary: Spark Executors status always is KILLED (was: spark Executors status always is KILLED)
> Spark Executors status always is KILLED
> -----------------------------------------
>
> Key: SPARK-38930
> URL: https://issues.apache.org/jira/browse/SPARK-38930
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 3.1.2, 3.1.3
> Reporter: sichun zhai
> Priority: Major
> Attachments: spark-default.conf, spark-env.sh, spark-ui.png, stderr
>
>
> In standalone deploy mode, running org.apache.spark.examples.SparkPi or any other Spark program always shows the executors' status as KILLED in the UI.
> A related patch exists: [https://github.com/apache/spark/pull/12012]
> Command used to run SparkPi:
> /opt/app/applications/bd-spark/bin/run-example --class org.apache.spark.examples.SparkPi --master spark://10.205.90.120:7077,10.205.90.131:7077 --deploy-mode cluster --driver-java-options "-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties" --conf spark.executor.extraJavaOptions="-Dlog4j.configuration=file:/opt/app/applications/bd-spark/conf/log4j.properties"
>
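[Editor's note] For anyone triaging this, the standalone master exposes its cluster state as JSON at http://<master>:8080/json, which shows whether the application itself finished normally even when the per-executor status reads KILLED (in standalone mode the worker kills executor JVMs on application shutdown, so KILLED can appear for apps that completed successfully). The snippet below is a minimal sketch, not part of the report: the response fragment is hypothetical and trimmed to the two fields used here (`completedapps` and each app's `name`/`state`); a live check would fetch the URL with urllib instead.

```python
import json

# HYPOTHETICAL sample of the standalone master's /json output, trimmed to the
# fields this sketch uses. In a live cluster you would fetch the real document,
# e.g. urllib.request.urlopen("http://10.205.90.120:8080/json").
sample = json.loads("""
{
  "activeapps": [],
  "completedapps": [
    {"id": "app-20220418090400-0001", "name": "Spark Pi", "state": "FINISHED"}
  ]
}
""")

def summarize_completed_apps(master_state):
    """Return (name, state) pairs for applications the master considers done."""
    return [(app["name"], app["state"])
            for app in master_state.get("completedapps", [])]

print(summarize_completed_apps(sample))
```

If the application state is FINISHED while the UI lists the executors as KILLED, the KILLED label is cosmetic (the issue reported here); a FAILED application state would point to a real executor problem instead.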
--
This message was sent by Atlassian Jira
(v8.20.1#820001)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org