Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/23 05:32:19 UTC
[GitHub] [spark] daugraph commented on pull request #34046: [SPARK-36804][YARN] Using the verbose parameter in yarn mode would cause application submission failure
daugraph commented on pull request #34046:
URL: https://github.com/apache/spark/pull/34046#issuecomment-925517526
### Source code repository:
```bash
# Check out the exact revision the issue was reproduced against
git clone https://github.com/apache/spark.git
cd spark
git checkout 4ea54e8672757c0dbe3dd57c81763afdffcbcc1b
```
### Submit script/config:
```bash
export SPARK_PRINT_LAUNCH_COMMAND="1"
export SPARK_PREPEND_CLASSES="1"
export HADOOP_CONF_DIR=/path/to/hadoop/conf
export SPARK_SUBMIT_OPTS="-Djava.security.krb5.conf=/etc/krb5.conf"
spark-submit \
--master yarn \
--deploy-mode cluster \
--verbose \
--conf spark.kerberos.keytab=/path/to/keytab/file \
--conf spark.kerberos.principal=user_principal \
--conf spark.yarn.queue=root.user_queue \
--conf spark.yarn.maxAppAttempts=1 \
--class com.example.Main \
target/examples-1.0-SNAPSHOT.jar
```
### Output
```bash
NOTE: SPARK_PREPEND_CLASSES is set, placing locally compiled Spark classes ahead of assembly.
Spark Command: /Library/Java/JavaVirtualMachines/jdk1.8.0_271.jdk/Contents/Home/bin/java -cp /Users/lijianmeng/github/spark/conf/:/Users/lijianmeng/github/spark/common/kvstore/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/common/network-common/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/common/network-shuffle/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/common/network-yarn/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/common/sketch/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/common/tags/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/common/unsafe/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/core/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/examples/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/graphx/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/launcher/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/mllib/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/repl/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/resource-managers/mesos/target/scala-2.12/classes:/Users/lijianmeng/github/spark/resource-managers/yarn/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/sql/catalyst/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/sql/core/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/sql/hive/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/sql/hive-thriftserver/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/streaming/target/scala-2.12/classes/:/Users/lijianmeng/github/spark/core/target/jars/*:/Users/lijianmeng/github/spark/mllib/target/jars/*:/Users/lijianmeng/github/spark/assembly/target/scala-2.12/jars/*:/path/to/hadoop/conf/ -Djava.security.krb5.conf=/etc/krb5.conf org.apache.spark.deploy.SparkSubmit --master yarn --deploy-mode cluster --conf spark.kerberos.keytab=/path/to/keytab/file --conf spark.yarn.maxAppAttempts=1 --conf spark.kerberos.principal=user_principal --conf spark.yarn.queue=root.user_queue --class com.example.Main --verbose target/examples-1.0-SNAPSHOT.jar
========================================
Using properties file: null
Parsed arguments:
master yarn
deployMode cluster
executorMemory null
executorCores null
totalExecutorCores null
propertiesFile null
driverMemory null
driverCores null
driverExtraClassPath null
driverExtraLibraryPath null
driverExtraJavaOptions null
supervise false
queue root.user_queue
numExecutors null
files null
pyFiles null
archives null
mainClass com.example.Main
primaryResource file:/Users/lijianmeng/bigdata/examples/target/examples-1.0-SNAPSHOT.jar
name com.example.Main
childArgs []
jars null
packages null
packagesExclusions null
repositories null
verbose true
Spark properties used, including those specified through
--conf and those from the properties file null:
(spark.yarn.queue,root.user_queue)
(spark.yarn.maxAppAttempts,1)
(spark.kerberos.principal,user_principal)
(spark.kerberos.keytab,/path/to/keytab/file)
21/09/23 13:17:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Main class:
org.apache.spark.deploy.yarn.YarnClusterApplication
Arguments:
--jar
file:/Users/lijianmeng/bigdata/examples/target/examples-1.0-SNAPSHOT.jar
--class
com.example.Main
--verbose
Spark config:
(spark.kerberos.keytab,/path/to/keytab/file)
(spark.yarn.queue,root.user_queue)
(spark.app.name,com.example.Main)
(spark.kerberos.principal,user_principal)
(spark.submit.pyFiles,)
(spark.submit.deployMode,cluster)
(spark.yarn.maxAppAttempts,1)
(spark.master,yarn)
Classpath elements:
file:/Users/lijianmeng/bigdata/examples/target/examples-1.0-SNAPSHOT.jar
Exception in thread "main" java.lang.IllegalArgumentException: Unknown/unsupported param List(--verbose)
Usage: org.apache.spark.deploy.yarn.Client [options]
Options:
--jar JAR_PATH Path to your application's JAR file (required in YARN cluster
mode)
--class CLASS_NAME Name of your application's main class (required)
--primary-py-file A main Python file
--primary-r-file A main R file
--arg ARG Argument to be passed to your application's main class.
Multiple invocations are possible, each will be passed in order.
at org.apache.spark.deploy.yarn.ClientArguments.parseArgs(ClientArguments.scala:61)
at org.apache.spark.deploy.yarn.ClientArguments.<init>(ClientArguments.scala:31)
at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1701)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
21/09/23 13:17:38 INFO ShutdownHookManager: Shutdown hook called
21/09/23 13:17:38 INFO ShutdownHookManager: Deleting directory /private/var/folders/q4/bs9293m948l_rt31tm932s7w0000gn/T/spark-37708240-12b1-4013-9765-b7cc2c4d486e
```
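As the log shows, `SparkSubmit` forwards `--verbose` as a child argument to `org.apache.spark.deploy.yarn.YarnClusterApplication`, whose `ClientArguments` parser accepts only the options in the usage message and throws `IllegalArgumentException` on anything else. A minimal sketch (not Spark's actual code; names like `StrictYarnArgs` and `Parsed` are hypothetical) of this strict, whitelist-style parsing:

```scala
// Sketch of a strict argument parser in the style of
// org.apache.spark.deploy.yarn.ClientArguments: only a fixed set of
// options is understood, so a forwarded --verbose flag aborts submission.
object StrictYarnArgs {
  // Hypothetical container for the few options the YARN client accepts.
  case class Parsed(jar: Option[String] = None,
                    mainClass: Option[String] = None,
                    args: List[String] = Nil)

  def parse(argv: List[String]): Parsed = {
    def loop(rest: List[String], acc: Parsed): Parsed = rest match {
      case "--jar" :: v :: tail   => loop(tail, acc.copy(jar = Some(v)))
      case "--class" :: v :: tail => loop(tail, acc.copy(mainClass = Some(v)))
      case "--arg" :: v :: tail   => loop(tail, acc.copy(args = acc.args :+ v))
      case Nil                    => acc
      // Anything not whitelisted (e.g. --verbose) is rejected, mirroring
      // the IllegalArgumentException in the log above.
      case unknown =>
        throw new IllegalArgumentException(s"Unknown/unsupported param $unknown")
    }
    loop(argv, Parsed())
  }
}
```

Under this model, the only ways to avoid the failure are to not forward `--verbose` to the child application or to teach the parser to ignore it, which is what this PR addresses.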
@HyukjinKwon This is the console log from my own reproduction; could you review it? Thanks for your time.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org