Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/03/30 15:11:41 UTC

[jira] [Resolved] (SPARK-20096) Expose the real queue name not null while using --verbose

     [ https://issues.apache.org/jira/browse/SPARK-20096?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-20096.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 2.2.0

Issue resolved by pull request 17430
[https://github.com/apache/spark/pull/17430]

> Expose the real queue name not null while using --verbose
> ---------------------------------------------------------
>
>                 Key: SPARK-20096
>                 URL: https://issues.apache.org/jira/browse/SPARK-20096
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.2.0
>            Reporter: Kent Yao
>            Priority: Minor
>             Fix For: 2.2.0
>
>
> When submitting apps with -v or --verbose, the queue name is printed correctly, but if we set the queue with `spark.yarn.queue` via --conf or in spark-defaults.conf, we just get `null` for the queue under Parsed arguments.
> {code}
> bin/spark-shell -v --conf spark.yarn.queue=thequeue
> Using properties file: /home/hadoop/spark-2.1.0-bin-apache-hdp2.7.3/conf/spark-defaults.conf
> ....
> Adding default property: spark.yarn.queue=default
> Parsed arguments:
>   master                  yarn
>   deployMode              client
>   ...
>   queue                   null
>   ....
>   verbose                 true
> Spark properties used, including those specified through
>  --conf and those from the properties file /home/hadoop/spark-2.1.0-bin-apache-hdp2.7.3/conf/spark-defaults.conf:
>   spark.yarn.queue -> thequeue
>   ....
> {code}
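> A minimal, self-contained Scala sketch of the kind of fallback that addresses this: prefer the queue parsed from --queue, and otherwise fall back to the spark.yarn.queue entry from the Spark properties, so --verbose reports the effective queue instead of null. The object and method names below are illustrative only; see pull request 17430 for the actual change to SparkSubmitArguments.
> {code}
> // Illustrative only: resolve the queue the way spark-submit --verbose should report it.
> object QueueResolution {
>   // Prefer the value parsed from --queue; otherwise fall back to the
>   // spark.yarn.queue property supplied via --conf or spark-defaults.conf.
>   def resolveQueue(cliQueue: String, sparkProperties: Map[String, String]): String =
>     Option(cliQueue).orElse(sparkProperties.get("spark.yarn.queue")).orNull
>
>   def main(args: Array[String]): Unit = {
>     val props = Map("spark.yarn.queue" -> "thequeue")
>     println(resolveQueue(null, props))        // prints: thequeue
>     println(resolveQueue("explicit", props))  // prints: explicit
>   }
> }
> {code}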



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org