Posted to issues@spark.apache.org by "Cheolsoo Park (JIRA)" <ji...@apache.org> on 2015/07/23 07:44:05 UTC

[jira] [Comment Edited] (SPARK-9270) spark.app.name is not honored by spark-shell and pyspark

    [ https://issues.apache.org/jira/browse/SPARK-9270?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14638187#comment-14638187 ] 

Cheolsoo Park edited comment on SPARK-9270 at 7/23/15 5:43 AM:
---------------------------------------------------------------

Oh it is. I missed it. So spark-shell is fixed. Then should I fix pyspark in this jira?


was (Author: cheolsoo):
Oh it is. I missed it. So spark-shell is fixed. Then should I fixed pyspark in this jira?

> spark.app.name is not honored by spark-shell and pyspark
> --------------------------------------------------------
>
>                 Key: SPARK-9270
>                 URL: https://issues.apache.org/jira/browse/SPARK-9270
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Spark Shell
>    Affects Versions: 1.4.1, 1.5.0
>            Reporter: Cheolsoo Park
>            Priority: Minor
>
> Currently, the app name is hardcoded in spark-shell and pyspark as "SparkShell" and "PySparkShell" respectively, and the {{spark.app.name}} property is not honored.
> But being able to set the app name is quite handy for various cluster operations, for example, filtering jobs whose app name is "X" on the YARN RM page.
> SPARK-8650 fixed this issue for spark-sql, but not for spark-shell and pyspark. sparkR is different because {{SparkContext}} is not automatically constructed there, and the app name can be set when initializing {{SparkContext}}.
> In summary-
> ||shell||support --conf spark.app.name||
> |spark-shell|no|
> |pyspark|no|
> |spark-sql|yes|
> |sparkR|n/a| 
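
For context, pyspark's startup script builds the context with a fixed name, e.g. {{SparkContext(appName="PySparkShell", ...)}}, which is why {{--conf spark.app.name=...}} is silently ignored. A minimal sketch of the kind of change this jira asks for, assuming the hardcoded name should only be a fallback when the property is unset (illustrative, not the actual shell.py source):

{code:python}
from pyspark import SparkConf, SparkContext

# SparkConf() picks up properties passed via --conf and spark-defaults.conf.
conf = SparkConf()

# Only fall back to the hardcoded shell name when the user did not set one.
if not conf.contains("spark.app.name"):
    conf.setAppName("PySparkShell")

sc = SparkContext(conf=conf)
{code}

With a change along these lines, {{pyspark --conf spark.app.name=MyApp}} would show up as "MyApp" on the YARN RM page instead of "PySparkShell".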


