Posted to issues@spark.apache.org by "Cheolsoo Park (JIRA)" <ji...@apache.org> on 2015/07/23 08:23:05 UTC

[jira] [Updated] (SPARK-9270) spark.app.name is not honored by pyspark

     [ https://issues.apache.org/jira/browse/SPARK-9270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cheolsoo Park updated SPARK-9270:
---------------------------------
    Description: 
Currently, the app name is hardcoded in pyspark as "PySparkShell", and the {{spark.app.name}} property is not honored.
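
For example, the property is silently ignored at shell startup. A minimal reproduction sketch (assuming a standard build; {{sc.appName}} returns the effective {{spark.app.name}}):

{code}
$ bin/pyspark --conf spark.app.name=MyApp
>>> sc.appName
u'PySparkShell'
{code}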

SPARK-8650 and SPARK-9180 fixed this issue for spark-sql and spark-shell respectively, but pyspark has not been fixed yet. sparkR is different because {{SparkContext}} is not automatically constructed in sparkR, so the app name can be set when initializing {{SparkContext}}.

In summary:
||shell||supports {{--conf spark.app.name}}||
|pyspark|no|
|spark-shell|yes|
|spark-sql|yes|
|sparkR|n/a| 
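
A possible fix is to stop hardcoding the name in the pyspark shell startup and to treat "PySparkShell" as a fallback only. A minimal sketch under that assumption (not necessarily the actual patch; {{SparkConf.setIfMissing}} is existing PySpark API):

{code:python}
from pyspark import SparkConf, SparkContext

# SparkConf() loads properties passed via --conf and spark-defaults.conf.
conf = SparkConf()
# Keep "PySparkShell" only as a fallback when spark.app.name is unset.
conf.setIfMissing("spark.app.name", "PySparkShell")
sc = SparkContext(conf=conf)
{code}

With such a change, {{--conf spark.app.name=MyApp}} would take effect, while a plain {{bin/pyspark}} invocation keeps the traditional name.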

  was:
Currently, the app name is hardcoded in spark-shell and pyspark as "SparkShell" and "PySparkShell" respectively, and the {{spark.app.name}} property is not honored.

But being able to set the app name is quite handy for various cluster operations, e.g. filtering jobs whose app name is "X" on the YARN RM page.

SPARK-8650 fixed this issue for spark-sql, but not for spark-shell and pyspark. sparkR is different because {{SparkContext}} is not automatically constructed in sparkR, so the app name can be set when initializing {{SparkContext}}.

In summary:
||shell||supports {{--conf spark.app.name}}||
|spark-shell|no|
|pyspark|no|
|spark-sql|yes|
|sparkR|n/a| 

    Component/s:     (was: Spark Shell)
        Summary: spark.app.name is not honored by pyspark  (was: spark.app.name is not honored by spark-shell and pyspark)

> spark.app.name is not honored by pyspark
> ----------------------------------------
>
>                 Key: SPARK-9270
>                 URL: https://issues.apache.org/jira/browse/SPARK-9270
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.4.1, 1.5.0
>            Reporter: Cheolsoo Park
>            Priority: Minor
>
> Currently, the app name is hardcoded in pyspark as "PySparkShell", and the {{spark.app.name}} property is not honored.
> SPARK-8650 and SPARK-9180 fixed this issue for spark-sql and spark-shell respectively, but pyspark has not been fixed yet. sparkR is different because {{SparkContext}} is not automatically constructed in sparkR, so the app name can be set when initializing {{SparkContext}}.
> In summary:
> ||shell||supports {{--conf spark.app.name}}||
> |pyspark|no|
> |spark-shell|yes|
> |spark-sql|yes|
> |sparkR|n/a| 


