Posted to issues@spark.apache.org by "Cheolsoo Park (JIRA)" <ji...@apache.org> on 2015/07/23 07:28:05 UTC

[jira] [Created] (SPARK-9270) spark.app.name is not honored by spark-shell and pyspark

Cheolsoo Park created SPARK-9270:
------------------------------------

             Summary: spark.app.name is not honored by spark-shell and pyspark
                 Key: SPARK-9270
                 URL: https://issues.apache.org/jira/browse/SPARK-9270
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Spark Shell
    Affects Versions: 1.4.1, 1.5.0
            Reporter: Cheolsoo Park
            Priority: Minor


Currently, the app name is hardcoded in spark-shell and pyspark as "SparkShell" and "PySparkShell" respectively, and the {{spark.app.name}} property is not honored.
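
For illustration, the shells build their {{SparkConf}} roughly along these lines (a simplified sketch, not the exact shell source): the unconditional {{setAppName}} call overwrites whatever the user supplied via {{--conf spark.app.name}}.

{code}
import org.apache.spark.{SparkConf, SparkContext}

// Simplified sketch (not the exact shell source). SparkConf() loads any
// --conf values from system properties, but the unconditional setAppName
// that follows overwrites a user-supplied spark.app.name.
val conf = new SparkConf()      // picks up --conf spark.app.name, if set
  .setAppName("SparkShell")     // hardcoded; the user's value is lost
val sc = new SparkContext(conf)
{code}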

But being able to set the app name is quite handy for various cluster operations, e.g. filtering jobs whose app name is "X" on the YARN RM page.

SPARK-8650 fixed this issue for spark-sql, but not for spark-shell and pyspark. sparkR is different because {{SparkContext}} is not automatically constructed in sparkR; the app name can be set when initializing {{SparkContext}} (e.g. via the {{appName}} argument of {{sparkR.init}}).

In summary:
||shell||support --conf spark.app.name||
|spark-shell|no|
|pyspark|no|
|spark-sql|yes|
|sparkR|n/a| 
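
A minimal sketch of the kind of change that would fix this for the shells (an assumed direction, not an actual patch): fall back to the hardcoded name only when the user has not set the property.

{code}
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical fix sketch: keep a user-supplied spark.app.name and only
// fall back to the hardcoded default when the property is absent.
val conf = new SparkConf()
if (!conf.contains("spark.app.name")) {
  conf.setAppName("SparkShell")
}
val sc = new SparkContext(conf)
{code}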


