Posted to issues@spark.apache.org by "Cheolsoo Park (JIRA)" <ji...@apache.org> on 2015/07/23 09:43:06 UTC

[jira] [Updated] (SPARK-9270) Allow --name option in pyspark

     [ https://issues.apache.org/jira/browse/SPARK-9270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cheolsoo Park updated SPARK-9270:
---------------------------------
    Description: 
Currently, the app name is hardcoded in pyspark as "PySparkShell" and cannot be set by the user.

SPARK-8650 fixed this issue for spark-sql, but not for pyspark.

SPARK-9180 introduced a new {{--name}} option for spark-shell, but the {{spark.app.name}} property still isn't honored there.

sparkR is different: {{SparkContext}} is not constructed automatically there, so the app name can be set when initializing {{SparkContext}}.
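
For illustration, here is a minimal sketch of how the pyspark shell startup could fall back to the hardcoded name only when the user supplies none. This paraphrases the idea rather than quoting the actual {{python/pyspark/shell.py}}; {{SparkConf.contains}}, {{setAppName}}, and {{SparkContext(conf=...)}} are real pyspark APIs, but the surrounding logic is an assumption:

{code:python}
# Sketch only, not the actual python/pyspark/shell.py. Assumes spark-submit
# has already copied --name (or --conf spark.app.name) into the SparkConf.
from pyspark import SparkConf, SparkContext

conf = SparkConf()
if not conf.contains("spark.app.name"):
    # keep today's hardcoded default only when the user supplied no name
    conf.setAppName("PySparkShell")
sc = SparkContext(conf=conf)
{code}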

In summary:
||shell||able to set app name||
|pyspark|no|
|spark-shell|yes via --name|
|spark-sql|yes via --conf spark.app.name|
|sparkR|n/a| 
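
Until that happens, a possible stopgap inside the pyspark shell is to stop the auto-created context and rebuild it with the desired name. This is an assumed workaround, not something from the issue itself, and {{my-app}} is a placeholder:

{code:python}
# Hypothetical workaround typed at the pyspark shell prompt, where `sc`
# is the auto-created context named "PySparkShell".
from pyspark import SparkConf, SparkContext

sc.stop()  # discard the hardcoded context
sc = SparkContext(conf=SparkConf().setAppName("my-app"))  # placeholder name
{code}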

  was:
Currently, the app name is hardcoded in pyspark as "PySparkShell", and the {{spark.app.name}} property is not honored.

SPARK-8650 fixed this issue for spark-sql, but not for pyspark.

SPARK-9180 introduced a new {{--name}} option for spark-shell, but the {{spark.app.name}} property still isn't honored there.

sparkR is different: {{SparkContext}} is not constructed automatically there, so the app name can be set when initializing {{SparkContext}}.

In summary:
||shell||supports --conf spark.app.name||
|pyspark|no|
|spark-shell|no, but --name has the same result|
|spark-sql|yes|
|sparkR|n/a| 

        Summary: Allow --name option in pyspark  (was: spark.app.name is not honored by pyspark)

> Allow --name option in pyspark
> ------------------------------
>
>                 Key: SPARK-9270
>                 URL: https://issues.apache.org/jira/browse/SPARK-9270
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.4.1, 1.5.0
>            Reporter: Cheolsoo Park
>            Priority: Minor
>
> Currently, the app name is hardcoded in pyspark as "PySparkShell" and cannot be set by the user.
> SPARK-8650 fixed this issue for spark-sql, but not for pyspark.
> SPARK-9180 introduced a new {{--name}} option for spark-shell, but the {{spark.app.name}} property still isn't honored there.
> sparkR is different: {{SparkContext}} is not constructed automatically there, so the app name can be set when initializing {{SparkContext}}.
> In summary:
> ||shell||able to set app name||
> |pyspark|no|
> |spark-shell|yes via --name|
> |spark-sql|yes via --conf spark.app.name|
> |sparkR|n/a| 



