Posted to issues@spark.apache.org by "Felix Cheung (JIRA)" <ji...@apache.org> on 2016/11/17 02:05:58 UTC
[jira] [Assigned] (SPARK-18449) Name option is being ignored when submitting an R application via spark-submit
[ https://issues.apache.org/jira/browse/SPARK-18449?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Felix Cheung reassigned SPARK-18449:
------------------------------------
Assignee: Felix Cheung
> Name option is being ignored when submitting an R application via spark-submit
> ------------------------------------------------------------------------------
>
> Key: SPARK-18449
> URL: https://issues.apache.org/jira/browse/SPARK-18449
> Project: Spark
> Issue Type: Bug
> Components: Spark Submit, SparkR
> Affects Versions: 2.0.0, 2.0.1, 2.0.2
> Reporter: Alexander Eckert
> Assignee: Felix Cheung
> Priority: Minor
>
> The value of the _--name_ parameter is ignored when submitting an R script via _spark-submit_ to a standalone Spark cluster.
> This happens both when the R script starts a _sparkR.session()_ and when no sparkR.session() is used at all. I would expect the value of the _--name_ parameter to be used when no appName is specified in the sparkR.session() call, or when sparkR.session() is never called. When an appName is specified, I would expect the same behaviour as in Scala or Python applications. Currently, {{SparkR}} is displayed as the application name when no appName is specified.
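> The two cases can be sketched as follows (a minimal illustration, assuming SparkR is on the library path; {{appName}} is the documented argument of {{sparkR.session()}}):
> {code:r}
> library(SparkR)
>
> # Case 1: no appName given -- the --name value passed to
> # spark-submit should be used, but "SparkR" is shown instead.
> sparkR.session()
>
> # Case 2: explicit appName -- expected to take effect, matching
> # the behaviour of Scala/Python applications.
> sparkR.session(appName = "SparkR-DataFrame-example")
> {code}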
> Example:
> 1. Edit _examples/src/main/r/dataframe.R_
> 2. Replace _sparkR.session(appName = "SparkR-DataFrame-example")_ with _sparkR.session()_
> 3. Submit dataframe.R
> {code:none}
> bin/spark-submit --master spark://ubuntu:7077 --name MyApp examples/src/main/r/dataframe.R
> {code}
> 4. Compare application names with names in the Spark WebUI.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org