Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:22:04 UTC
[jira] [Updated] (SPARK-18449) Name option is being ignored when submitting an R application via spark-submit
[ https://issues.apache.org/jira/browse/SPARK-18449?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-18449:
---------------------------------
Labels: bulk-closed (was: )
> Name option is being ignored when submitting an R application via spark-submit
> ------------------------------------------------------------------------------
>
> Key: SPARK-18449
> URL: https://issues.apache.org/jira/browse/SPARK-18449
> Project: Spark
> Issue Type: Bug
> Components: Spark Submit, SparkR
> Affects Versions: 2.0.0, 2.0.1, 2.0.2
> Reporter: Alexander Eckert
> Assignee: Felix Cheung
> Priority: Minor
> Labels: bulk-closed
>
> The value of the _--name_ parameter is ignored when submitting an R script via _spark-submit_ to a standalone Spark cluster.
> This happens both when the R script starts a _sparkR.session()_ and when no sparkR.session() is used. I would expect the value of the --name parameter to be used when no appName was specified in the sparkR.session() call, or when sparkR.session() has not been called at all. When an appName was specified, I would expect the same behaviour as in Scala or Python applications. Currently {{SparkR}} is displayed when no appName is specified.
> Example:
> 1. Edit _examples/src/main/r/dataframe.R_
> 2. Replace _sparkR.session(appName = "SparkR-DataFrame-example")_ with _sparkR.session()_
> 3. Submit dataframe.R
> {code:none}
> bin/spark-submit --master spark://ubuntu:7077 --name MyApp examples/src/main/r/dataframe.R
> {code}
> 4. Compare application names with names in the Spark WebUI.
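The precedence the reporter expects can be sketched as follows. This is an illustrative model only, not Spark's actual implementation; the function name `resolve_app_name` and the default value `"SparkR"` (the name observed in the WebUI when no appName is set) are assumptions for the sketch.

```python
# Hypothetical sketch of the expected app-name precedence:
# an appName set in sparkR.session() wins, otherwise the --name
# value passed to spark-submit, otherwise a default ("SparkR").

def resolve_app_name(session_app_name=None, submit_name=None, default="SparkR"):
    """Return the application name under the expected precedence rules."""
    if session_app_name:   # appName given in sparkR.session(appName = ...)
        return session_app_name
    if submit_name:        # --name passed to spark-submit
        return submit_name
    return default         # neither was given: fall back to the default

# With the bug described above, the --name value is dropped, so a
# submission like the example would show "SparkR" instead of "MyApp".
```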
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org