Posted to issues@spark.apache.org by "Shivaram Venkataraman (JIRA)" <ji...@apache.org> on 2016/07/15 21:23:20 UTC
[jira] [Commented] (SPARK-16579) Add a spark install function
[ https://issues.apache.org/jira/browse/SPARK-16579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15380138#comment-15380138 ]
Shivaram Venkataraman commented on SPARK-16579:
-----------------------------------------------
cc [~junyangq]
> Add a spark install function
> ----------------------------
>
> Key: SPARK-16579
> URL: https://issues.apache.org/jira/browse/SPARK-16579
> Project: Spark
> Issue Type: Sub-task
> Components: SparkR
> Reporter: Shivaram Venkataraman
>
> As described in the design doc, we need to introduce a function to install Spark in case the user directly downloads SparkR from CRAN.
> To do that, we can introduce an install_spark function that takes the following arguments
> {code}
> hadoop_version
> url_to_use # defaults to apache
> local_dir # defaults to a cache dir
> {code}
> Furthermore, I think we can automatically run this from sparkR.init if we find that the Spark home and the JARs are missing.
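A minimal sketch of what such a helper could look like in R is below. The argument defaults, the Spark version, the mirror URL, and the cache layout are illustrative assumptions, not the final API:

{code}
# Hypothetical sketch of install_spark; names and defaults are assumptions.
install_spark <- function(hadoop_version = "2.7",
                          url_to_use = "https://archive.apache.org/dist/spark",
                          local_dir = file.path(path.expand("~"), ".cache", "spark")) {
  version <- "2.0.0"  # assumed Spark version; in practice derived from the SparkR package
  package_name <- paste0("spark-", version, "-bin-hadoop", hadoop_version)
  target_dir <- file.path(local_dir, package_name)

  # Skip the download if a previous installation is already cached locally.
  if (dir.exists(target_dir)) {
    message("Spark already installed in ", target_dir)
    return(invisible(target_dir))
  }

  dir.create(local_dir, recursive = TRUE, showWarnings = FALSE)
  tarball <- file.path(local_dir, paste0(package_name, ".tgz"))
  remote <- paste(url_to_use, paste0("spark-", version),
                  paste0(package_name, ".tgz"), sep = "/")

  # Fetch and unpack the binary distribution into the cache directory.
  download.file(remote, destfile = tarball)
  untar(tarball, exdir = local_dir)
  invisible(target_dir)
}
{code}

sparkR.init could then invoke this helper when it detects that the Spark home is unset and no JARs are present, for example guarded by a check on Sys.getenv("SPARK_HOME"); again, this is only a sketch of the idea, not the committed design.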
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org