Posted to issues@spark.apache.org by "Shivaram Venkataraman (JIRA)" <ji...@apache.org> on 2016/08/10 18:24:20 UTC

[jira] [Resolved] (SPARK-16579) Add a spark install function

     [ https://issues.apache.org/jira/browse/SPARK-16579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shivaram Venkataraman resolved SPARK-16579.
-------------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.1.0
                   2.0.1

Issue resolved by pull request 14258
[https://github.com/apache/spark/pull/14258]

> Add a spark install function
> ----------------------------
>
>                 Key: SPARK-16579
>                 URL: https://issues.apache.org/jira/browse/SPARK-16579
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SparkR
>            Reporter: Shivaram Venkataraman
>            Assignee: Junyang Qian
>             Fix For: 2.0.1, 2.1.0
>
>
> As described in the design doc, we need to introduce a function to install Spark for the case where the user downloads SparkR directly from CRAN.
> To do that, we can introduce an install_spark function that takes the following arguments:
> {code}
> hadoop_version # e.g. "2.7"
> url_to_use # defaults to apache
> local_dir # defaults to a cache dir
> {code} 
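> A minimal sketch of what this could look like (the argument defaults, the pinned version, and the download URL below are illustrative, not a final API):
> {code}
> # Hypothetical sketch; argument names, defaults and the version are illustrative only.
> install_spark <- function(hadoop_version = "2.7",
>                           url_to_use = "https://archive.apache.org/dist/spark",
>                           local_dir = file.path(path.expand("~"), ".cache", "spark")) {
>   version <- "2.0.1"
>   package_name <- paste0("spark-", version, "-bin-hadoop", hadoop_version)
>   package_url <- paste0(url_to_use, "/spark-", version, "/", package_name, ".tgz")
>   package_dir <- file.path(local_dir, package_name)
>   # Reuse a previously unpacked distribution from the cache dir if present
>   if (!dir.exists(package_dir)) {
>     dir.create(local_dir, recursive = TRUE, showWarnings = FALSE)
>     tarball <- file.path(local_dir, paste0(package_name, ".tgz"))
>     download.file(package_url, destfile = tarball)
>     untar(tarball, exdir = local_dir)
>   }
>   # Return the install location so callers can set SPARK_HOME
>   package_dir
> }
> {code}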
> Furthermore, I think we can automatically run this from sparkR.init if we find that the Spark home and the JARs are missing.
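> For example, the check in sparkR.init could look roughly like this (again just a sketch, not the final implementation):
> {code}
> # Hypothetical: run near the top of sparkR.init, before launching the backend
> spark_home <- Sys.getenv("SPARK_HOME")
> if (spark_home == "" || !dir.exists(file.path(spark_home, "jars"))) {
>   # No usable Spark installation found, so install one into the cache dir
>   spark_home <- install_spark()
>   Sys.setenv(SPARK_HOME = spark_home)
> }
> {code}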



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org