Posted to issues@spark.apache.org by "Vincent Warmerdam (JIRA)" <ji...@apache.org> on 2015/07/02 11:10:04 UTC

[jira] [Comment Edited] (SPARK-8684) Update R version in Spark EC2 AMI

    [ https://issues.apache.org/jira/browse/SPARK-8684?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14611680#comment-14611680 ] 

Vincent Warmerdam edited comment on SPARK-8684 at 7/2/15 9:09 AM:
------------------------------------------------------------------

Mhm... I've tried multiple approaches. A colleague of mine even had a look at it, and it left him without a clue as well.

I've posted a Stack Overflow question asking for advice:

http://stackoverflow.com/questions/31180061/r-3-2-on-aws-ami

I get the impression that the Amazon AMI forces you to use the Amazon repos whenever the package you need is also available in the Amazon package system... and those repos only carry the old versions.
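
For what it's worth, the workaround I've been poking at looks roughly like this. It's only a sketch: the repo names (amzn-main, amzn-updates, epel) are what I see on the stock Amazon Linux AMI and may differ on other images, and whether EPEL's R build is actually 3.2 is exactly what I'm still trying to verify.

    # Check which repos yum consults and which one wins for R.
    sudo yum repolist

    # Try to install R from EPEL while skipping the Amazon repos,
    # which appear to pin R at the older 3.1 build.
    sudo yum --disablerepo=amzn-main,amzn-updates \
             --enablerepo=epel install R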


> Update R version in Spark EC2 AMI
> ---------------------------------
>
>                 Key: SPARK-8684
>                 URL: https://issues.apache.org/jira/browse/SPARK-8684
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2, SparkR
>            Reporter: Shivaram Venkataraman
>            Priority: Minor
>
> Right now the R version in the AMI is 3.1. However, a number of R libraries need R version 3.2, so it would be good to update the R version on the AMI when launching an EC2 cluster.


