Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/11/01 07:09:59 UTC

[jira] [Closed] (SPARK-18190) Fix R version to not the latest in AppVeyor

     [ https://issues.apache.org/jira/browse/SPARK-18190?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon closed SPARK-18190.
--------------------------------
    Resolution: Not A Problem

This is no longer a problem: R 3.3.2 for Windows has now been released, and the download seems to work fine again.

> Fix R version to not the latest in AppVeyor
> -------------------------------------------
>
>                 Key: SPARK-18190
>                 URL: https://issues.apache.org/jira/browse/SPARK-18190
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build, SparkR
>            Reporter: Hyukjin Kwon
>
> Currently, Spark runs its tests on Windows via AppVeyor, but it now seems to be failing to download R 3.3.1 after the release of R 3.3.2.
> The install step first checks via http://rversions.r-pkg.org/r-release whether the requested R version is the latest, because the download URL differs for the latest release and for older releases (a sketch of this logic follows the quoted description below).
> For example, the latest one has the URL as below:
> https://cran.r-project.org/bin/windows/base/R-3.3.1-win.exe
> and an old one has the URL as below:
> https://cran.r-project.org/bin/windows/base/old/3.3.0/R-3.3.0-win.exe
> The problem is that the Windows binaries of R are not always in sync with the latest release.
> Please check https://cloud.r-project.org
> So, once 3.3.2 is announced as the latest release, AppVeyor tries to download https://cran.r-project.org/bin/windows/base/old/3.3.1/R-3.3.1-win.exe (the URL pattern for old versions), but that file does not exist yet, because R 3.3.2 for Windows has not been published and 3.3.1 has therefore not been moved into the old/ directory.
> It seems safer to pin an older R version, as SparkR supports R 3.1+ if I remember correctly.
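
For reference, below is a minimal Python sketch of the URL-selection logic described in the quoted report. It is not the actual AppVeyor install script, and the exact JSON shape returned by http://rversions.r-pkg.org/r-release is an assumption here.

    # Minimal sketch of the R installer URL selection described above.
    # Assumption: rversions.r-pkg.org/r-release returns JSON like
    # {"version": "3.3.2", "date": "..."} -- not verified against the
    # real service here.
    import json
    import urllib.request

    RVERSIONS_URL = "http://rversions.r-pkg.org/r-release"
    CRAN_BASE = "https://cran.r-project.org/bin/windows/base"

    def latest_r_release() -> str:
        # Ask the rversions service which R release is currently the latest.
        with urllib.request.urlopen(RVERSIONS_URL) as resp:
            return json.load(resp)["version"]

    def windows_installer_url(version: str) -> str:
        # The latest release lives directly under bin/windows/base/;
        # older releases are moved to bin/windows/base/old/<version>/.
        if version == latest_r_release():
            return f"{CRAN_BASE}/R-{version}-win.exe"
        return f"{CRAN_BASE}/old/{version}/R-{version}-win.exe"

    # Once 3.3.2 is announced as the latest release, a request for 3.3.1
    # resolves to the old/ URL, which 404s until CRAN actually moves the
    # 3.3.1 Windows binary there -- exactly the failure reported above.
    print(windows_installer_url("3.3.1"))

This also illustrates why pinning an older version helps: a version that is already archived has a stable old/ URL that does not change when a new release is announced.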



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org