Posted to commits@spark.apache.org by gu...@apache.org on 2020/06/24 02:16:49 UTC

[spark] branch branch-2.4 updated: [SPARK-32073][R] Drop R < 3.5 support

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
     new 77006b2  [SPARK-32073][R] Drop R < 3.5 support
77006b2 is described below

commit 77006b2c65e0e4b6b9facddbb13aa88a264adbe2
Author: HyukjinKwon <gu...@apache.org>
AuthorDate: Wed Jun 24 11:05:27 2020 +0900

    [SPARK-32073][R] Drop R < 3.5 support
    
    Spark 3.0 accidentally dropped support for R < 3.5. Its SparkR package is built with R 3.6.3, and the package data written by R 3.6.3 cannot be read by R < 3.5:
    
    ```
    Error in readRDS(pfile) : cannot read workspace version 3 written by R 3.6.3; need R 3.5.0 or newer version.
    ```
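    
    As a minimal sketch (not part of this patch), the new minimum can be checked up front with base R's `getRversion()`; the "SparkR" wording in the message below is only illustrative:
    
    ```
    # Stop early on R older than 3.5.0, the new minimum required by SparkR.
    if (getRversion() < "3.5.0") {
      stop("SparkR requires R (>= 3.5); found R ", getRversion())
    }
    ```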
    
    In fact, with SPARK-31918, we will have to drop R < 3.5 entirely to support R 4.0.0. This is unavoidable for releasing on CRAN, because CRAN requires the tests to pass with the latest R.
    
    These changes are needed to show the supported versions correctly and to support R 4.0.0, which unblocks the releases.
    
    Compared to Spark 3.0.0, this introduces no user-facing change, because Spark 3.0.0 already does not work with R < 3.5.
    Compared to Spark 2.4, yes: R < 3.5 would no longer work.
    
    Jenkins should test it out.
    
    Closes #28908 from HyukjinKwon/SPARK-32073.
    
    Authored-by: HyukjinKwon <gu...@apache.org>
    Signed-off-by: HyukjinKwon <gu...@apache.org>
    (cherry picked from commit b62e2536db9def0d11605ceac8990f72a515e9a0)
    Signed-off-by: HyukjinKwon <gu...@apache.org>
---
 R/WINDOWS.md      | 2 +-
 R/pkg/DESCRIPTION | 2 +-
 docs/index.md     | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/R/WINDOWS.md b/R/WINDOWS.md
index da668a6..73f8a21 100644
--- a/R/WINDOWS.md
+++ b/R/WINDOWS.md
@@ -2,7 +2,7 @@
 
 To build SparkR on Windows, the following steps are required
 
-1. Install R (>= 3.1) and [Rtools](http://cran.r-project.org/bin/windows/Rtools/). Make sure to
+1. Install R (>= 3.5) and [Rtools](http://cran.r-project.org/bin/windows/Rtools/). Make sure to
 include Rtools and R in `PATH`.
 
 2. Install
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index b70014d..2940d04 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -15,7 +15,7 @@ URL: https://www.apache.org/ https://spark.apache.org/
 BugReports: https://spark.apache.org/contributing.html
 SystemRequirements: Java (== 8)
 Depends:
-    R (>= 3.0),
+    R (>= 3.5),
     methods
 Suggests:
     knitr,
diff --git a/docs/index.md b/docs/index.md
index 52f1a5a..73cb57a 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -31,7 +31,7 @@ Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS). It's easy
 locally on one machine --- all you need is to have `java` installed on your system `PATH`,
 or the `JAVA_HOME` environment variable pointing to a Java installation.
 
-Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark {{site.SPARK_VERSION}}
+Spark runs on Java 8, Python 2.7+/3.4+ and R 3.5+. For the Scala API, Spark {{site.SPARK_VERSION}}
 uses Scala {{site.SCALA_BINARY_VERSION}}. You will need to use a compatible Scala version
 ({{site.SCALA_BINARY_VERSION}}.x).
 


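A quick way to confirm the bump (assuming a locally installed SparkR built from this branch) is to read the Depends field back from the installed package metadata:

```
# Print the declared R dependency picked up from the updated DESCRIPTION.
# After this change it should report: R (>= 3.5)
utils::packageDescription("SparkR", fields = "Depends")
```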