Posted to commits@spark.apache.org by gu...@apache.org on 2020/06/24 02:10:35 UTC
[spark] branch branch-3.0 updated: [SPARK-32073][R] Drop R < 3.5 support
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.0 by this push:
new da8133f [SPARK-32073][R] Drop R < 3.5 support
da8133f is described below
commit da8133f8f1df7d2ddfa995974d9a7db06ff4cd5a
Author: HyukjinKwon <gu...@apache.org>
AuthorDate: Wed Jun 24 11:05:27 2020 +0900
[SPARK-32073][R] Drop R < 3.5 support
### What changes were proposed in this pull request?
Spark 3.0 accidentally dropped support for R < 3.5. The package is built with R 3.6.3, which produces artifacts that R < 3.5 cannot read:
```
Error in readRDS(pfile) : cannot read workspace version 3 written by R 3.6.3; need R 3.5.0 or newer version.
```
In fact, with SPARK-31918, we will have to drop R < 3.5 entirely to support R 4.0.0. This is unavoidable for a CRAN release because CRAN requires the tests to pass with the latest R.
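The runtime guard that this patch removes from the profiles illustrates how such a version floor can be enforced in R. A minimal sketch (note: it raises an error at the 3.5.0 floor, whereas the old profiles only emitted a deprecation warning at 3.4.0):

```r
# Minimal sketch: refuse to run on R older than 3.5.0.
current <- paste0(R.version$major, ".", R.version$minor)
if (utils::compareVersion(current, "3.5.0") == -1) {
  # compareVersion returns -1 when the first version is older than the second
  stop("SparkR requires R 3.5.0 or newer; found ", current)
}
```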
### Why are the changes needed?
To show the supported versions correctly, and to support R 4.0.0 so that releases are unblocked.
### Does this PR introduce _any_ user-facing change?
Effectively no, because Spark 3.0.0 already does not work with R < 3.5.
Compared to Spark 2.4, yes: R < 3.5 no longer works.
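For users who want to check whether their installed R meets the new floor, here is a minimal shell sketch (assumes GNU `sort -V` is available; `version_ge` is a hypothetical helper name, not part of Spark):

```shell
# version_ge A B: succeeds if dotted version A >= B, using sort -V for comparison
version_ge() { [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]; }

# Example: R 3.6.3 satisfies the new R >= 3.5 requirement
if version_ge 3.6.3 3.5.0; then
  echo "R version is sufficient"
fi
```

In practice the version string can be obtained from `R --version` or `Rscript -e 'cat(R.version.string)'`.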
### How was this patch tested?
Jenkins should test it out.
Closes #28908 from HyukjinKwon/SPARK-32073.
Authored-by: HyukjinKwon <gu...@apache.org>
Signed-off-by: HyukjinKwon <gu...@apache.org>
(cherry picked from commit b62e2536db9def0d11605ceac8990f72a515e9a0)
Signed-off-by: HyukjinKwon <gu...@apache.org>
---
R/WINDOWS.md | 4 ++--
R/pkg/DESCRIPTION | 2 +-
R/pkg/inst/profile/general.R | 4 ----
R/pkg/inst/profile/shell.R | 4 ----
docs/index.md | 3 +--
5 files changed, 4 insertions(+), 13 deletions(-)
diff --git a/R/WINDOWS.md b/R/WINDOWS.md
index dbc2717..9fe4a22b 100644
--- a/R/WINDOWS.md
+++ b/R/WINDOWS.md
@@ -22,8 +22,8 @@ To build SparkR on Windows, the following steps are required
1. Make sure `bash` is available and in `PATH` if you already have a built-in `bash` on Windows. If you do not have, install [Cygwin](https://www.cygwin.com/).
-2. Install R (>= 3.1) and [Rtools](https://cloud.r-project.org/bin/windows/Rtools/). Make sure to
-include Rtools and R in `PATH`. Note that support for R prior to version 3.4 is deprecated as of Spark 3.0.0.
+2. Install R (>= 3.5) and [Rtools](https://cloud.r-project.org/bin/windows/Rtools/). Make sure to
+include Rtools and R in `PATH`.
3. Install JDK that SparkR supports (see `R/pkg/DESCRIPTION`), and set `JAVA_HOME` in the system environment variables.
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 21f3eaa..86514f2 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -15,7 +15,7 @@ URL: https://www.apache.org/ https://spark.apache.org/
BugReports: https://spark.apache.org/contributing.html
SystemRequirements: Java (>= 8, < 12)
Depends:
- R (>= 3.1),
+ R (>= 3.5),
methods
Suggests:
knitr,
diff --git a/R/pkg/inst/profile/general.R b/R/pkg/inst/profile/general.R
index 3efb460..8c75c19 100644
--- a/R/pkg/inst/profile/general.R
+++ b/R/pkg/inst/profile/general.R
@@ -16,10 +16,6 @@
#
.First <- function() {
- if (utils::compareVersion(paste0(R.version$major, ".", R.version$minor), "3.4.0") == -1) {
- warning("Support for R prior to version 3.4 is deprecated since Spark 3.0.0")
- }
-
packageDir <- Sys.getenv("SPARKR_PACKAGE_DIR")
dirs <- strsplit(packageDir, ",")[[1]]
.libPaths(c(dirs, .libPaths()))
diff --git a/R/pkg/inst/profile/shell.R b/R/pkg/inst/profile/shell.R
index e4e0d03..f6c20e1 100644
--- a/R/pkg/inst/profile/shell.R
+++ b/R/pkg/inst/profile/shell.R
@@ -16,10 +16,6 @@
#
.First <- function() {
- if (utils::compareVersion(paste0(R.version$major, ".", R.version$minor), "3.4.0") == -1) {
- warning("Support for R prior to version 3.4 is deprecated since Spark 3.0.0")
- }
-
home <- Sys.getenv("SPARK_HOME")
.libPaths(c(file.path(home, "R", "lib"), .libPaths()))
Sys.setenv(NOAWT = 1)
diff --git a/docs/index.md b/docs/index.md
index 38f12dd4..c0771ca 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -44,10 +44,9 @@ source, visit [Building Spark](building-spark.html).
Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS), and it should run on any platform that runs a supported version of Java. This should include JVMs on x86_64 and ARM64. It's easy to run locally on one machine --- all you need is to have `java` installed on your system `PATH`, or the `JAVA_HOME` environment variable pointing to a Java installation.
-Spark runs on Java 8/11, Scala 2.12, Python 2.7+/3.4+ and R 3.1+.
+Spark runs on Java 8/11, Scala 2.12, Python 2.7+/3.4+ and R 3.5+.
Java 8 prior to version 8u92 support is deprecated as of Spark 3.0.0.
Python 2 and Python 3 prior to version 3.6 support is deprecated as of Spark 3.0.0.
-R prior to version 3.4 support is deprecated as of Spark 3.0.0.
For the Scala API, Spark {{site.SPARK_VERSION}}
uses Scala {{site.SCALA_BINARY_VERSION}}. You will need to use a compatible Scala version
({{site.SCALA_BINARY_VERSION}}.x).