Posted to commits@spark.apache.org by fe...@apache.org on 2019/01/13 00:41:34 UTC

[spark] branch branch-2.3 updated: [SPARK-26010][R] fix vignette eval with Java 11

This is an automated email from the ASF dual-hosted git repository.

felixcheung pushed a commit to branch branch-2.3
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.3 by this push:
     new a9a1bc7  [SPARK-26010][R] fix vignette eval with Java 11
a9a1bc7 is described below

commit a9a1bc78e619997007853be568a134ce3d1bbb14
Author: Felix Cheung <fe...@hotmail.com>
AuthorDate: Mon Nov 12 19:03:30 2018 -0800

    [SPARK-26010][R] fix vignette eval with Java 11
    
    ## What changes were proposed in this pull request?
    
    Changes in the vignette only, to disable eval when the Java version is unsupported.
    
    ## How was this patch tested?
    
    Jenkins
    
    Author: Felix Cheung <fe...@hotmail.com>
    
    Closes #23007 from felixcheung/rjavavervig.
    
    (cherry picked from commit 88c82627267a9731b2438f0cc28dd656eb3dc834)
    Signed-off-by: Felix Cheung <fe...@apache.org>
---
 R/pkg/vignettes/sparkr-vignettes.Rmd | 14 ++++++++++++++
 1 file changed, 14 insertions(+)

diff --git a/R/pkg/vignettes/sparkr-vignettes.Rmd b/R/pkg/vignettes/sparkr-vignettes.Rmd
index d4713de..70970bd 100644
--- a/R/pkg/vignettes/sparkr-vignettes.Rmd
+++ b/R/pkg/vignettes/sparkr-vignettes.Rmd
@@ -57,6 +57,20 @@ First, let's load and attach the package.
 library(SparkR)
 ```
 
+```{r, include=FALSE}
+# disable eval if java version not supported
+override_eval <- tryCatch(!is.numeric(SparkR:::checkJavaVersion()),
+          error = function(e) { TRUE },
+          warning = function(e) { TRUE })
+
+if (override_eval) {
+  opts_hooks$set(eval = function(options) {
+    options$eval = FALSE
+    options
+  })
+}
+```
+
 `SparkSession` is the entry point into SparkR which connects your R program to a Spark cluster. You can create a `SparkSession` using `sparkR.session` and pass in options such as the application name, any Spark packages depended on, etc.
 
 We use default settings in which it runs in local mode. It auto downloads Spark package in the background if no previous installation is found. For more details about setup, see [Spark Session](#SetupSparkSession).
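
The guard added in the diff above treats any error or warning from `SparkR:::checkJavaVersion()` as "unsupported" and falls back to disabling chunk evaluation. That fail-closed pattern can be sketched generically in Python (the `check_java_version` helper here is a hypothetical stand-in for the SparkR internal, not part of any real API):

```python
import warnings

def check_java_version():
    # Hypothetical stand-in for SparkR:::checkJavaVersion(): returns a
    # numeric major version when Java is supported, raises otherwise.
    raise RuntimeError("unsupported Java version")

def should_disable_eval():
    # Fail closed: any error or warning means "do not evaluate chunks",
    # mirroring the tryCatch(error = ..., warning = ...) handlers above.
    try:
        with warnings.catch_warnings():
            warnings.simplefilter("error")  # promote warnings to errors
            version = check_java_version()
        return not isinstance(version, (int, float))
    except Exception:
        return True

print(should_disable_eval())  # stand-in raises, so eval is disabled
```

The key design choice, as in the R chunk, is that an *inconclusive* check (error or warning) disables evaluation rather than risking a vignette build failure on an unsupported Java runtime.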

