Posted to issues@spark.apache.org by "Felix Cheung (JIRA)" <ji...@apache.org> on 2019/04/06 18:03:00 UTC

[jira] [Updated] (SPARK-24535) Fix java version parsing in SparkR on Windows

     [ https://issues.apache.org/jira/browse/SPARK-24535?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Felix Cheung updated SPARK-24535:
---------------------------------
    Issue Type: Sub-task  (was: Bug)
        Parent: SPARK-15799

> Fix java version parsing in SparkR on Windows
> ---------------------------------------------
>
>                 Key: SPARK-24535
>                 URL: https://issues.apache.org/jira/browse/SPARK-24535
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SparkR
>    Affects Versions: 2.3.1, 2.4.0
>            Reporter: Shivaram Venkataraman
>            Assignee: Felix Cheung
>            Priority: Blocker
>             Fix For: 2.3.2, 2.4.0
>
>
> We see errors on CRAN of the form:
> {code:java}
>   java version "1.8.0_144"
>   Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
>   Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)
>   Picked up _JAVA_OPTIONS: -XX:-UsePerfData 
>   -- 1. Error: create DataFrame from list or data.frame (@test_basic.R#21)  ------
>   subscript out of bounds
>   1: sparkR.session(master = sparkRTestMaster, enableHiveSupport = FALSE, sparkConfig = sparkRTestConfig) at D:/temp/RtmpIJ8Cc3/RLIBS_3242c713c3181/SparkR/tests/testthat/test_basic.R:21
>   2: sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, sparkExecutorEnvMap, 
>          sparkJars, sparkPackages)
>   3: checkJavaVersion()
>   4: strsplit(javaVersionFilter[[1]], "[\"]")
> {code}
> The complete log file is at http://home.apache.org/~shivaram/SparkR_2.3.1_check_results/Windows/00check.log
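> The "subscript out of bounds" is raised at {{javaVersionFilter[[1]]}}, which suggests the filter over the {{java -version}} output matched no lines on that Windows setup. A minimal defensive sketch of such parsing (a hypothetical {{parseJavaVersion}} helper, not the actual SparkR patch) could look like:
> {code:r}
> # Hypothetical helper: extract the quoted version string from the
> # captured output of `java -version`, e.g. 'java version "1.8.0_144"'.
> # A sketch only; not the code in SparkR's checkJavaVersion().
> parseJavaVersion <- function(javaVersionOut) {
>   # Keep only lines mentioning a version; this skips noise such as
>   # 'Picked up _JAVA_OPTIONS: ...' that some environments prepend.
>   versionLines <- Filter(function(x) grepl(" version", x, fixed = TRUE),
>                          javaVersionOut)
>   if (length(versionLines) == 0) {
>     stop("could not find a version line in java -version output")
>   }
>   parts <- strsplit(versionLines[[1]], "\"")[[1]]
>   if (length(parts) < 2) {
>     stop("unexpected java -version format: ", versionLines[[1]])
>   }
>   parts[2]  # e.g. "1.8.0_144"
> }
> {code}
> Failing with an explicit message instead of indexing a possibly empty list would at least make the CRAN check output diagnosable.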



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org