Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/12/15 00:09:58 UTC

[jira] [Assigned] (SPARK-18867) Throw cause if IsolatedClientLoader can't create client

     [ https://issues.apache.org/jira/browse/SPARK-18867?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18867:
------------------------------------

    Assignee: Apache Spark

> Throw cause if IsolatedClientLoader can't create client
> -------------------------------------------------------
>
>                 Key: SPARK-18867
>                 URL: https://issues.apache.org/jira/browse/SPARK-18867
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.6.0, 2.0.0
>         Environment: RStudio 1.0.44 + SparkR (Spark 2.0.2)
>            Reporter: Wei-Chiu Chuang
>            Assignee: Apache Spark
>            Priority: Minor
>
> If IsolatedClientLoader can't instantiate a class object, it throws {{InvocationTargetException}}. The reflection wrapper is of no use to the caller; instead, it should throw the cause of the {{InvocationTargetException}}, so that the caller can handle the underlying failure.
> This exception is reproducible by running the following code snippet in two RStudio consoles without cleaning sessions. (This is, strictly speaking, an RStudio issue, but the same problem may be exhibited in other ways.)
> {code}
> Sys.setenv(SPARK_HOME="/Users/weichiu/Downloads/spark-2.0.2-bin-hadoop2.7")
> library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))
> sparkR.session(master = "local[*]", sparkConfig = list(spark.driver.memory = "2g"))
> df <- as.DataFrame(faithful)
> sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
> {code}
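The proposed change can be sketched in plain Java: catch the reflective {{InvocationTargetException}} when instantiating the client and re-throw its cause. The class and method names below are illustrative stand-ins, not Spark's actual IsolatedClientLoader code.

```java
import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;

public class UnwrapExample {
    // Hypothetical stand-in for a client class whose constructor fails,
    // analogous to IsolatedClientLoader instantiating a Hive client.
    static class FailingClient {
        FailingClient() {
            throw new IllegalStateException("underlying cause");
        }
    }

    // Instantiate via reflection, but surface the real failure
    // instead of the InvocationTargetException wrapper.
    static Object createClient(Class<?> cls) throws Throwable {
        Constructor<?> ctor = cls.getDeclaredConstructor();
        try {
            return ctor.newInstance();
        } catch (InvocationTargetException e) {
            // Re-throw the original exception so callers can
            // match on its type and handle it directly.
            throw e.getCause();
        }
    }

    public static void main(String[] args) {
        try {
            createClient(FailingClient.class);
        } catch (Throwable t) {
            // Prints "IllegalStateException: underlying cause"
            System.out.println(t.getClass().getSimpleName() + ": " + t.getMessage());
        }
    }
}
```

With this pattern the caller sees {{IllegalStateException}} (the real cause) rather than the opaque reflection wrapper.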



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org