Posted to issues@spark.apache.org by "Felix Cheung (JIRA)" <ji...@apache.org> on 2015/10/20 02:52:27 UTC

[jira] [Created] (SPARK-11199) Improve R context management story and add getOrCreate

Felix Cheung created SPARK-11199:
------------------------------------

             Summary: Improve R context management story and add getOrCreate
                 Key: SPARK-11199
                 URL: https://issues.apache.org/jira/browse/SPARK-11199
             Project: Spark
          Issue Type: Sub-task
          Components: R
    Affects Versions: 1.5.1
            Reporter: Felix Cheung
            Priority: Minor


Similar to SPARK-11114

Also from discussion in SPARK-10903:
"
Hossein Falaki added a comment - 08/Oct/15 13:06
+1 We have seen a lot of questions from new SparkR users about the life cycle of the context. 
My question is: are we going to remove or deprecate sparkRSQL.init()? I suggest we should, because right now calling that method creates a new Java SQLContext object, and having two of them prevents users from viewing temp tables.

Felix Cheung added a comment - 08/Oct/15 17:13
+1 perhaps sparkR.init() should create sqlContext and/or hiveCtx together.
But Hossein Falaki, as of now calling sparkRSQL.init() should return the same one, as you can see at https://github.com/apache/spark/blob/master/R/pkg/R/sparkR.R#L224

Hossein Falaki added a comment - 08/Oct/15 17:16
I meant the SQL Context: https://github.com/apache/spark/blob/master/R/pkg/R/sparkR.R#L236
This call should have been "getOrCreate."
"


