Posted to dev@toree.apache.org by "Kapil Malik (JIRA)" <ji...@apache.org> on 2016/02/14 18:25:18 UTC

[jira] [Commented] (TOREE-166) sqlContext not shared with PySpark and sparkR

    [ https://issues.apache.org/jira/browse/TOREE-166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15146651#comment-15146651 ] 

Kapil Malik commented on TOREE-166:
-----------------------------------

Has this been incorporated in the code?

> sqlContext not shared with PySpark and sparkR
> ---------------------------------------------
>
>                 Key: TOREE-166
>                 URL: https://issues.apache.org/jira/browse/TOREE-166
>             Project: TOREE
>          Issue Type: Bug
>            Reporter: nimbusgo
>             Fix For: DEC_2015
>
>
> The Scala interpreter and SQL interpreter appear to share the same sqlContext, and you can select tables in the SQL interpreter that were registered in the Scala interpreter. However, it appears that the PySpark and SparkR interpreters each create their own sqlContext on construction, so DataFrames registered on those sqlContexts will not be visible to the sqlContext in other interpreters. Would it be possible to change this so that the Python and R interpreters are instantiated with the same sqlContext as the Scala interpreter?



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)