Posted to issues@systemml.apache.org by "Mike Dusenberry (JIRA)" <ji...@apache.org> on 2016/05/09 17:48:12 UTC
[jira] [Updated] (SYSTEMML-668) Python MLOutput.getDF() Can't Access JVM SQLContext
[ https://issues.apache.org/jira/browse/SYSTEMML-668?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Mike Dusenberry updated SYSTEMML-668:
-------------------------------------
Description: In PySpark, access to the JVM SQLContext from a PySpark SQLContext instance -has changed from {{sqlContext._scala_SQLContext}} to {{sqlContext._ssql_ctx}}- has always been officially exposed via {{sqlContext._ssql_ctx}}. However, we have been using an unofficial variable, {{sqlContext._scala_SQLContext}}, which has been renamed in Spark 2.0, breaking any previous code that used the former construct, such as our Python {{MLOutput.getDF(...)}} method. Therefore, we just need to update our PySpark API to use the official access point. (was: In PySpark, access to the JVM SQLContext from a PySpark SQLContext instance has changed from {{sqlContext._scala_SQLContext}} to {{sqlContext._ssql_ctx}}, breaking any previous code using the former construct, such as our Python {{MLOutput.getDF(...)}} method.)
> Python MLOutput.getDF() Can't Access JVM SQLContext
> ---------------------------------------------------
>
> Key: SYSTEMML-668
> URL: https://issues.apache.org/jira/browse/SYSTEMML-668
> Project: SystemML
> Issue Type: Bug
> Reporter: Mike Dusenberry
> Assignee: Mike Dusenberry
> Priority: Minor
>
> In PySpark, access to the JVM SQLContext from a PySpark SQLContext instance -has changed from {{sqlContext._scala_SQLContext}} to {{sqlContext._ssql_ctx}}- has always been officially exposed via {{sqlContext._ssql_ctx}}. However, we have been using an unofficial variable, {{sqlContext._scala_SQLContext}}, which has been renamed in Spark 2.0, breaking any previous code that used the former construct, such as our Python {{MLOutput.getDF(...)}} method. Therefore, we just need to update our PySpark API to use the official access point.
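> The fix described above can be sketched as a small compatibility helper. This is a hypothetical illustration, not the actual SystemML patch: the helper name {{get_jvm_sql_context}} and the fallback logic are assumptions; only the two attribute names ({{_ssql_ctx}} and {{_scala_SQLContext}}) come from the issue text.

```python
def get_jvm_sql_context(sql_context):
    """Return the underlying JVM SQLContext from a PySpark SQLContext.

    ``_ssql_ctx`` is the officially exposed access point; the unofficial
    ``_scala_SQLContext`` alias was renamed in Spark 2.0, so it is only
    tried as a fallback for older Spark versions.
    """
    if hasattr(sql_context, "_ssql_ctx"):
        return sql_context._ssql_ctx          # official access point
    return sql_context._scala_SQLContext      # pre-2.0 unofficial name


# Hypothetical use inside a Python MLOutput.getDF(...) wrapper:
#   jdf = self._java_ml_output.getDF(get_jvm_sql_context(sqlContext), varName)
#   return DataFrame(jdf, sqlContext)
```

> Preferring {{_ssql_ctx}} and only falling back keeps the wrapper working across Spark 1.x and 2.0 without pinning to a private variable that can be renamed again.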
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)