Posted to issues@spark.apache.org by "Michael Nguyen (JIRA)" <ji...@apache.org> on 2016/03/08 00:13:40 UTC

[jira] [Commented] (SPARK-13725) Spark 1.6.0 stopping working for HiveThriftServer2 and registerTempTable

    [ https://issues.apache.org/jira/browse/SPARK-13725?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15183964#comment-15183964 ] 

Michael Nguyen commented on SPARK-13725:
----------------------------------------

I typically do not set issues to Blocker. I set this issue to Blocker because these APIs worked in earlier versions of Spark up through 1.5.2, and existing code that relies on them now fails because of this issue in Spark 1.6.0.

> Spark 1.6.0 stopping working for HiveThriftServer2 and registerTempTable
> ------------------------------------------------------------------------
>
>                 Key: SPARK-13725
>                 URL: https://issues.apache.org/jira/browse/SPARK-13725
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>         Environment: Spark 1.6.0 with DataFrame.registerTempTable and HiveThriftServer2
>            Reporter: Michael Nguyen
>
> In Spark 1.5.2, the DataFrame.registerTempTable API works correctly: HiveThriftServer2 sees and returns temp tables registered via that API.
> In Spark 1.6.0, that stopped working. registerTempTable does not return an error, so the registration silently appears to succeed, yet HiveThriftServer2 does not see such tables, and hiveContext.table(registerTableName) indicates it does not see those tables either.
> Is there a temporary work-around in Spark 1.6.0? When will it be fixed?
> Thanks.
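
For context, the failing pattern described above is roughly the following (a minimal sketch, not the reporter's actual code; the app name, input path, and table name are illustrative):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext
    import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

    val sc = new SparkContext(new SparkConf().setAppName("TempTableRepro"))
    val hiveContext = new HiveContext(sc)

    // Register a DataFrame as a temp table in this HiveContext.
    val df = hiveContext.read.json("people.json") // illustrative input
    df.registerTempTable("people")

    // Expose the same context over JDBC/ODBC. On 1.5.2 a Thrift client
    // could then query "people"; on 1.6.0 the table is reportedly not
    // visible, and hiveContext.table("people") fails as well.
    HiveThriftServer2.startWithContext(hiveContext)

One possible lead, offered as an assumption rather than a confirmed fix: Spark 1.6 switched the Thrift server to multi-session mode by default, so each JDBC/ODBC connection gets its own session state, and the 1.6 migration notes suggest setting spark.sql.hive.thriftServer.singleSession=true to restore the old single-session behavior. Whether that also restores temp-table visibility in this case would need to be verified.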



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org