Posted to issues@spark.apache.org by "Cheng Lian (JIRA)" <ji...@apache.org> on 2015/05/29 18:00:25 UTC

[jira] [Created] (SPARK-7950) HiveThriftServer2.startWithContext() doesn't set "spark.sql.hive.version"

Cheng Lian created SPARK-7950:
---------------------------------

             Summary: HiveThriftServer2.startWithContext() doesn't set "spark.sql.hive.version"
                 Key: SPARK-7950
                 URL: https://issues.apache.org/jira/browse/SPARK-7950
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.3.1, 1.3.0, 1.4.0
         Environment: Simba Spark SQL ODBC driver 1.0.8.1006
            Reporter: Cheng Lian
            Assignee: Cheng Lian
            Priority: Critical


While testing the newly released Simba Spark SQL ODBC driver 1.0.8.1006 against Spark 1.4.0-SNAPSHOT, we found that when {{HiveThriftServer2}} is started via {{HiveThriftServer2.startWithContext()}}, simple queries like
{code:sql}
SELECT * FROM src
{code}
fail with the following error message (visible only when the ODBC trace log is enabled):
{noformat}
		DIAG [S0002] [Simba][SQLEngine] (31740) Table or view not found: SPARK..src
{noformat}
JDBC clients like Beeline, however, work fine. Also, if the server is started via {{sbin/start-thriftserver.sh}}, both ODBC and JDBC work as expected.
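
For reference, here is a minimal sketch of the affected code path, i.e. embedding the Thrift server in an existing application rather than launching it through {{sbin/start-thriftserver.sh}} (assuming Spark 1.4.0-SNAPSHOT with the {{spark-hive-thriftserver}} module on the classpath; the app name is arbitrary):
{code:scala}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

// Build a HiveContext in the host application.
val sc = new SparkContext(new SparkConf().setAppName("embedded-thriftserver"))
val hiveContext = new HiveContext(sc)

// This entry point bypasses the initialization performed when the server
// is launched via sbin/start-thriftserver.sh, which is where
// "spark.sql.hive.version" would normally be set.
HiveThriftServer2.startWithContext(hiveContext)
{code}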

The reason for this failure is that {{HiveThriftServer2.startWithContext()}} doesn't properly set the "spark.sql.hive.version" property, and Simba ODBC driver 1.0.8.1006 apparently behaves differently when this property is missing. What I observed is that, in this case, the ODBC driver issues a {{GetColumns}} command, which isn't overridden in Spark's {{HiveThriftServer2}}; the call therefore falls back to the original Hive code path, which results in the unexpected behavior above.
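
Until this is fixed, a possible workaround (untested here) is to set the property on the {{HiveContext}} before starting the server. The version string below is an assumption matching the Hive version Spark 1.4 is built against; adjust it to your build:
{code:scala}
// Hypothetical workaround: set the missing property explicitly before
// starting the server. "0.13.1" assumes the built-in Hive version.
hiveContext.setConf("spark.sql.hive.version", "0.13.1")
HiveThriftServer2.startWithContext(hiveContext)
{code}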


