Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/03/22 16:34:41 UTC
[jira] [Resolved] (SPARK-19927) SparkThriftServer2 can not get "--hivevar" variables in spark 2.1
[ https://issues.apache.org/jira/browse/SPARK-19927?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-19927.
-------------------------------
Resolution: Duplicate
> SparkThriftServer2 can not get "--hivevar" variables in spark 2.1
> ------------------------------------------------------------------
>
> Key: SPARK-19927
> URL: https://issues.apache.org/jira/browse/SPARK-19927
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.1, 2.1.0
> Environment: CentOS 6.5, Spark 2.1 built with mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11
> Reporter: bruce xu
>
> Suppose the file test1.sql contains:
> -------------------------------------------------
> USE ${hivevar:db_name};
> -------------------------------------------------
>
> When executing the command: bin/spark-sql -f /tmp/test.sql --hivevar db_name=offline
> the output is:
> ------------------------------------------------------------------------------------
> Error: org.apache.spark.sql.catalyst.parser.ParseException:
> no viable alternative at input '<EOF>'(line 1, pos 4)
> == SQL ==
> use
> ----^^^ (state=,code=0)
> -------------------------------------------------------------------------------------
> So the --hivevar parameter is not read from the CLI.
> The bug also appears with the beeline command bin/beeline -f /tmp/test2.sql --hivevar db_name=offline, where test2.sql contains:
> ----------------------------------------
> !connect jdbc:hive2://localhost:10000 test test
> USE ${hivevar:db_name};
> --------------------------------------------------------------------------------------
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org