Posted to issues@spark.apache.org by "Nataraj Gorantla (JIRA)" <ji...@apache.org> on 2016/06/08 07:03:20 UTC
[jira] [Created] (SPARK-15817) Spark client picking hive 1.2.1 by default which failed to alter a table name
Nataraj Gorantla created SPARK-15817:
----------------------------------------
Summary: Spark client picking hive 1.2.1 by default which failed to alter a table name
Key: SPARK-15817
URL: https://issues.apache.org/jira/browse/SPARK-15817
Project: Spark
Issue Type: Bug
Components: Spark Shell
Affects Versions: 1.6.1
Reporter: Nataraj Gorantla
Some of our Scala scripts are failing with the error below:
org.apache.spark.sql.execution.QueryExecutionException: FAILED:
Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. Invalid
method name: 'alter_table_with_cascade'
When invoked, Spark tries to initialize Hive 1.2.1 by default, but we have Hive 0.14 installed. Some background investigation on our side explained this.
Analysis
The "alter_table_with_cascade" error occurs because of a metastore version mismatch in Spark: the Hive 1.2.1 client built into Spark calls a Thrift method that the Hive 0.14 metastore does not implement.
To correct this error, set the proper metastore version in the Spark config.
I tried adding a couple of parameters to the spark-defaults.conf file:
spark.sql.hive.metastore.version 0.14.0
#spark.sql.hive.metastore.jars maven
spark.sql.hive.metastore.jars /usr/hdp/current/hive-client/lib
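As a minimal sketch of the same settings on the command line (assuming an HDP layout; the paths are illustrative and must match your install): note that in Spark 1.6, spark.sql.hive.metastore.jars takes either "builtin", "maven", or a JVM-style classpath, not a bare directory, so the lib directory likely needs a /* wildcard and must also cover the Hadoop client jars.

```
# Hypothetical spark-shell invocation; adjust paths for your cluster.
spark-shell \
  --conf spark.sql.hive.metastore.version=0.14.0 \
  --conf "spark.sql.hive.metastore.jars=/usr/hdp/current/hive-client/lib/*:/usr/hdp/current/hadoop-client/*"
```

If this resolves the mismatch, the same values can be moved back into spark-defaults.conf, with the classpath form used for the jars entry.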
I still see the issue. Can you please let me know if there is an alternative way to fix this?
Thanks,
Nataraj G
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org