Posted to issues@spark.apache.org by "Sunil Rangwani (JIRA)" <ji...@apache.org> on 2018/11/17 15:28:00 UTC

[jira] [Comment Edited] (SPARK-14492) Spark SQL 1.6.0 does not work with external Hive metastore version lower than 1.2.0; it's not backwards compatible with earlier versions

    [ https://issues.apache.org/jira/browse/SPARK-14492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16690595#comment-16690595 ] 

Sunil Rangwani edited comment on SPARK-14492 at 11/17/18 3:27 PM:
------------------------------------------------------------------

[~srowen] This is not that at all. It is not about building Spark against different versions of Hive. Please refer to the discussion above.

The {{java.lang.NoSuchFieldError}} is a runtime error: the Hive version on the runtime classpath does not have this field!

Either the documentation should be updated to state that the minimum supported Hive metastore version is 1.2.x, or this bug should be fixed so that different versions of Hive are supported, as the documentation states.
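
For illustration, here is a minimal sketch of a configuration that triggers the failure (assuming an external Hive 0.14.0 metastore; the two config keys are the standard Spark SQL metastore settings from the 1.6.0 docs):

{code}
# Per the documentation this should be a supported combination, but the CLI
# fails at startup with the NoSuchFieldError shown in the issue description.
spark-sql \
  --conf spark.sql.hive.metastore.version=0.14.0 \
  --conf spark.sql.hive.metastore.jars=maven
{code}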


was (Author: sunil.rangwani):
[~srowen] This is not that at all. It is not about building Spark against different versions of Hive. Please refer to the discussion above.

The {{java.lang.NoSuchFieldError}} is a runtime error.

Either the documentation should be updated to state that the minimum supported Hive metastore version is 1.2.x, or this bug should be fixed so that different versions of Hive are supported, as the documentation states.



> Spark SQL 1.6.0 does not work with external Hive metastore version lower than 1.2.0; it's not backwards compatible with earlier versions
> --------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-14492
>                 URL: https://issues.apache.org/jira/browse/SPARK-14492
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Sunil Rangwani
>            Priority: Critical
>
> Spark SQL, when configured with a Hive version lower than 1.2.0, throws a java.lang.NoSuchFieldError for the field METASTORE_CLIENT_SOCKET_LIFETIME, because this field was only introduced in Hive 1.2.0. As a result, it is not possible to use a Hive metastore version lower than 1.2.0 with Spark. The details of the Hive changes can be found here: https://issues.apache.org/jira/browse/HIVE-9508
> {code:java}
> Exception in thread "main" java.lang.NoSuchFieldError: METASTORE_CLIENT_SOCKET_LIFETIME
> 	at org.apache.spark.sql.hive.HiveContext.configure(HiveContext.scala:500)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:250)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
> 	at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
> 	at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
> 	at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
> 	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> 	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> 	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> 	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:58)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:267)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:139)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
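>
> For illustration, the same linkage failure can be reproduced outside Spark with a minimal sketch (hypothetical class name; compile against Hive 1.2.0 jars, then run with an older hive-common jar on the classpath):
> {code:java}
> // Illustrative only: the JVM resolves field references at run time, so an
> // enum constant that existed at compile time can be missing from the jar
> // that is actually loaded.
> import org.apache.hadoop.hive.conf.HiveConf;
>
> public class MetastoreFieldCheck {
>     public static void main(String[] args) {
>         // Compiles against Hive >= 1.2.0 (HIVE-9508 added this constant);
>         // throws java.lang.NoSuchFieldError when run with hive-common < 1.2.0.
>         System.out.println(HiveConf.ConfVars.METASTORE_CLIENT_SOCKET_LIFETIME);
>     }
> }
> {code}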


