Posted to issues@spark.apache.org by "Andrew Otto (JIRA)" <ji...@apache.org> on 2018/11/13 15:33:00 UTC

[jira] [Commented] (SPARK-14492) Spark SQL 1.6.0 does not work with an external Hive metastore version lower than 1.2.0; it's not backwards compatible with earlier versions

    [ https://issues.apache.org/jira/browse/SPARK-14492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685375#comment-16685375 ] 

Andrew Otto commented on SPARK-14492:
-------------------------------------

FWIW, I am experiencing the same problem with Spark 2.3.1 and Hive 1.1.0 (from CDH 5.15.0).  I've tried setting both spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars (although I'm not sure I've got the right classpath for the latter) and still hit the same error; a sketch of the configuration I'm using follows the trace below.
{code:java}
18/11/13 14:31:12 ERROR ApplicationMaster: User class threw exception: java.lang.NoSuchFieldError: METASTORE_CLIENT_SOCKET_LIFETIME
java.lang.NoSuchFieldError: METASTORE_CLIENT_SOCKET_LIFETIME
 at org.apache.spark.sql.hive.HiveUtils$.formatTimeVarsForHiveClient(HiveUtils.scala:195)
 at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
 at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
 at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
 at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
 at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
 at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
 at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
 at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
 at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
 at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
 at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
 at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
 at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
 at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
 at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
 at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
 at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
 at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
 at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
 at org.apache.spark.sql.Dataset.<init>(Dataset.scala:172)
 at org.apache.spark.sql.Dataset.<init>(Dataset.scala:178)
 at org.apache.spark.sql.Dataset$.apply(Dataset.scala:65)
 at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:488)
{code}
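Here, roughly, is how I'm setting those two properties, in case someone can spot what's wrong. This is only a sketch: the class name is made up for illustration, and the jar paths are what my CDH 5.15.0 layout uses and may not match other installs. Both properties are static SQL configs, so they have to be in place when the SparkSession is created (here via the builder; --conf on spark-submit works too).
{code:java}
import org.apache.spark.sql.SparkSession;

// Hypothetical driver class; the jar paths below are illustrative CDH 5.x
// locations and must contain Hive 1.1.0 *and* its Hadoop dependencies.
public class MetastoreCompatProbe {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("metastore-compat-probe")
            // Ask Spark for a metastore client matching the external Hive 1.1.0.
            .config("spark.sql.hive.metastore.version", "1.1.0")
            // Classpath for that client; "maven" would instead let Spark
            // download matching jars at startup.
            .config("spark.sql.hive.metastore.jars",
                    "/opt/cloudera/parcels/CDH/lib/hive/lib/*:"
                    + "/opt/cloudera/parcels/CDH/lib/hadoop/client/*")
            .enableHiveSupport()
            .getOrCreate();

        // Any catalog access forces the metastore client to be instantiated,
        // which is where the NoSuchFieldError above is thrown.
        spark.sql("SHOW DATABASES").show();
        spark.stop();
    }
}
{code}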

> Spark SQL 1.6.0 does not work with an external Hive metastore version lower than 1.2.0; it's not backwards compatible with earlier versions
> --------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-14492
>                 URL: https://issues.apache.org/jira/browse/SPARK-14492
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Sunil Rangwani
>            Priority: Critical
>
> Spark SQL, when configured with a Hive metastore version lower than 1.2.0, throws a java.lang.NoSuchFieldError for the field METASTORE_CLIENT_SOCKET_LIFETIME. This field was introduced in Hive 1.2.0, so it is not possible to use a Hive metastore version lower than 1.2.0 with Spark (see the illustration after the quoted trace below). The details of the Hive changes can be found here: https://issues.apache.org/jira/browse/HIVE-9508 
> {code:java}
> Exception in thread "main" java.lang.NoSuchFieldError: METASTORE_CLIENT_SOCKET_LIFETIME
> 	at org.apache.spark.sql.hive.HiveContext.configure(HiveContext.scala:500)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:250)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
> 	at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
> 	at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
> 	at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
> 	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> 	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> 	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> 	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:58)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:267)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:139)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
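For what it's worth, the reason this surfaces as a NoSuchFieldError rather than a configuration error: METASTORE_CLIENT_SOCKET_LIFETIME is a HiveConf.ConfVars enum constant added in HIVE-9508, and code compiled against Hive 1.2.x resolves that constant with a static field lookup when the referencing class is loaded. If an older hive-common jar wins on the runtime classpath, that lookup fails at link time. A minimal sketch (the class name is hypothetical): compile it against hive-common 1.2.0 or later, then run it with a pre-1.2.0 hive-common jar first on the classpath to reproduce the same error.
{code:java}
import org.apache.hadoop.hive.conf.HiveConf;

// Hypothetical probe: compiled against hive-common >= 1.2.0, where the enum
// constant exists; run against hive-common 1.1.x, the JVM's static field
// lookup for the constant fails with java.lang.NoSuchFieldError.
public class SocketLifetimeProbe {
    public static void main(String[] args) {
        System.out.println(HiveConf.ConfVars.METASTORE_CLIENT_SOCKET_LIFETIME.varname);
    }
}
{code}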


