Posted to issues@spark.apache.org by "pin_zhang (JIRA)" <ji...@apache.org> on 2019/05/08 09:03:00 UTC
[jira] [Comment Edited] (SPARK-27600) Unable to start Spark Hive
Thrift Server when multiple Hive servers share the same metastore
[ https://issues.apache.org/jira/browse/SPARK-27600?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16835437#comment-16835437 ]
pin_zhang edited comment on SPARK-27600 at 5/8/19 9:02 AM:
-----------------------------------------------------------
[~hyukjin.kwon] I think this is related to a Hive bug: https://issues.apache.org/jira/browse/HIVE-6113
It says "The exception appears when there are several processes working with Hive concurrently." Hive's fix was to upgrade the third-party DataNucleus library.
Is this a Spark bug if Spark uses Hive 1.2.1?
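For what it's worth, the usual way to close the concurrent-initialization window (an assumption on my side, not part of the HIVE-6113 fix) is to create the metastore schema once with schematool and forbid each server from creating it implicitly, e.g. in hive-site.xml:

```xml
<!-- Hypothetical hive-site.xml fragment; property names are the ones
     documented for the Hive 1.2 line. -->
<property>
  <name>hive.metastore.schema.verification</name>
  <value>true</value> <!-- refuse to start against an uninitialized schema -->
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value> <!-- do not let each server try to create tables itself -->
</property>
```

With these set, the VERSION row is written exactly once by `schematool -dbType mysql -initSchema`, and concurrently starting servers only read it.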
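The failure mode reads like a classic check-then-insert race on the VERSION table: each starting server sees no version row and writes its own. A minimal sketch of that race (my illustration only, using SQLite and a made-up one-column VERSION table rather than the real metastore schema) with the two steps interleaved deterministically:

```python
import os
import sqlite3
import tempfile

# Shared "metastore" database that two server processes would point at.
path = os.path.join(tempfile.mkdtemp(), "metastore.db")
a = sqlite3.connect(path)
a.execute("CREATE TABLE VERSION (ver TEXT)")
a.commit()
b = sqlite3.connect(path)  # second "server" sharing the same metastore

# Both servers check before either has inserted: each sees an empty table.
assert a.execute("SELECT COUNT(*) FROM VERSION").fetchone()[0] == 0
assert b.execute("SELECT COUNT(*) FROM VERSION").fetchone()[0] == 0

# Each then inserts its own row -- the check-then-insert is not atomic,
# so nothing stops the second insert.
a.execute("INSERT INTO VERSION VALUES ('1.2.0')")
a.commit()
b.execute("INSERT INTO VERSION VALUES ('1.2.0')")
b.commit()

print(a.execute("SELECT COUNT(*) FROM VERSION").fetchone()[0])  # 2
```

Once two rows exist, a version check that expects exactly one row fails on every later startup, which matches the "Metastore contains multiple versions (2)" error below.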
> Unable to start Spark Hive Thrift Server when multiple Hive servers share the same metastore
> --------------------------------------------------------------------------------------------
>
> Key: SPARK-27600
> URL: https://issues.apache.org/jira/browse/SPARK-27600
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.1
> Reporter: pin_zhang
> Priority: Major
>
> When starting ten or more Spark Hive Thrift Servers at the same time, more than one version is saved to the VERSION table, together with the exception:
> WARN [DataNucleus.Query] (main:) Query for candidates of org.apache.hadoop.hive.metastore.model.MVersionTable and subclasses resulted in no possible candidates
> Exception thrown obtaining schema column information from datastore
> org.datanucleus.exceptions.NucleusDataStoreException: Exception thrown obtaining schema column information from datastore
> Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'via_ms.deleteme1556239494724' doesn't exist
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
> at com.mysql.jdbc.Util.getInstance(Util.java:408)
> at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:944)
> at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3978)
> at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3914)
> at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2530)
> at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2683)
> at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2491)
> at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2449)
> at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1381)
> at com.mysql.jdbc.DatabaseMetaData$2.forEach(DatabaseMetaData.java:2441)
> at com.mysql.jdbc.DatabaseMetaData$2.forEach(DatabaseMetaData.java:2339)
> at com.mysql.jdbc.IterateBlock.doForAll(IterateBlock.java:50)
> at com.mysql.jdbc.DatabaseMetaData.getColumns(DatabaseMetaData.java:2337)
> at org.apache.commons.dbcp.DelegatingDatabaseMetaData.getColumns(DelegatingDatabaseMetaData.java:218)
> at org.datanucleus.store.rdbms.adapter.BaseDatastoreAdapter.getColumns(BaseDatastoreAdapter.java:1532)
> at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.refreshTableData(RDBMSSchemaHandler.java:921)
> After that, the Hive server cannot be started any more because of MetaException(message:Metastore contains multiple versions (2)
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org