Posted to issues@hive.apache.org by "zzshine (Jira)" <ji...@apache.org> on 2021/05/08 10:03:00 UTC

[jira] [Commented] (HIVE-18382) Duplicate entry key when create_table/add_partition

    [ https://issues.apache.org/jira/browse/HIVE-18382?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17341265#comment-17341265 ] 

zzshine commented on HIVE-18382:
--------------------------------

I am encountering the same situation. How can this be fixed?
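For anyone else hitting this: both log excerpts fail with "Duplicate entry ... for key 'PRIMARY'" while inserting metastore rows, which is the classic symptom of two writers being handed overlapping primary-key blocks. Below is a minimal, hypothetical Java sketch (not Hive or DataNucleus code; class and method names are made up for illustration) of how a non-atomic read-then-write of a sequence value lets two metastore instances claim the same ID range:

```java
public class IdBlockRace {
    // Simulated sequence row (think: NEXT_VAL for table IDs).
    static long nextVal = 1;

    // Non-atomic block grab: the read and the write-back are two
    // separate steps, so two callers can interleave between them.
    static long[] grabBlockUnsafe(int size) {
        long start = nextVal;       // step 1: read current value
        nextVal = start + size;     // step 2: write back read + size
        return new long[]{start, start + size - 1};
    }

    public static void main(String[] args) {
        // Interleave two "instances": both read nextVal before either writes.
        long readA = nextVal;
        long readB = nextVal;       // stale read: same value as A
        nextVal = readA + 10;
        nextVal = readB + 10;
        // Both instances now believe they own IDs 1..10, so the second
        // INSERT using an ID from that range is rejected by MySQL with
        // "Duplicate entry ... for key 'PRIMARY'".
        System.out.println("instance A block: " + readA + ".." + (readA + 9));
        System.out.println("instance B block: " + readB + ".." + (readB + 9));
    }
}
```

The usual remedy for this class of bug is to make the read-and-increment atomic at the database (e.g. an update under a row lock) rather than in application memory; whether that applies to your deployment depends on your metastore configuration.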

> Duplicate entry key when create_table/add_partition 
> ----------------------------------------------------
>
>                 Key: HIVE-18382
>                 URL: https://issues.apache.org/jira/browse/HIVE-18382
>             Project: Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 1.2.1
>         Environment: Hive: 1.2.1
> Hadoop: 2.7.1
> Metastore DB: MySQL 5.1.40
>            Reporter: Biao Wu
>            Assignee: Biao Wu
>            Priority: Critical
>
> add_partitions and create_table calls often fail.
> Here is the HMS log.
> {code:java}
> 2018-01-03 03:43:55,541 ERROR [pool-10-thread-76716]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invoke(173)) - Retrying HMSHandler after 2000 ms (attempt 1 of 10) with error: javax.jdo.JDODataStoreException: Get request failed : SELECT `A0`.`PARAM_VALUE` FROM `SERDE_PARAMS` `A0` WHERE `A0`.`SERDE_ID` = ? AND `A0`.`PARAM_KEY` = ?
>         at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543)
>         at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:720)
>         at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740)
>         at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:859)
>         at org.apache.hadoop.hive.metastore.ObjectStoreWithBIMapping.createTable(ObjectStoreWithBIMapping.java:174)
>         at sun.reflect.GeneratedMethodAccessor95.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
>         at com.sun.proxy.$Proxy11.createTable(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1522)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1555)
>         at sun.reflect.GeneratedMethodAccessor87.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>         at com.sun.proxy.$Proxy13.create_table_with_environment_context(Unknown Source)
>         at sun.reflect.GeneratedMethodAccessor87.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$MetricHMSProxy.invoke(HiveMetaStore.java:6098)
>         at com.sun.proxy.$Proxy13.create_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9216)
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9200)
>         at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>         at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>         at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor$1.run(HadoopThriftAuthBridge.java:731)
>         at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor$1.run(HadoopThriftAuthBridge.java:726)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1690)
>         at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:726)
>         at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> NestedThrowablesStackTrace:
> java.sql.BatchUpdateException: Duplicate entry '508649089' for key 'PRIMARY'
>         at com.mysql.jdbc.SQLError.createBatchUpdateException(SQLError.java:1167)
>         at com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:1773)
>         at com.mysql.jdbc.PreparedStatement.executeBatchInternal(PreparedStatement.java:1257)
>         at com.mysql.jdbc.StatementImpl.executeBatch(StatementImpl.java:958)
>         at com.jolbox.bonecp.StatementHandle.executeBatch(StatementHandle.java:424)
>         at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeBatch(ParamLoggingPreparedStatement.java:366)
>         at org.datanucleus.store.rdbms.SQLController.processConnectionStatement(SQLController.java:676)
>         at org.datanucleus.store.rdbms.SQLController.getStatementForQuery(SQLController.java:319)
>         at org.datanucleus.store.rdbms.SQLController.getStatementForQuery(SQLController.java:295)
>         at org.datanucleus.store.rdbms.scostore.JoinMapStore.getValue(JoinMapStore.java:690)
>         at org.datanucleus.store.rdbms.scostore.JoinMapStore.putAll(JoinMapStore.java:194)
>         at org.datanucleus.store.rdbms.mapping.java.MapMapping.postInsert(MapMapping.java:135)
>         at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:522)
>         at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:162)
>         at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:138)
>         at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:3363)
>         at org.datanucleus.state.StateManagerImpl.makePersistent(StateManagerImpl.java:3339)
>         at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
>         at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2171)
>         at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObjectAsValue(PersistableMapping.java:567)
>         at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObject(PersistableMapping.java:321)
>         at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:191)
>         at org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1460)
>         at org.datanucleus.state.StateManagerImpl.providedObjectField(StateManagerImpl.java:120)
>         at org.apache.hadoop.hive.metastore.model.MStorageDescriptor.dnProvideField(MStorageDescriptor.java)
>         at org.apache.hadoop.hive.metastore.model.MStorageDescriptor.dnProvideFields(MStorageDescriptor.java)
>         at org.datanucleus.state.StateManagerImpl.provideFields(StateManagerImpl.java:1170)
>         at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:292)
>         at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:162)
>         at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:138)
>         at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:3363)
>         at org.datanucleus.state.StateManagerImpl.makePersistent(StateManagerImpl.java:3339)
>         at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
>         at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2171)
>         at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObjectAsValue(PersistableMapping.java:567)
>         at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObject(PersistableMapping.java:321)
>         at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:191)
>         at org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1460)
>         at org.datanucleus.state.StateManagerImpl.providedObjectField(StateManagerImpl.java:120)
>         at org.apache.hadoop.hive.metastore.model.MTable.dnProvideField(MTable.java)
>         at org.apache.hadoop.hive.metastore.model.MTable.dnProvideFields(MTable.java)
>         at org.datanucleus.state.StateManagerImpl.provideFields(StateManagerImpl.java:1170)
>         at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:292)
>         at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:162)
>         at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:138)
>         at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:3363)
>         at org.datanucleus.state.StateManagerImpl.makePersistent(StateManagerImpl.java:3339)
>         at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
>         at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1922)
>         at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1777)
>         at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
>         at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:715)
>         at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740)
>         at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:859)
>         at org.apache.hadoop.hive.metastore.ObjectStoreWithBIMapping.createTable(ObjectStoreWithBIMapping.java:174)
>         at sun.reflect.GeneratedMethodAccessor95.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
>         at com.sun.proxy.$Proxy11.createTable(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1522)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1555)
>         at sun.reflect.GeneratedMethodAccessor87.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>         at com.sun.proxy.$Proxy13.create_table_with_environment_context(Unknown Source)
>         at sun.reflect.GeneratedMethodAccessor87.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$MetricHMSProxy.invoke(HiveMetaStore.java:6098)
>         at com.sun.proxy.$Proxy13.create_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9216)
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9200)
>         at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>         at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>         at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor$1.run(HadoopThriftAuthBridge.java:731)
>         at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor$1.run(HadoopThriftAuthBridge.java:726)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1690)
>         at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:726)
>         at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '508649089' for key 'PRIMARY'
>         at sun.reflect.GeneratedConstructorAccessor111.newInstance(Unknown Source)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>         at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
>         at com.mysql.jdbc.Util.getInstance(Util.java:408)
>         at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:935)
>         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
>         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)
>         at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2524)
>         at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2677)
>         at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2549)
>         at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1861)
>         at com.mysql.jdbc.PreparedStatement.executeUpdateInternal(PreparedStatement.java:2073)
>         at com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:1751)
>         ... 84 more
> ----
> 2018-01-04 05:33:06,908 ERROR [pool-10-thread-90808]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invoke(173)) - Retrying HMSHandler after 2000 ms (attempt 1 of 10) with error: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.MTable@651b773f" using statement "INSERT INTO `TBLS` (`TBL_ID`,`CREATE_TIME`,`DB_ID`,`LAST_ACCESS_TIME`,`OWNER`,`RETENTION`,`SD_ID`,`TBL_NAME`,`TBL_TYPE`,`VIEW_EXPANDED_TEXT`,`VIEW_ORIGINAL_TEXT`) VALUES (?,?,?,?,?,?,?,?,?,?,?)" failed : Duplicate entry '57504902' for key 'PRIMARY'
>         at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543)
>         at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:720)
>         at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740)
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)