Posted to user@hive.apache.org by Jander g <ja...@gmail.com> on 2014/01/23 04:22:20 UTC

Hive 0.12.0 mysql metastore exception

Hi, guys

I upgraded from Hive 0.7.0 to Hive 0.12.0 recently. We resolved some problems,
but the one below puzzles me.

We run thousands of jobs every day, and about 40 of them hit this error each
day. I increased the MySQL max_connections from 2000 to 4000, but the
historical peak connection count I found was below 2000.
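
For reference, this is roughly how the configured limit and the historical
peak can be compared on the metastore MySQL server (host, user and
credentials below are placeholders):

  # current configured limit
  mysql -h metastore-db -u hive -p -e "SHOW VARIABLES LIKE 'max_connections';"
  # highest number of simultaneous connections seen since the server started
  mysql -h metastore-db -u hive -p -e "SHOW STATUS LIKE 'Max_used_connections';"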

Any suggestions will be greatly appreciated. Thanks in advance.


2014-01-23 06:35:47,724 WARN  bonecp.BoneCPConfig (BoneCPConfig.java:sanitize(1537)) - Max Connections < 1. Setting to 20
2014-01-23 06:35:48,433 ERROR metastore.RetryingRawStore (RetryingRawStore.java:invoke(146)) - JDO datastore error. Retrying metastore command after 1000 ms (attempt 1 of 1)
2014-01-23 06:35:49,467 ERROR exec.DDLTask (DDLTask.java:execute(435)) - org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1143)
        at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1128)
        at org.apache.hadoop.hive.ql.exec.DDLTask.switchDatabase(DDLTask.java:3479)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:237)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
        at org.apache.hadoop.hive.cli.NewCliDriver.processCmd(NewCliDriver.java:166)
        at org.apache.hadoop.hive.cli.NewCliDriver.processLine(NewCliDriver.java:243)
        at org.apache.hadoop.hive.cli.NewCliDriver.main(NewCliDriver.java:427)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
        ... 27 more
Caused by: java.sql.BatchUpdateException: Duplicate entry 'default' for key 2
        at com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:2020)
        at com.mysql.jdbc.PreparedStatement.executeBatch(PreparedStatement.java:1451)
        at com.jolbox.bonecp.StatementHandle.executeBatch(StatementHandle.java:469)
        at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeBatch(ParamLoggingPreparedStatement.java:372)
        at org.datanucleus.store.rdbms.SQLController.processConnectionStatement(SQLController.java:628)
        at org.datanucleus.store.rdbms.SQLController.processStatementsForConnection(SQLController.java:596)
        at org.datanucleus.store.rdbms.SQLController$1.transactionFlushed(SQLController.java:683)
        at org.datanucleus.store.connection.AbstractManagedConnection.transactionFlushed(AbstractManagedConnection.java:86)
        at org.datanucleus.store.connection.ConnectionManagerImpl$2.transactionFlushed(ConnectionManagerImpl.java:454)
        at org.datanucleus.TransactionImpl.flush(TransactionImpl.java:199)
        at org.datanucleus.TransactionImpl.commit(TransactionImpl.java:263)
        at org.datanucleus.api.jdo.JDOTransaction.commit(JDOTransaction.java:98)
        ... 43 more

2014-01-23 06:35:49,468 ERROR ql.Driver (SessionState.java:printError(436)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
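
The last "Caused by" suggests the metastore tried to insert a row for the
'default' database and hit a unique key. A quick sanity check against the
metastore tables, assuming the stock MySQL metastore schema where DBS.NAME
carries a unique key (connection details and database name are placeholders):

  # is there already exactly one row for the default database?
  mysql -h metastore-db -u hive -p metastore -e "SELECT DB_ID, NAME FROM DBS WHERE NAME = 'default';"
  # which unique keys exist on DBS (the 'key 2' in the MySQL error should be one of them)
  mysql -h metastore-db -u hive -p metastore -e "SHOW INDEX FROM DBS;"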


-- 
Thanks,
Jander

Re: Hive 0.12.0 mysql metastore exception

Posted by Jander g <ja...@gmail.com>.
Hi, Jov

Thanks for your attention.

We use Hive like this:
hive -e "use abc; insert overwrite ....."
Here, abc is a Hive schema that already exists in the metastore. But from the
log below we can see that the Hive DDL database switch fails.
Does that mean it doesn't read the existing schema?

2014-01-23 06:35:49,467 ERROR exec.DDLTask (DDLTask.java:execute(435)) - org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1143)
        at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1128)
        at org.apache.hadoop.hive.ql.exec.DDLTask.switchDatabase(DDLTask.java:3479)
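
If it helps to narrow down where the client instantiation fails, the same
statement can be rerun with console debug logging (just a sketch; the schema
name is the one from above):

  # one-off run with verbose CLI-side logging
  hive --hiveconf hive.root.logger=DEBUG,console -e "use abc; show tables;"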



Best Regards,

On Thu, Jan 23, 2014 at 1:27 PM, Jov <am...@amutu.com> wrote:

>
> 2014/1/23 Jander g <ja...@gmail.com>
>
>> Caused by: java.sql.BatchUpdateException: Duplicate entry 'default' for key 2
>
>
> What HQL did you run? It looks like Hive tries to insert 'default' into a
> metastore table, which violates the unique key.
>
>
> Jov
> blog: http://amutu.com/blog
>



-- 
Thanks,
Jander

Re: Hive 0.12.0 mysql metastore exception

Posted by Jov <am...@amutu.com>.
2014/1/23 Jander g <ja...@gmail.com>

> Caused by: java.sql.BatchUpdateException: Duplicate entry 'default' for key 2


What HQL did you run? It looks like Hive tries to insert 'default' into a
metastore table, which violates the unique key.


Jov
blog: http://amutu.com/blog