Posted to dev@hive.apache.org by Jae Lee <jl...@gmail.com> on 2011/03/14 15:27:23 UTC

how to load hive table schema programmatically?

Hi,

I've had this code below working with Hive 0.5

import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.hive.metastore.api.FieldSchema;
import org.apache.hadoop.hive.ql.session.SessionState;

String databaseName = "default";
String tableName = "foobar";
List<FieldSchema> hiveTable = new HiveMetaStoreClient(
    new HiveConf(new Configuration(), SessionState.class)).getSchema(databaseName, tableName);


to produce a list of FieldSchema objects for the table foobar in the default database.

I've recently upgraded to Hive 0.7, and the same code now generates error
messages such as:

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Required columns missing from table "`COLUMNS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required columns missing from table "`COLUMNS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
    at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:282)
    at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:175)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2711)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
    at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
    at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
    at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
    at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
    at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
    at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
    at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
    at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
    at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
    at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:266)
    at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:775)
    at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:709)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1076)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1073)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:307)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1073)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_schema(HiveMetaStore.java:1785)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getSchema(HiveMetaStoreClient.java:857)
    at HiveMetaStoreClientTest.shouldGetSchemaFromMetaStore(HiveMetaStoreClientTest.java:10)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41)
    at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:94)
    at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:196)
    at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:65)

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Required columns missing from table "`SORT_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required columns missing from table "`SORT_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
    (stack trace identical to the first trace above)

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Expected primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException: Expected primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
    at org.datanucleus.store.rdbms.table.TableImpl.validatePrimaryKey(TableImpl.java:368)
    at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:180)
    (remaining frames identical to the first trace above)

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Required columns missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required columns missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
    (stack trace identical to the first trace above)

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Expected primary key for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException: Expected primary key for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
    at org.datanucleus.store.rdbms.table.TableImpl.validatePrimaryKey(TableImpl.java:368)
    at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:180)
    (remaining frames identical to the first trace above)

I've debugged through the code and found that the expectation for those
missing columns is added by the DataNucleus JDO code. I'm not sure what this
means; the rest of the Hive applications are working perfectly fine.

Is there anything I should try in order to figure out what's going on? Or,
more generally, is this the right way to get a table's schema from Hive?

J

Re: how to load hive table schema programmatically?

Posted by Edward Capriolo <ed...@gmail.com>.
On Mon, Mar 14, 2011 at 3:01 PM, Carl Steinbach <ca...@cloudera.com> wrote:
> Hi Ed,
> I'm pretty sure HiveMetaStoreClient is intended to be a public API.
>
> On Mon, Mar 14, 2011 at 11:49 AM, Edward Capriolo <ed...@gmail.com>
> wrote:
>>
>> On Mon, Mar 14, 2011 at 2:44 PM, Jae Lee <jl...@gmail.com> wrote:
>> > Ah... thanks a lot... that worked :)
>> >
>> > Is there any other recommended way to load Hive table metadata? I
>> > suppose accessing the metastore via HiveMetaStoreClient makes it
>> > possible to change the underlying data storage implementation.
>> >
>> > J
>> >
>> > On Mon, Mar 14, 2011 at 5:56 PM, Carl Steinbach <ca...@cloudera.com>
>> > wrote:
>> >
>> >> Hi Jae,
>> >>
>> >> Sounds like your problem is related to HIVE-1435 (
>> >> https://issues.apache.org/jira/browse/HIVE-1435). You need to make sure
>> >> that the Datanucleus ORM layer is getting initialized with the
>> >> configuration
>> >> property datanucleus.identifierFactory=datanucleus. Probably the
>> >> easiest way
>> >> to fix this problem is to make sure that the 0.7.0 version of
>> >> hive-default.xml is available on the CLASSPATH and is getting loaded
>> >> into
>> >> HiveConf. Try dumping the contents of your HiveConf object and make
>> >> sure that the values match those that appear in the 0.7.0 version of
>> >> hive-default.xml.
>> >>
>> >> Hope this helps.
>> >>
>> >> Carl
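
[Carl's suggestion boils down to making sure the DataNucleus layer still sees
the pre-0.7 identifier factory. If hive-default.xml is not being picked up,
one way to pin the property explicitly is a hive-site.xml entry; this is a
sketch based on HIVE-1435, not an excerpt from the shipped configuration:]

```xml
<!-- hive-site.xml fragment: force DataNucleus to generate the old-style
     column identifiers (e.g. INTEGER_IDX rather than IDX) that the
     existing metastore schema was created with; see HIVE-1435 -->
<property>
  <name>datanucleus.identifierFactory</name>
  <value>datanucleus</value>
</property>
```

[The same key/value should also be settable directly on the HiveConf object
before the HiveMetaStoreClient is constructed.]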
>> >>
>> >>
>> >> On Mon, Mar 14, 2011 at 10:41 AM, Jae Lee <jl...@gmail.com> wrote:
>> >>
>> >>> just a bit more information from my debugging so far:
>> >>>
>> >>> my MySQL Hive metastore has columns like
>> >>> "integer_idx" in the "columns" table
>> >>> "integer_idx" in the "sort_cols" table
>> >>>
>> >>> those columns look pretty suspicious in that they are similar to the
>> >>> "idx" columns that HiveMetaStoreClient complains are missing.
>> >>>
>> >>> It looks like the expectation of an "idx" column is auto-generated
>> >>> (not taken from the package.jdo document).
>> >>> Can anybody tell me whether the "integer_idx" column should have been
>> >>> an "idx" column in the "columns" table?
>> >>> Or am I supposed to have a custom package.jdo file that specifies
>> >>> "integer_idx" instead of "idx" as the index column name?
>> >>>
>> >>> J
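
[A custom package.jdo of the kind asked about above would pin the ordering
column via JDO's <order> metadata element. The fragment below is only an
illustrative sketch: the field and column names are taken from the error
messages, not from Hive's shipped mapping, and per HIVE-1435 the supported
fix is the datanucleus.identifierFactory setting rather than a custom
mapping:]

```xml
<!-- illustrative JDO metadata: <order column="..."/> fixes the name of the
     column that stores each list element's position, instead of letting
     DataNucleus generate one itself (IDX vs. INTEGER_IDX) -->
<field name="sortCols" table="SORT_COLS">
  <collection element-type="org.apache.hadoop.hive.metastore.model.MOrder"/>
  <join>
    <column name="SD_ID"/>
  </join>
  <order column="INTEGER_IDX"/>
</field>
```

[When no explicit <order> column is declared, DataNucleus generates the
column name through its identifier factory, which is why the factory setting
determines whether it expects IDX or INTEGER_IDX.]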
>> >>>
>> >>> On Mon, Mar 14, 2011 at 2:27 PM, Jae Lee <jl...@gmail.com> wrote:
>> >>>
>> >>> > [original message, quoted in full above; snipped]
>> >>> > at
>> >>> >
>> >>>
>> >>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>> >>> >  at
>> >>> >
>> >>>
>> >>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>> >>> > at
>> >>> >
>> >>>
>> >>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
>> >>> >  at
>> >>> >
>> >>>
>> >>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>> >>> > at
>> >>> >
>> >>>
>> >>> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>> >>> >  at
>> >>> >
>> >>>
>> >>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73)
>> >>> > at
>> >>> >
>> >>>
>> >>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46)
>> >>> >  at
>> >>> > org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180)
>> >>> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41)
>> >>> >  at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
>> >>> > at
>> >>> >
>> >>>
>> >>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>> >>> >  at
>> >>> >
>> >>>
>> >>> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>> >>> > at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>> >>> >  at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>> >>> > at
>> >>> >
>> >>>
>> >>> com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:94)
>> >>> >  at
>> >>> >
>> >>>
>> >>> com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:196)
>> >>> > at
>> >>>
>> >>> com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:65)

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Required columns missing from table "`SORT_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required columns missing from table "`SORT_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
    [stack trace identical to the one above]

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Expected primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException: Expected primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
    at org.datanucleus.store.rdbms.table.TableImpl.validatePrimaryKey(TableImpl.java:368)
    at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:180)
    [remaining frames identical to the first stack trace above]

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Required columns missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required columns missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
    [stack trace identical to the one above]

11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown while adding/validating class(es) : Expected primary key for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException: Expected primary key for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
    at org.datanucleus.store.rdbms.table.TableImpl.validatePrimaryKey(TableImpl.java:368)
    at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:180)
    [remaining frames identical to the first stack trace above]
>> >>> >
>> >>> > I've debugged through the code and found that those missing columns
>> >>> > were added by the DataNucleus JDO code. I'm not sure what this means;
>> >>> > the rest of the Hive applications are working perfectly fine.
>> >>> >
>> >>> > Is there anything I should try to figure out what's going on? Or, more
>> >>> > generally, is this the right way to get the schema of a Hive table?
>> >>> >
>> >>> > J
>> >>> >
>> >>>
>> >>
>> >>
>> >
>>
>> Correct: you should not interface with the metastore this way, because
>> it is not a stable API you are working with.
>
>

I could go either way on this issue. It is public and there are Thrift stubs.

http://hive.apache.org/docs/r0.6.0/api/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.html

I would worry about users using RAW methods such as:

add_partition(Partition new_part)
          Add a partition to the table.

Users should try to interface through the CLI or HiveServer if possible.
Anything below that is "internal" IMHO, but I am guilty of using it
directly.

Just be warned: it typically changes without any deprecation.

Edward
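For readers who want to follow Edward's advice and stay on the CLI side, one option is to shell out to `hive -e "DESCRIBE foobar"` and parse its tab-separated output instead of touching the metastore classes. A minimal sketch — the `DescribeParser` helper below is hypothetical, not part of Hive, and assumes the default tab-separated DESCRIBE output:

```java
import java.util.ArrayList;
import java.util.List;

public class DescribeParser {
    // Parse the tab-separated output of `hive -e "DESCRIBE <table>"` into
    // (name, type) pairs -- a CLI-based alternative to HiveMetaStoreClient.
    public static List<String[]> parse(String describeOutput) {
        List<String[]> columns = new ArrayList<String[]>();
        for (String line : describeOutput.split("\n")) {
            // First two whitespace-separated fields are column name and type;
            // any trailing field is the column comment, which we ignore here.
            String[] parts = line.trim().split("\\s+");
            if (parts.length >= 2) {
                columns.add(new String[] { parts[0], parts[1] });
            }
        }
        return columns;
    }

    public static void main(String[] args) {
        String sample = "id\tint\nname\tstring\n";
        for (String[] col : parse(sample)) {
            System.out.println(col[0] + ":" + col[1]);
        }
    }
}
```

The trade-off versus HiveMetaStoreClient is stability: the DESCRIBE output format is arguably more stable across releases than the metastore classes, which, as noted above, change without deprecation.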

Re: how to load hive table schema programatically?

Posted by Carl Steinbach <ca...@cloudera.com>.
Hi Ed,

I'm pretty sure HiveMetaStoreClient is intended to be a public API.


On Mon, Mar 14, 2011 at 11:49 AM, Edward Capriolo <ed...@gmail.com>wrote:

> On Mon, Mar 14, 2011 at 2:44 PM, Jae Lee <jl...@gmail.com> wrote:
> > Ah... thanks a lot... that worked :)
> >
> > Is there any other recommended way to load Hive table metadata? I suppose
> > accessing the metastore via HiveMetaStoreClient makes it possible to
> > change the underlying data storage implementation.
> >
> > J
> >
> > On Mon, Mar 14, 2011 at 5:56 PM, Carl Steinbach <ca...@cloudera.com>
> wrote:
> >
> >> Hi Jae,
> >>
> >> Sounds like your problem is related to HIVE-1435 (
> >> https://issues.apache.org/jira/browse/HIVE-1435). You need to make sure
> >> that the Datanucleus ORM layer is getting initialized with the
> configuration
> >> property datanucleus.identifierFactory=datanucleus. Probably the easiest
> way
> >> to fix this problem is to make sure that the 0.7.0 version of
> >> hive-default.xml is available on the CLASSPATH and is getting loaded
> into
> >> HiveConf. Try dumping the contents of your HiveConf object and make sure
> >> that the values match those that appear in the 0.7.0 version of
> >> hive-default.xml.
> >>
> >> Hope this helps.
> >>
> >> Carl
> >>
> >>
> >> On Mon, Mar 14, 2011 at 10:41 AM, Jae Lee <jl...@gmail.com> wrote:
> >>
> >>> just a bit more information from my debugging so far
> >>>
> >>> my MySQL Hive metastore has columns like
> >>> "integer_idx" in the "columns" table
> >>> "integer_idx" in the "sort_cols" table
> >>>
> >>> Those columns look pretty suspicious, in that they are similar to the
> >>> "idx" columns that HiveMetaStoreClient complains are missing.
> >>>
> >>> It looks like the expectation of an "idx" column is auto-generated (not
> >>> taken from the package.jdo document).
> >>> Can anybody tell me whether the "integer_idx" column should have been an
> >>> "idx" column in the "columns" table?
> >>> Or am I supposed to have a custom package.jdo file that sets the index
> >>> column name to "integer_idx" instead of "idx"?
> >>>
> >>> J
> >>>
> >>> On Mon, Mar 14, 2011 at 2:27 PM, Jae Lee <jl...@gmail.com> wrote:
> >>>
> >>> > Hi,
> >>> >
> >>> > I've had this code below working with Hive 0.5:
> >>> >
> >>> > String databaseName = "default";
> >>> > String tableName = "foobar";
> >>> > List<org.apache.hadoop.hive.metastore.api.FieldSchema> hiveTable = new
> >>> >     HiveMetaStoreClient(new HiveConf(new Configuration(),
> >>> >     SessionState.class)).getSchema(databaseName, tableName);
> >>> >
> >>> > It produces a list of FieldSchema for the table foobar in the default
> >>> > database.
> >>> >
> >>> > I've recently upgraded Hive to 0.7, and the same code now generates
> >>> > error messages such as:
> >>> >
> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> >>> > while adding/validating class(es) : Required columns missing from table
> >>> > "`COLUMNS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent
> >>> > enabled "datanucleus.autoCreateColumns".
> >>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
> >>> > columns missing from table "`COLUMNS`" : `IDX`.
> >>> > [... DataNucleus/metastore/JUnit stack trace snipped ...]
> >>> >
> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> >>> > while adding/validating class(es) : Required columns missing from table
> >>> > "`SORT_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent
> >>> > enabled "datanucleus.autoCreateColumns".
> >>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
> >>> > columns missing from table "`SORT_COLS`" : `IDX`.
> >>> > [... identical stack trace snipped ...]
> >>> >
> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> >>> > while adding/validating class(es) : Expected primary key for table
> >>> > `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys
> >>> > PRIMARY KEY (`SD_ID`)
> >>> > org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException: Expected
> >>> > primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found
> >>> > in existing keys PRIMARY KEY (`SD_ID`)
> >>> > [... identical stack trace snipped ...]
> >>> >
> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> >>> > while adding/validating class(es) : Required columns missing from table
> >>> > "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you
> >>> > havent enabled "datanucleus.autoCreateColumns".
> >>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
> >>> > columns missing from table "`BUCKETING_COLS`" : `IDX`.
> >>> > [... identical stack trace snipped ...]
> >>> >
> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> >>> > while adding/validating class(es) : Expected primary key for table
> >>> > `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys
> >>> > PRIMARY KEY (`SD_ID`)
> >>> > org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException: Expected
> >>> > primary key for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not
> >>> > found in existing keys PRIMARY KEY (`SD_ID`)
> >>> > [... identical stack trace snipped ...]
> >>> >
> >>> > I've debugged through the code and found that those missing columns
> >>> > were added by the DataNucleus JDO code. I'm not sure what this means;
> >>> > the rest of the Hive applications are working perfectly fine.
> >>> >
> >>> > Is there anything I should try to figure out what's going on? Or, more
> >>> > generally, is this the right way to get the schema of a Hive table?
> >>> >
> >>> > J
> >>> >
> >>>
> >>
> >>
> >
>
> Correct, you should not interface with the metastore this way, because
> it is not a stable API.
>
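
A hedged sketch of an alternative (not spelled out in the thread): instead of going through the metastore's internal ORM layer, you can ask Hive itself for a table's columns with HiveQL, e.g. from the CLI or over the JDBC driver. `foobar` is the example table from the original post:

```sql
-- Ask Hive for the schema of the example table via HiveQL rather than
-- the (unstable) metastore internals.
DESCRIBE foobar;

-- Include storage details (location, SerDe, etc.) as well:
DESCRIBE EXTENDED foobar;
```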

Re: how to load hive table schema programmatically?

Posted by Edward Capriolo <ed...@gmail.com>.
On Mon, Mar 14, 2011 at 2:44 PM, Jae Lee <jl...@gmail.com> wrote:
> Ah... thanks a lot... that worked :)
>
> Is there any other recommended way to load Hive table metadata? I suppose
> accessing the metastore via HiveMetaStoreClient makes it possible to change
> the underlying data storage implementation.
>
> J
>
> On Mon, Mar 14, 2011 at 5:56 PM, Carl Steinbach <ca...@cloudera.com> wrote:
>
>> Hi Jae,
>>
>> Sounds like your problem is related to HIVE-1435 (
>> https://issues.apache.org/jira/browse/HIVE-1435). You need to make sure
>> that the Datanucleus ORM layer is getting initialized with the configuration
>> property datanucleus.identifierFactory=datanucleus. Probably the easiest way
>> to fix this problem is to make sure that the 0.7.0 version of
>> hive-default.xml is available on the CLASSPATH and is getting loaded into
>> HiveConf. Try dumping the contents of your HiveConf object and make sure
>> that the values match those that appear in the 0.7.0 version of
>> hive-default.xml.
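>>
>> For anyone following along, the property mentioned above would look like
>> this in hive-site.xml. This is a minimal sketch; the value shown is the
>> 0.7.0 default from hive-default.xml:

```xml
<!-- Minimal sketch (not from the original thread): pin the DataNucleus
     identifier factory so the ORM layer maps index columns to the names
     the Hive 0.7 metastore schema actually uses (e.g. INTEGER_IDX
     rather than IDX). -->
<property>
  <name>datanucleus.identifierFactory</name>
  <value>datanucleus</value>
</property>
```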
>>
>> Hope this helps.
>>
>> Carl
>>
>>
>> On Mon, Mar 14, 2011 at 10:41 AM, Jae Lee <jl...@gmail.com> wrote:
>>
>>> Just a bit more information from my debugging so far.
>>>
>>> My MySQL Hive metastore has columns like:
>>> "integer_idx" in the "columns" table
>>> "integer_idx" in the "sort_cols" table
>>>
>>> Those columns look pretty suspicious, in that they are similar to the
>>> "idx" columns that HiveMetaStoreClient complains are missing.
>>>
>>> It looks like the expectation of an "idx" column is auto-generated (not
>>> taken from the package.jdo document).
>>> Can anybody tell me whether the "integer_idx" column in the "columns"
>>> table should have been an "idx" column? Or am I supposed to have a
>>> custom package.jdo file that sets the index column name to "integer_idx"
>>> instead of "idx"?
>>>
>>> J
>>>
>>> On Mon, Mar 14, 2011 at 2:27 PM, Jae Lee <jl...@gmail.com> wrote:
>>>
>>> > Hi,
>>> >
>>> > I've had this code below working with Hive 0.5
>>> >
>>> > String databaseName = "default";
>>> > String tableName = "foobar";
>>> > List<org.apache.hadoop.hive.metastore.api.FieldSchema> hiveTable = new
>>> > HiveMetaStoreClient(new HiveConf(new Configuration(),
>>> > SessionState.class)).getSchema(databaseName, tableName);
>>> >
>>> >
>>> > to produce a list of FieldSchema objects for table foobar in the default database
>>> >
>>> > I've recently upgraded Hive to 0.7, and the same code now generates
>>> > error messages such as:
>>> >
>>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
>>> > while adding/validating class(es) : Required columns missing from table
>>> > "`COLUMNS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent
>>> > enabled "datanucleus.autoCreateColumns".
>>> > Required columns missing from table "`COLUMNS`" : `IDX`. Perhaps your
>>> > MetaData is incorrect, or you havent enabled
>>> > "datanucleus.autoCreateColumns".
>>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
>>> > columns missing from table "`COLUMNS`" : `IDX`. Perhaps your MetaData is
>>> > incorrect, or you havent enabled "datanucleus.autoCreateColumns".
>>> > at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:282)
>>> > at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:175)
>>> > at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2711)
>>> > at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
>>> > at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
>>> > at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
>>> > at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
>>> > at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
>>> > at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
>>> > at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
>>> > at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
>>> > at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
>>> > at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
>>> > at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
>>> > at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
>>> > at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
>>> > at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
>>> > at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
>>> > at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
>>> > at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
>>> > at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:266)
>>> > at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:775)
>>> > at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:709)
>>> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1076)
>>> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1073)
>>> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:307)
>>> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1073)
>>> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_schema(HiveMetaStore.java:1785)
>>> > at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getSchema(HiveMetaStoreClient.java:857)
>>> > at HiveMetaStoreClientTest.shouldGetSchemaFromMetaStore(HiveMetaStoreClientTest.java:10)
>>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>> > at java.lang.reflect.Method.invoke(Method.java:597)
>>> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>>> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>>> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>>> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
>>> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>>> > at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>>> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73)
>>> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46)
>>> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180)
>>> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41)
>>> > at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
>>> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>>> > at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>>> > at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>>> > at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>>> > at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:94)
>>> > at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:196)
>>> > at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:65)
>>> >
>>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
>>> > while adding/validating class(es) : Required columns missing from table
>>> > "`SORT_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent
>>> > enabled "datanucleus.autoCreateColumns".
>>> > Required columns missing from table "`SORT_COLS`" : `IDX`. Perhaps your
>>> > MetaData is incorrect, or you havent enabled
>>> > "datanucleus.autoCreateColumns".
>>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
>>> > columns missing from table "`SORT_COLS`" : `IDX`. Perhaps your MetaData
>>> is
>>> > incorrect, or you havent enabled "datanucleus.autoCreateColumns".
>>> > [... stack trace identical to the `COLUMNS` trace above; omitted ...]
>>> >
>>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
>>> > while adding/validating class(es) : Expected primary key for table
>>> > `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys
>>> PRIMARY
>>> > KEY (`SD_ID`)
>>> > Expected primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`)
>>> not
>>> > found in existing keys PRIMARY KEY (`SD_ID`)
>>> > org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException:
>>> Expected
>>> > primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found
>>> in
>>> > existing keys PRIMARY KEY (`SD_ID`)
>>> > at org.datanucleus.store.rdbms.table.TableImpl.validatePrimaryKey(TableImpl.java:368)
>>> > at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:180)
>>> > [... remaining frames identical to the traces above; omitted ...]
>>> >
>>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
>>> > while adding/validating class(es) : Required columns missing from table
>>> > "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you
>>> > havent enabled "datanucleus.autoCreateColumns".
>>> > Required columns missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps
>>> > your MetaData is incorrect, or you havent enabled
>>> > "datanucleus.autoCreateColumns".
>>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
>>> > columns missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps your
>>> MetaData
>>> > is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
>>> > [... stack trace identical to the `COLUMNS` trace above; omitted ...]
>>> >
>>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
>>> > while adding/validating class(es) : Expected primary key for table
>>> > `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys
>>> > PRIMARY KEY (`SD_ID`)
>>> > Expected primary key for table `BUCKETING_COLS` PRIMARY KEY
>>> (`SD_ID`,`IDX`)
>>> > not found in existing keys PRIMARY KEY (`SD_ID`)
>>> > org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException:
>>> Expected
>>> > primary key for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not
>>> found
>>> > in existing keys PRIMARY KEY (`SD_ID`)
>>> > [... stack trace identical to the `SORT_COLS` primary-key trace above; omitted ...]
>>> >
>>> > I've debugged through the code and found that those missing columns were
>>> > added by the DataNucleus JDO code. I'm not sure what this means; the rest
>>> > of the Hive applications are working perfectly fine.
>>> >
>>> > Is there anything I should try in order to figure out what's going on? Or,
>>> > more generally, is this the right way to get the schema of a Hive table?
>>> >
>>> > J
>>> >
>>>
>>
>>
>

Correct; you should not interface with the Metastore this way, because
the API you are working with is not stable.

Re: how to load hive table schema programatically?

Posted by Jae Lee <jl...@gmail.com>.
Ah... thanks a lot... that worked :)

Is there any other recommended way to load Hive table metadata? I suppose
accessing the metastore via HiveMetaStoreClient makes it possible to change
the underlying data storage implementation.

J

On Mon, Mar 14, 2011 at 5:56 PM, Carl Steinbach <ca...@cloudera.com> wrote:

> Hi Jae,
>
> Sounds like your problem is related to HIVE-1435 (
> https://issues.apache.org/jira/browse/HIVE-1435). You need to make sure
> that the Datanucleus ORM layer is getting initialized with the configuration
> property datanucleus.identifierFactory=datanucleus. Probably the easiest way
> to fix this problem is to make sure that the 0.7.0 version of
> hive-default.xml is available on the CLASSPATH and is getting loaded into
> HiveConf. Try dumping the contents of your HiveConf object and make sure
> that the values match those that appear in the 0.7.0 version of
> hive-default.xml.
>
> Hope this helps.
>
> Carl
>
>
> On Mon, Mar 14, 2011 at 10:41 AM, Jae Lee <jl...@gmail.com> wrote:
>
>> Just a bit more information from my debugging so far:
>>
>> My MySQL Hive metastore has columns like
>> "integer_idx" in the "columns" table
>> "integer_idx" in the "sort_cols" table
>>
>> Those columns look pretty suspicious in that they are similar to the "idx"
>> columns that HiveMetaStoreClient complains are missing.
>>
>> It looks like the expectation of an "idx" column is auto-generated (not
>> taken from the package.jdo document).
>> Can anybody tell me whether the "integer_idx" column should have been an
>> "idx" column in the "columns" table? Or am I supposed to have a custom
>> package.jdo file that sets the index column name to "integer_idx" instead
>> of "idx"?
>>
>> J
>>

Re: how to load hive table schema programatically?

Posted by Carl Steinbach <ca...@cloudera.com>.
Hi Jae,

It sounds like your problem is related to HIVE-1435
(https://issues.apache.org/jira/browse/HIVE-1435). You need to make sure that
the DataNucleus ORM layer is getting initialized with the configuration
property datanucleus.identifierFactory=datanucleus. Probably the easiest way
to fix this problem is to make sure that the 0.7.0 version of
hive-default.xml is available on the CLASSPATH and is getting loaded into
HiveConf. Try dumping the contents of your HiveConf object and make sure
that the values match those that appear in the 0.7.0 version of
hive-default.xml.
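One way to do that check (a sketch only; it uses the same HiveConf/SessionState classes as the code in the thread, and relies on the fact that HiveConf extends Hadoop's Configuration, which is iterable over its resolved properties) is:

```java
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.session.SessionState;

public class DumpHiveConf {
    public static void main(String[] args) {
        HiveConf conf = new HiveConf(new Configuration(), SessionState.class);

        // Print every resolved property so the output can be diffed
        // against the 0.7.0 hive-default.xml.
        for (Map.Entry<String, String> entry : conf) {
            System.out.println(entry.getKey() + " = " + entry.getValue());
        }

        // The property HIVE-1435 cares about. If hive-default.xml was not
        // loaded it may be null or a different factory name, in which case
        // DataNucleus expects IDX columns instead of Hive's INTEGER_IDX.
        System.out.println("datanucleus.identifierFactory = "
                + conf.get("datanucleus.identifierFactory"));
    }
}
```

If the last line does not print `datanucleus`, the 0.7.0 hive-default.xml is probably missing from the CLASSPATH.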

Hope this helps.

Carl

On Mon, Mar 14, 2011 at 10:41 AM, Jae Lee <jl...@gmail.com> wrote:

> just a bit more information from my debugging so far
>
> my mysql hive metastore have columns like
> "integer_idx" at "columns" table
> "integer_idx" at "sort_cols" table
>
> those columns looks pretty suspicious in that it is similar to "idx"
> columns
> that HiveMetaSotreClient complains missing.
>
> It looks like expectation of having "idx" column is auto-generated (not
> from
> package.jdo document)
> Can anybody tell me whether "integer_idx" column should have been "idx"
> column at "columns" table?
> or am I suppose to have custom package.jdo file that specify the index
> column name to "integer_idx" instead of "idx" column?
>
> J
>
> On Mon, Mar 14, 2011 at 2:27 PM, Jae Lee <jl...@gmail.com> wrote:
>
> > Hi,
> >
> > I've had this code below working with Hive 0.5
> >
> > String databaseName = "default";
> > String tableName = "foobar";
> > List<org.apache.hadoop.hive.metastore.api.FieldSchema> hiveTable = new
> > HiveMetaStoreClient(new HiveConf(new Configuration(),
> > SessionState.class)).getSchema(databaseName, tableName);
> >
> > to produce list of FieldSchema for a table foobar in default database
> >
> > I've recently upgraded hive to 0.7, and the same code now generates an
> > error messages such as
> >
> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> > while adding/validating class(es) : Required columns missing from table
> > "`COLUMNS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent
> > enabled "datanucleus.autoCreateColumns".
> > org.datanucleus.store.rdbms.exceptions.MissingColumnException
> >
> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> > while adding/validating class(es) : Required columns missing from table
> > "`SORT_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent
> > enabled "datanucleus.autoCreateColumns".
> > org.datanucleus.store.rdbms.exceptions.MissingColumnException
> >
> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> > while adding/validating class(es) : Expected primary key for table
> > `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys
> > PRIMARY KEY (`SD_ID`)
> > org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException
> >
> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> > while adding/validating class(es) : Required columns missing from table
> > "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you
> > havent enabled "datanucleus.autoCreateColumns".
> > org.datanucleus.store.rdbms.exceptions.MissingColumnException
> >
> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> > while adding/validating class(es) : Expected primary key for table
> > `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys
> > PRIMARY KEY (`SD_ID`)
> > org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException
> >
> > [each exception carried the same DataNucleus/HiveMetaStore stack trace
> > quoted above; snipped]
> >
> > I've debugged through the code and found that those missing columns were
> > added by datanucleus JDO code. I'm not sure what this means, rest of the
> > hive applications are working perfectly fine.
> >
> > Is there anything that I should try to figure out what's going on? or
> > just generally is it a right way to get a schema from hive table?
> >
> > J
> >
>

Re: how to load hive table schema programatically?

Posted by Jae Lee <jl...@gmail.com>.
just a bit more information from my debugging so far

my mysql hive metastore has columns like
"integer_idx" in the "columns" table
"integer_idx" in the "sort_cols" table

those columns look pretty suspicious in that they are similar to the "idx"
columns that HiveMetaStoreClient complains are missing.

It looks like the expectation of an "idx" column is auto-generated (not
from the package.jdo document).
Can anybody tell me whether the "integer_idx" column should have been an
"idx" column in the "columns" table?
Or am I supposed to have a custom package.jdo file that specifies the index
column name as "integer_idx" instead of "idx"?

J
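
[Editorially, Carl's reply earlier in the thread suggests the INTEGER_IDX naming comes from the datanucleus.identifierFactory setting rather than from package.jdo. A sketch of a workaround, under that assumption, is to force the property on the HiveConf before constructing the client; setting it in code only papers over a missing or stale hive-default.xml:]

```java
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.hive.metastore.api.FieldSchema;
import org.apache.hadoop.hive.ql.session.SessionState;

public class GetTableSchema {
    public static void main(String[] args) throws Exception {
        HiveConf conf = new HiveConf(new Configuration(), SessionState.class);

        // Use the identifier factory the Hive 0.7 metastore schema was
        // created with, so DataNucleus maps JDO list indexes to the
        // existing INTEGER_IDX columns instead of expecting IDX columns
        // (see HIVE-1435).
        conf.set("datanucleus.identifierFactory", "datanucleus");

        // Same call as in the original post: fetch the FieldSchema list
        // for table "foobar" in the "default" database.
        HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
        List<FieldSchema> schema = client.getSchema("default", "foobar");
        for (FieldSchema field : schema) {
            System.out.println(field.getName() + ": " + field.getType());
        }
    }
}
```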

On Mon, Mar 14, 2011 at 2:27 PM, Jae Lee <jl...@gmail.com> wrote:

> Hi,
>
> I've had this code below working with Hive 0.5
>
> String databaseName = "default";
> String tableName = "foobar";
> List<org.apache.hadoop.hive.metastore.api.FieldSchema> hiveTable = new
> HiveMetaStoreClient(new HiveConf(new Configuration(),
> SessionState.class)).getSchema(databaseName, tableName);
>
>
> to produce list of FieldSchema for a table foobar in default database
>
> I've recently upgraded Hive to 0.7, and the same code now generates
> error messages such as
>
> 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> while adding/validating class(es) : Required columns missing from table
> "`COLUMNS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent
> enabled "datanucleus.autoCreateColumns".
> Required columns missing from table "`COLUMNS`" : `IDX`. Perhaps your
> MetaData is incorrect, or you havent enabled
> "datanucleus.autoCreateColumns".
> org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
> columns missing from table "`COLUMNS`" : `IDX`. Perhaps your MetaData is
> incorrect, or you havent enabled "datanucleus.autoCreateColumns".
>  at
> org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:282)
> at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:175)
>  at
> org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2711)
> at
> org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
>  at
> org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
> at
> org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
>  at
> org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
> at
> org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
>  at
> org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
> at
> org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
>  at
> org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
> at
> org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
>  at
> org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
> at
> org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
>  at
> org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
> at
> org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
>  at
> org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
> at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
>  at
> org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
> at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
>  at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:266)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:775)
>  at
> org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:709)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1076)
>  at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1073)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:307)
>  at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1073)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_schema(HiveMetaStore.java:1785)
>  at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getSchema(HiveMetaStoreClient.java:857)
> at
> HiveMetaStoreClientTest.shouldGetSchemaFromMetaStore(HiveMetaStoreClientTest.java:10)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>  at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
>  at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>  at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
>  at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
> at
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>  at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46)
>  at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41)
>  at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>  at
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>  at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
> at
> com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:94)
>  at
> com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:196)
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:65)
>
> 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> while adding/validating class(es) : Required columns missing from table
> "`SORT_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you havent
> enabled "datanucleus.autoCreateColumns".
> Required columns missing from table "`SORT_COLS`" : `IDX`. Perhaps your
> MetaData is incorrect, or you havent enabled
> "datanucleus.autoCreateColumns".
> org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
> columns missing from table "`SORT_COLS`" : `IDX`. Perhaps your MetaData is
> incorrect, or you havent enabled "datanucleus.autoCreateColumns".
> [stack trace identical to the one above; snipped]
>
> 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> while adding/validating class(es) : Expected primary key for table
> `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys PRIMARY
> KEY (`SD_ID`)
> Expected primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not
> found in existing keys PRIMARY KEY (`SD_ID`)
> org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException: Expected
> primary key for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in
> existing keys PRIMARY KEY (`SD_ID`)
> [stack trace identical to the one above; snipped]
>
> 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> while adding/validating class(es) : Required columns missing from table
> "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData is incorrect, or you
> havent enabled "datanucleus.autoCreateColumns".
> Required columns missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps
> your MetaData is incorrect, or you havent enabled
> "datanucleus.autoCreateColumns".
> org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
> columns missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps your MetaData
> is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
> [stack trace identical to the one above; snipped]
>
> 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was thrown
> while adding/validating class(es) : Expected primary key for table
> `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in existing keys
> PRIMARY KEY (`SD_ID`)
> Expected primary key for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`)
> not found in existing keys PRIMARY KEY (`SD_ID`)
> org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException: Expected
> primary key for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found
> in existing keys PRIMARY KEY (`SD_ID`)
> [stack trace identical to the one above; snipped]
>
> I've debugged through the code and found that those missing columns were
> added by the DataNucleus JDO code. I'm not sure what this means; the rest
> of the Hive applications are working perfectly fine.
>
> Is there anything I should try to figure out what's going on? Or, more
> generally, is this the right way to get the schema of a Hive table?
>
> J
>