Posted to user@hive.apache.org by Something Something <ma...@gmail.com> on 2010/02/21 01:36:44 UTC

Starting HiveServer

I started HiveServer for the first time using instructions from the
following page:

http://wiki.apache.org/hadoop/Hive/HiveServer


1) bin/hive --service hiveserver
2) ant test -Dtestcase=TestJdbcDriver -Dstandalone=true

Getting this error:

 org.apache.thrift.TApplicationException: Invalid method name:
'getThriftSchema'
    [junit]     at
org.apache.thrift.TApplicationException.read(TApplicationException.java:107)
    [junit]     at
org.apache.hadoop.hive.service.ThriftHive$Client.recv_getThriftSchema(ThriftHive.java:247)
    [junit]     at
org.apache.hadoop.hive.service.ThriftHive$Client.getThriftSchema(ThriftHive.java:231)
    [junit]     at
org.apache.hadoop.hive.jdbc.HiveResultSet.initDynamicSerde(HiveResultSet.java:90)
    [junit]     at
org.apache.hadoop.hive.jdbc.HiveResultSet.<init>(HiveResultSet.java:77)
    [junit]     at
org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:201)
    [junit]     at
org.apache.hadoop.hive.jdbc.TestJdbcDriver.setUp(TestJdbcDriver.java:81)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:125)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at
junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:118)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:208)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:203)
    [junit]     at
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit]     at
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit]     at
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] Tests run: 9, Failures: 0, Errors: 9, Time elapsed: 3.867 sec
    [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED


I am looking into it, but if you know why this is happening, please let me
know.  Thanks.
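A Thrift TApplicationException with "Invalid method name" generally means the client's generated stubs expect a method the running server does not implement, i.e. version skew between the JDBC client jars and the server. As a sketch (both directory paths are examples, not taken from the report above; substitute your actual Hive install and build trees), listing the Hive jars visible to each side can confirm skew:

```shell
# Compare the hive jars the server runs against with the ones the
# JDBC test compiles against. Mismatched versions (e.g. a 0.5.x jar
# on one side and a 0.6.0 jar on the other) would explain the error.
for d in "$HIVE_HOME/lib" "$HOME/hive/hive-trunk/build/dist/lib"; do
  echo "== $d =="
  ls "$d"/hive-*.jar 2>/dev/null || echo "(no hive jars found)"
done
```

If the versions differ, rebuilding so that both sides use jars from the same tree is the usual fix.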

Re: Starting HiveServer

Posted by Alex Kozlov <al...@cloudera.com>.
I have seen errors like this when there are multiple copies of the same
library in the classpath, e.g. core-3.1.1.jar or libfb303.jar (just an
idea).  Try running the Hive server with the -hiveconf
hive.root.logger=DEBUG,console flag and look for library loading errors.
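Alex's duplicate-library check can be sketched in shell. The directory list below is an assumption (adjust it to whatever your server's classpath is actually built from):

```shell
# Print jar basenames that appear more than once across the lib
# directories the server classpath draws on. Any output is a
# candidate for the duplicate-library problem described above.
find "$HIVE_HOME/lib" "$HADOOP_HOME/lib" -name '*.jar' 2>/dev/null \
  | awk -F/ '{print $NF}' \
  | sort | uniq -d
```

Note this only catches same-named jars; two differently versioned copies (say core-3.0.jar vs core-3.1.1.jar) still need to be spotted by eye in the full listing.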

On Mon, Feb 22, 2010 at 10:23 PM, Something Something <
mailinglists19@gmail.com> wrote:

> Thanks, made some progress, but now getting this...
>
> 10/02/22 22:19:50 INFO Datastore.Schema: Initialising Catalog "", Schema
> "APP" using "SchemaTable" auto-start option
> 10/02/22 22:19:50 INFO DataNucleus.Persistence: Managing Persistence of
> org.apache.hadoop.hive.metastore.model.MDatabase since it was managed
> previously
> 10/02/22 22:19:50 INFO DataNucleus.MetaData: Registering listener for
> metadata initialisation
> 10/02/22 22:19:50 WARN DataNucleus.MetaData: MetaData Parser encountered an
> error in file
> "jar:file:/home/training/hive/hive-trunk/build/dist/lib/hive-metastore-0.6.0.jar!/package.jdo"
> at line 4, column 6 : cvc-elt.1: Cannot find the declaration of element
> 'jdo'. - Please check your specification of DTD and the validity of the
> MetaData XML that you have specified.
> 10/02/22 22:19:50 WARN DataNucleus.MetaData: MetaData Parser encountered an
> error in file
> "jar:file:/home/training/hive/hive-trunk/build/dist/lib/hive-metastore-0.6.0.jar!/package.jdo"
> at line 291, column 13 : The content of element type "class" must match
> "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
> - Please check your specification of DTD and the validity of the MetaData
> XML that you have specified.
> 10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
> Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
> InheritanceStrategy : new-table]
> 10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
> Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
> InheritanceStrategy : new-table]
> 10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
> Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
> InheritanceStrategy : new-table]
> 10/02/22 22:19:51 WARN DataNucleus.Persistence: Unknown Error during auto
> starter execution. : Exception thrown performing schema operation : Add
> classes to Catalog "", Schema "APP"
> Exception thrown performing schema operation : Add classes to Catalog "",
> Schema "APP"
> org.datanucleus.exceptions.NucleusDataStoreException: Exception thrown
> performing schema operation : Add classes to Catalog "", Schema "APP"
>     at
> org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:152)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
>     at
> org.datanucleus.store.AbstractStoreManager.initialiseAutoStart(AbstractStoreManager.java:609)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.initialiseSchema(RDBMSManager.java:821)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:394)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>     at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>     at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>     at
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
>     at
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
>     at
> org.datanucleus.store.FederationManager.initialiseStoreManager(FederationManager.java:106)
>     at
> org.datanucleus.store.FederationManager.<init>(FederationManager.java:68)
>     at
> org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:152)
>     at
> org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:529)
>     at
> org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:175)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
>     at
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
>     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
>     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:163)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:180)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:121)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:99)
>     at
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>     at
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:145)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
>     at
> org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
>     at
> org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
>     at
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
>     at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>     at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>     at java.lang.Thread.run(Thread.java:619)
> Caused by: java.sql.SQLNonTransientConnectionException: No current
> connection.
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
> Source)
>     at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
>     at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown
> Source)
>     at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown
> Source)
>     at
> org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
>     ... 42 more
> Caused by: java.sql.SQLException: No current connection.
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
> Source)
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
> Source)
>     ... 49 more
> Nested Throwables StackTrace:
> java.sql.SQLNonTransientConnectionException: No current connection.
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
> Source)
>     at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
>     at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown
> Source)
>     at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown
> Source)
>     at
> org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
>     at
> org.datanucleus.store.AbstractStoreManager.initialiseAutoStart(AbstractStoreManager.java:609)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.initialiseSchema(RDBMSManager.java:821)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:394)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>     at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>     at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>     at
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
>     at
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
>     at
> org.datanucleus.store.FederationManager.initialiseStoreManager(FederationManager.java:106)
>     at
> org.datanucleus.store.FederationManager.<init>(FederationManager.java:68)
>     at
> org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:152)
>     at
> org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:529)
>     at
> org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:175)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
>     at
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
>     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
>     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:163)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:180)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:121)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:99)
>     at
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>     at
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:145)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
>     at
> org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
>     at
> org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
>     at
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
>     at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>     at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>     at java.lang.Thread.run(Thread.java:619)
> Caused by: java.sql.SQLException: No current connection.
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
> Source)
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
> Source)
>     ... 49 more
>
> 10/02/22 22:19:51 WARN DataNucleus.Persistence: Illegal state of AutoStart,
> disabling it. To enable it, resolve earlier errors.
> 10/02/22 22:19:51 INFO Datastore.Schema: Catalog "", Schema "APP"
> initialised - managing 0 classes
> 10/02/22 22:19:51 INFO metastore.ObjectStore: Initialized ObjectStore
> 10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
> Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
> InheritanceStrategy : new-table]
> 10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
> Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
> InheritanceStrategy : new-table]
> 10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
> Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
> InheritanceStrategy : new-table]
> 10/02/22 22:19:51 ERROR server.TThreadPoolServer: Error occurred during
> processing of message.
> java.lang.RuntimeException: javax.jdo.JDODataStoreException: Exception
> thrown performing schema operation : Add classes to Catalog "", Schema "APP"
> NestedThrowables:
> java.sql.SQLNonTransientConnectionException: No current connection.
>     at
> org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:368)
>     at
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
>     at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>     at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>     at java.lang.Thread.run(Thread.java:619)
> Caused by: javax.jdo.JDODataStoreException: Exception thrown performing
> schema operation : Add classes to Catalog "", Schema "APP"
> NestedThrowables:
> java.sql.SQLNonTransientConnectionException: No current connection.
>     at
> org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:289)
>     at
> org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:3741)
>     at
> org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
>     at
> org.datanucleus.store.rdbms.query.QueryCompiler.executionCompile(QueryCompiler.java:312)
>     at
> org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
>     at
> org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:174)
>     at org.datanucleus.store.query.Query.executeQuery(Query.java:1443)
>     at
> org.datanucleus.store.rdbms.query.JDOQLQuery.executeQuery(JDOQLQuery.java:244)
>     at org.datanucleus.store.query.Query.executeWithArray(Query.java:1357)
>     at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:242)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:293)
>     at
> org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:312)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
>     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
>     at
> org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
>     at
> org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
>     ... 4 more
> Caused by: java.sql.SQLNonTransientConnectionException: No current
> connection.
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
> Source)
>     at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
>     at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown
> Source)
>     at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown
> Source)
>     at
> org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
>     at
> org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:691)
>     at
> org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:358)
>     at
> org.datanucleus.store.rdbms.RDBMSManager.getExtent(RDBMSManager.java:1344)
>     at
> org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:3736)
>     ... 19 more
> Caused by: java.sql.SQLException: No current connection.
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
> Source)
>     at
> org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
> Source)
>     ... 32 more
>
>
> On Mon, Feb 22, 2010 at 5:39 PM, Carl Steinbach <ca...@cloudera.com> wrote:
>
>> Ant is finding an older version of Ivy and using it instead of the newer
>> copy that the Hive build script automatically downloads. The older copy of
>> Ivy is probably located somewhere under $HOME/.ant and/or in $ANT_HOME/lib.
>> You need to locate these old versions of the ivy jar and remove them. Then
>> run the build script again and everything should work. Optionally, you can
>> also copy the new version of ivy located in hive-trunk/build/ivy/lib to the
>> places where you previously found the old versions of Ivy (do this after
>> building Hive).
>>
>> Carl
>>
>>
>>
>>
>> On Mon, Feb 22, 2010 at 5:20 PM, Something Something <
>> mailinglists19@gmail.com> wrote:
>>
>>> Carl is right.  I have /user/hive/warehouse and /tmp created under HDFS.
>>>
>>> Carl - I am trying to follow your instructions.  Getting this...
>>>
>>> /home/training/hive/hive-trunk/build-common.xml:180: impossible to
>>> configure ivy:settings with given file:
>>> /home/training/hive/hive-trunk/ivy/ivysettings.xml :
>>> java.text.ParseException: failed to load settings from
>>> file:/home/training/hive/hive-trunk/ivy/ivysettings.xml: impossible to set
>>> defaultTTL to eternal on class
>>> org.apache.ivy.core.cache.DefaultRepositoryCacheManager
>>>
>>>
>>> when I run:  ant package -Dhadoop.version=0.20.1
>>>
>>> Any ideas?
>>>
>>>
>>> On Mon, Feb 22, 2010 at 5:12 PM, Carl Steinbach <ca...@cloudera.com>wrote:
>>>
>>>>
>>>>> I think the problem is that the Hive server sets hive.metastore.warehouse.dir
>>>>> to /user/hive/warehouse, so we have to create the directory before running
>>>>> TestJdbcDriver and TestHiveServer.
>>>>>
>>>>
>>>> Based on the output of HiveServer that Something Something posted, it looks
>>>> like /user/hive/warehouse already exists:
>>>>
>>>>
>>>> > 10/02/21 11:25:16 INFO metastore.warehouse: Deleted the diretory
>>>> > hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
>>>>
>>>> I also don't see any errors in the HiveServer output, which makes me
>>>> think that the problem is due to library skew on the client side.
>>>>
>>>> Carl
>>>>
>>>
>>>
>>
>
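Carl's Ivy advice quoted above amounts to locating stale Ivy jars before rebuilding. A minimal sketch, assuming the two default Ant locations he names (the fallback path for ANT_HOME is a guess; use your actual install):

```shell
# List old Ivy jars that Ant may load ahead of the copy the Hive
# build script downloads. Remove what this turns up (or overwrite it
# with the jar from hive-trunk/build/ivy/lib after building Hive),
# then re-run the build.
find "$HOME/.ant" "${ANT_HOME:-/usr/share/ant}/lib" \
  -name 'ivy*.jar' 2>/dev/null
```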

Re: Starting HiveServer

Posted by Something Something <ma...@gmail.com>.
Thanks, made some progress, but now getting this...

10/02/22 22:19:50 INFO Datastore.Schema: Initialising Catalog "", Schema
"APP" using "SchemaTable" auto-start option
10/02/22 22:19:50 INFO DataNucleus.Persistence: Managing Persistence of
org.apache.hadoop.hive.metastore.model.MDatabase since it was managed
previously
10/02/22 22:19:50 INFO DataNucleus.MetaData: Registering listener for
metadata initialisation
10/02/22 22:19:50 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/training/hive/hive-trunk/build/dist/lib/hive-metastore-0.6.0.jar!/package.jdo"
at line 4, column 6 : cvc-elt.1: Cannot find the declaration of element
'jdo'. - Please check your specification of DTD and the validity of the
MetaData XML that you have specified.
10/02/22 22:19:50 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/training/hive/hive-trunk/build/dist/lib/hive-metastore-0.6.0.jar!/package.jdo"
at line 291, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
InheritanceStrategy : new-table]
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
InheritanceStrategy : new-table]
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of
Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
InheritanceStrategy : new-table]
10/02/22 22:19:51 WARN DataNucleus.Persistence: Unknown Error during auto
starter execution. : Exception thrown performing schema operation : Add
classes to Catalog "", Schema "APP"
Exception thrown performing schema operation : Add classes to Catalog "",
Schema "APP"
org.datanucleus.exceptions.NucleusDataStoreException: Exception thrown
performing schema operation : Add classes to Catalog "", Schema "APP"
    at
org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:152)
    at
org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
    at
org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
    at
org.datanucleus.store.AbstractStoreManager.initialiseAutoStart(AbstractStoreManager.java:609)
    at
org.datanucleus.store.rdbms.RDBMSManager.initialiseSchema(RDBMSManager.java:821)
    at
org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:394)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
    at
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at
org.datanucleus.store.FederationManager.initialiseStoreManager(FederationManager.java:106)
    at
org.datanucleus.store.FederationManager.<init>(FederationManager.java:68)
    at
org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:152)
    at
org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:529)
    at
org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:175)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
    at
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at
org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:163)
    at
org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:180)
    at
org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:121)
    at
org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:99)
    at
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:145)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
    at
org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
    at
org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
    at
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
    at
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)
Caused by: java.sql.SQLNonTransientConnectionException: No current
connection.
    at
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown
Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown
Source)
    at
org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
    ... 42 more
Caused by: java.sql.SQLException: No current connection.
    at
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
Source)
    at
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
Source)
    ... 49 more
Nested Throwables StackTrace:
java.sql.SQLNonTransientConnectionException: No current connection.
    at
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown
Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown
Source)
    at
org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
    at
org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
    at
org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
    at
org.datanucleus.store.AbstractStoreManager.initialiseAutoStart(AbstractStoreManager.java:609)
    at
org.datanucleus.store.rdbms.RDBMSManager.initialiseSchema(RDBMSManager.java:821)
    at org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:394)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.store.FederationManager.initialiseStoreManager(FederationManager.java:106)
    at org.datanucleus.store.FederationManager.<init>(FederationManager.java:68)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:152)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:529)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:175)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:163)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:180)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:121)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:99)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:145)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)
Caused by: java.sql.SQLException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 49 more

10/02/22 22:19:51 WARN DataNucleus.Persistence: Illegal state of AutoStart, disabling it. To enable it, resolve earlier errors.
10/02/22 22:19:51 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
10/02/22 22:19:51 INFO metastore.ObjectStore: Initialized ObjectStore
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 ERROR server.TThreadPoolServer: Error occurred during processing of message.
java.lang.RuntimeException: javax.jdo.JDODataStoreException: Exception thrown performing schema operation : Add classes to Catalog "", Schema "APP"
NestedThrowables:
java.sql.SQLNonTransientConnectionException: No current connection.
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:368)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)
Caused by: javax.jdo.JDODataStoreException: Exception thrown performing schema operation : Add classes to Catalog "", Schema "APP"
NestedThrowables:
java.sql.SQLNonTransientConnectionException: No current connection.
    at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:289)
    at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:3741)
    at org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
    at org.datanucleus.store.rdbms.query.QueryCompiler.executionCompile(QueryCompiler.java:312)
    at org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:174)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1443)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.executeQuery(JDOQLQuery.java:244)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1357)
    at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:242)
    at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:293)
    at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:312)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
    ... 4 more
Caused by: java.sql.SQLNonTransientConnectionException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown Source)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
    at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
    at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
    at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:691)
    at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:358)
    at org.datanucleus.store.rdbms.RDBMSManager.getExtent(RDBMSManager.java:1344)
    at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:3736)
    ... 19 more
Caused by: java.sql.SQLException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 32 more

On Mon, Feb 22, 2010 at 5:39 PM, Carl Steinbach <ca...@cloudera.com> wrote:

> Ant is finding an older version of Ivy and using it instead of the newer
> copy that the Hive build script automatically downloads. The older copy of
> Ivy is probably located somewhere under $HOME/.ant and/or in $ANT_HOME/lib.
> You need to locate these old versions of the ivy jar and remove them. Then
> run the build script again and everything should work. Optionally, you can
> also copy the new version of ivy located in hive-trunk/build/ivy/lib to the
> places where you previously found the old versions of Ivy (do this after
> building Hive).
>
> Carl
>
>
>
>
> On Mon, Feb 22, 2010 at 5:20 PM, Something Something <
> mailinglists19@gmail.com> wrote:
>
>> Carl is right.  I have /user/hive/warehouse and /tmp created under HDFS.
>>
>> Carl - I am trying to follow your instructions.  Getting this...
>>
>> /home/training/hive/hive-trunk/build-common.xml:180: impossible to
>> configure ivy:settings with given file:
>> /home/training/hive/hive-trunk/ivy/ivysettings.xml :
>> java.text.ParseException: failed to load settings from
>> file:/home/training/hive/hive-trunk/ivy/ivysettings.xml: impossible to set
>> defaultTTL to eternal on class
>> org.apache.ivy.core.cache.DefaultRepositoryCacheManager
>>
>>
>> when I run:  ant package -Dhadoop.version=0.20.1
>>
>> Any ideas?
>>
>>
>> On Mon, Feb 22, 2010 at 5:12 PM, Carl Steinbach <ca...@cloudera.com>wrote:
>>
>>>
>>> I think the problem is that Hive server set hive.metastore.warehouse.dir
>>>> to /user/hive/warehouse. So we have to create the directory before running
>>>> TestJdbcDriver and TestHiveServer.
>>>>
>>>
>>> Based on the output of HiveServer that something posted it looks like
>>> /user/hive/warehouse already exists:
>>>
>>>
>>>
>>>
>>> > 10/02/21 11:25:16 INFO metastore.warehouse: Deleted the diretory hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
>>>
>>> I also don't see any errors in the HiveServer output, which makes me
>>> think that the problem is due to library skew on the client side.
>>>
>>> Carl
>>>
>>
>>
>

Re: Starting HiveServer

Posted by Carl Steinbach <ca...@cloudera.com>.
Ant is finding an older version of Ivy and using it instead of the newer
copy that the Hive build script automatically downloads. The older copy of
Ivy is probably located somewhere under $HOME/.ant and/or in $ANT_HOME/lib.
You need to locate these old versions of the ivy jar and remove them. Then
run the build script again and everything should work. Optionally, you can
also copy the new version of ivy located in hive-trunk/build/ivy/lib to the
places where you previously found the old versions of Ivy (do this after
building Hive).
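
The locate-and-remove step could be sketched like this (a sketch only: the scratch directory stands in for $HOME so the snippet is safe to run as-is, and the jar filename is made up; in practice you would point find at $HOME/.ant and $ANT_HOME/lib):

```shell
# Sketch of the cleanup: stale Ivy jars under ~/.ant/lib or $ANT_HOME/lib
# shadow the copy the Hive build downloads. A scratch directory stands in
# for $HOME here; the jar name below is an example, not a real version.
scratch=$(mktemp -d)
mkdir -p "$scratch/.ant/lib"
touch "$scratch/.ant/lib/ivy-1.4.1.jar"   # pretend this is the old copy

# 1. Locate candidate jars in the places Ant searches.
find "$scratch/.ant" -name 'ivy*.jar'

# 2. Remove the old copies once you have confirmed their versions.
rm "$scratch/.ant/lib/ivy-1.4.1.jar"
```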

Carl



On Mon, Feb 22, 2010 at 5:20 PM, Something Something <
mailinglists19@gmail.com> wrote:

> Carl is right.  I have /user/hive/warehouse and /tmp created under HDFS.
>
> Carl - I am trying to follow your instructions.  Getting this...
>
> /home/training/hive/hive-trunk/build-common.xml:180: impossible to
> configure ivy:settings with given file:
> /home/training/hive/hive-trunk/ivy/ivysettings.xml :
> java.text.ParseException: failed to load settings from
> file:/home/training/hive/hive-trunk/ivy/ivysettings.xml: impossible to set
> defaultTTL to eternal on class
> org.apache.ivy.core.cache.DefaultRepositoryCacheManager
>
>
> when I run:  ant package -Dhadoop.version=0.20.1
>
> Any ideas?
>
>
> On Mon, Feb 22, 2010 at 5:12 PM, Carl Steinbach <ca...@cloudera.com> wrote:
>
>>
>> I think the problem is that Hive server set hive.metastore.warehouse.dir
>>> to /user/hive/warehouse. So we have to create the directory before running
>>> TestJdbcDriver and TestHiveServer.
>>>
>>
>> Based on the output of HiveServer that something posted it looks like
>> /user/hive/warehouse already exists:
>>
>>
>>
>>
>> > 10/02/21 11:25:16 INFO metastore.warehouse: Deleted the diretory hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
>>
>> I also don't see any errors in the HiveServer output, which makes me think
>> that the problem is due to library skew on the client side.
>>
>> Carl
>>
>
>

Re: Starting HiveServer

Posted by Something Something <ma...@gmail.com>.
Carl is right.  I have /user/hive/warehouse and /tmp created under HDFS.

Carl - I am trying to follow your instructions.  Getting this...

/home/training/hive/hive-trunk/build-common.xml:180: impossible to configure
ivy:settings with given file:
/home/training/hive/hive-trunk/ivy/ivysettings.xml :
java.text.ParseException: failed to load settings from
file:/home/training/hive/hive-trunk/ivy/ivysettings.xml: impossible to set
defaultTTL to eternal on class
org.apache.ivy.core.cache.DefaultRepositoryCacheManager


when I run:  ant package -Dhadoop.version=0.20.1

Any ideas?

On Mon, Feb 22, 2010 at 5:12 PM, Carl Steinbach <ca...@cloudera.com> wrote:

>
> I think the problem is that Hive server set hive.metastore.warehouse.dir to
>> /user/hive/warehouse. So we have to create the directory before running
>> TestJdbcDriver and TestHiveServer.
>>
>
> Based on the output of HiveServer that something posted it looks like
> /user/hive/warehouse already exists:
>
>
>
>
> > 10/02/21 11:25:16 INFO metastore.warehouse: Deleted the diretory hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
>
> I also don't see any errors in the HiveServer output, which makes me think
> that the problem is due to library skew on the client side.
>
> Carl
>

Re: Starting HiveServer

Posted by Carl Steinbach <ca...@cloudera.com>.
> I think the problem is that Hive server set hive.metastore.warehouse.dir to
> /user/hive/warehouse. So we have to create the directory before running
> TestJdbcDriver and TestHiveServer.
>

Based on the output of HiveServer that something posted it looks like
/user/hive/warehouse already exists:



> 10/02/21 11:25:16 INFO metastore.warehouse: Deleted the diretory hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable

I also don't see any errors in the HiveServer output, which makes me think
that the problem is due to library skew on the client side.

Carl

Re: Starting HiveServer

Posted by Ning Zhang <nz...@facebook.com>.
I think the problem is that the Hive server sets hive.metastore.warehouse.dir to /user/hive/warehouse, so we have to create the directory before running TestJdbcDriver and TestHiveServer.

I don't think this was required previously, but I haven't found which patch made the change.
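
The pre-creation step might look like this (a sketch assuming the default value of hive.metastore.warehouse.dir, a running HDFS, and the hadoop command on the PATH; adjust the paths if your configuration differs):

```shell
# Sketch: pre-create the scratch and warehouse directories in HDFS before
# running the tests, and make them group-writable. Paths assume the
# default hive.metastore.warehouse.dir of /user/hive/warehouse.
hadoop fs -mkdir /tmp
hadoop fs -mkdir /user/hive/warehouse
hadoop fs -chmod g+w /tmp
hadoop fs -chmod g+w /user/hive/warehouse
```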

Ning

On Feb 22, 2010, at 3:59 PM, Carl Steinbach wrote:

The following steps work for me:

# Checkout and build Hive trunk:
% cd $HOME
% mkdir workspace
% cd workspace
% svn co http://svn.apache.org/repos/asf/hadoop/hive/trunk hive-trunk
% cd hive-trunk
% ant package -Dhadoop.version=0.20.1

# Run hive server:
% cd $HOME/workspace
% export HADOOP_HOME=<root of your 0.20.1 install>
% export PATH=$HADOOP_HOME/bin:$PATH
% export HIVE_HOME=$HOME/workspace/hive-trunk/build/dist
% export PATH=$HIVE_HOME/bin:$PATH
% hive --service hiveserver


# Run Jdbc tests:
% cd $HOME/workspace/hive-trunk
% export HADOOP_HOME=<root of your 0.20.1 install>
% export PATH=$HADOOP_HOME/bin:$PATH
% export HIVE_HOME=$HOME/workspace/hive-trunk/build/dist
% export PATH=$HIVE_HOME/bin:$PATH
% ant test -Dtestcase=TestJdbcDriver -Dstandalone=true -Dhadoop.version=0.20.1


On Mon, Feb 22, 2010 at 3:16 PM, Something Something <ma...@gmail.com>> wrote:
Ning - Yes, I tested TestJdbcDriver without -Dstandalone=true and that passes.

Carl - How did you get it to work?  Any suggestions for me?


On Mon, Feb 22, 2010 at 3:01 PM, Carl Steinbach <ca...@cloudera.com>> wrote:

Under Hive, though, it's in 6 different places:

This is normal.

and libfb303.jar is in 4 places:

This is also normal.

I have Hadoop running from outside Hive, from directory, /home/training/hadoop-0.20.1 (because I had it installed previously).  Is that okay?

This shouldn't be a problem either.

I was able to get TestHiveServer and TestJdbcDriver to pass in standalone mode after fixing an unrelated problem with my VCS.

Ning, what problem are you running into?

Carl

Re: Starting HiveServer

Posted by Carl Steinbach <ca...@cloudera.com>.
The following steps work for me:

# Checkout and build Hive trunk:
% cd $HOME
% mkdir workspace
% cd workspace
% svn co http://svn.apache.org/repos/asf/hadoop/hive/trunk hive-trunk
% cd hive-trunk
% ant package -Dhadoop.version=0.20.1

# Run hive server:
% cd $HOME/workspace
% export HADOOP_HOME=<root of your 0.20.1 install>
% export PATH=$HADOOP_HOME/bin:$PATH
% export HIVE_HOME=$HOME/workspace/hive-trunk/build/dist
% export PATH=$HIVE_HOME/bin:$PATH
% hive --service hiveserver


# Run Jdbc tests:
% cd $HOME/workspace/hive-trunk
% export HADOOP_HOME=<root of your 0.20.1 install>
% export PATH=$HADOOP_HOME/bin:$PATH
% export HIVE_HOME=$HOME/workspace/hive-trunk/build/dist
% export PATH=$HIVE_HOME/bin:$PATH
% ant test -Dtestcase=TestJdbcDriver -Dstandalone=true -Dhadoop.version=0.20.1


On Mon, Feb 22, 2010 at 3:16 PM, Something Something <
mailinglists19@gmail.com> wrote:

> Ning - Yes, I tested TestJdbcDriver without -Dstandalone=true and that
> passes.
>
> Carl - How did you get it to work?  Any suggestions for me?
>
>
> On Mon, Feb 22, 2010 at 3:01 PM, Carl Steinbach <ca...@cloudera.com> wrote:
>
>>
>> Under Hive, though, it's in 6 different places:
>>>
>>
>> This is normal.
>>
>> and libfb303.jar is in 4 places:
>>>
>>
>> This is also normal.
>>
>>
>>> I have Hadoop running from outside Hive, from directory,
>>> /home/training/hadoop-0.20.1 (because I had it installed previously).  Is
>>> that okay?
>>>
>>
>> This shouldn't be a problem either.
>>
>> I was able to get TestHiveServer and TestJdbcDriver to pass in standalone
>> mode after fixing an unrelated problem with my VCS.
>>
>> Ning, what problem are you running into?
>>
>> Carl
>>
>>
>>
>

Re: Starting HiveServer

Posted by Something Something <ma...@gmail.com>.
Ning - Yes, I tested TestJdbcDriver without -Dstandalone=true and that
passes.

Carl - How did you get it to work?  Any suggestions for me?

On Mon, Feb 22, 2010 at 3:01 PM, Carl Steinbach <ca...@cloudera.com> wrote:

>
> Under Hive, though, it's in 6 different places:
>>
>
> This is normal.
>
> and libfb303.jar is in 4 places:
>>
>
> This is also normal.
>
>
>> I have Hadoop running from outside Hive, from directory,
>> /home/training/hadoop-0.20.1 (because I had it installed previously).  Is
>> that okay?
>>
>
> This shouldn't be a problem either.
>
> I was able to get TestHiveServer and TestJdbcDriver to pass in standalone
> mode after fixing an unrelated problem with my VCS.
>
> Ning, what problem are you running into?
>
> Carl
>
>
>

Re: Starting HiveServer

Posted by Carl Steinbach <ca...@cloudera.com>.
> Under Hive, though, it's in 6 different places:
>

This is normal.

and libfb303.jar is in 4 places:
>

This is also normal.


> I have Hadoop running from outside Hive, from directory,
> /home/training/hadoop-0.20.1 (because I had it installed previously).  Is
> that okay?
>

This shouldn't be a problem either.

I was able to get TestHiveServer and TestJdbcDriver to pass in standalone
mode after fixing an unrelated problem with my VCS.

Ning, what problem are you running into?

Carl

Re: Starting HiveServer

Posted by Ning Zhang <nz...@facebook.com>.
The libthrift.jar under hadoop/contrib may be the cause of the problem. Multiple copies of libthrift.jar or libfb303.jar are OK: hive/lib/libthrift.jar should be the same as build/dist/lib/libthrift.jar. What's important is that there is one copy under build/dist, which is the Hive installation directory. We probably should remove all the other jar files in hadoop*/src/contrib.
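
A quick audit for version skew among the copies might look like this (a sketch only: the scratch layout stands in for the real Hadoop and Hive trees so it runs anywhere; in practice you would point find at your home directory and compare the checksums it reports):

```shell
# Sketch: list every libthrift.jar / libfb303.jar and compare checksums;
# copies with differing sums indicate version skew. The scratch layout
# below is a stand-in for the real Hadoop/Hive installation trees.
scratch=$(mktemp -d)
mkdir -p "$scratch/hive/build/dist/lib" "$scratch/hadoop/lib"
echo new > "$scratch/hive/build/dist/lib/libthrift.jar"
echo old > "$scratch/hadoop/lib/libthrift.jar"   # stale copy, different content

# 1. Find all copies on disk.
find "$scratch" -name 'libthrift.jar' -o -name 'libfb303.jar'

# 2. Checksum them; differing sums mean the jars are not the same build.
cksum $(find "$scratch" -name 'libthrift.jar')
```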

BTW, have you tested TestJdbcDriver without -Dstandalone=true? If so did that pass?

Thanks,
Ning


On Feb 22, 2010, at 2:35 PM, Something Something wrote:

Thanks for looking into this issue.  There was a libthrift.jar under hadoop/contrib, but I removed it.  The libfb303.jar wasn't in Hadoop.

Under Hive, though, it's in 6 different places:

training@training-vm:~$ find . -name 'libthrift.jar'
./hive/build/dist/lib/libthrift.jar
./hive/build/hadoopcore/hadoop-0.19.0/src/contrib/hive/lib/libthrift.jar
./hive/build/hadoopcore/hadoop-0.19.0/src/contrib/thriftfs/lib/libthrift.jar
./hive/build/hadoopcore/hadoop-0.19.0/contrib/hive/lib/libthrift.jar
./hive/build/hadoopcore/hadoop-0.20.0/src/contrib/thriftfs/lib/libthrift.jar
./hive/lib/libthrift.jar


and libfb303.jar is in 4 places:

training@training-vm:~$ find . -name 'libfb303.jar'
./hive/build/dist/lib/libfb303.jar
./hive/build/hadoopcore/hadoop-0.19.0/src/contrib/hive/lib/libfb303.jar
./hive/build/hadoopcore/hadoop-0.19.0/contrib/hive/lib/libfb303.jar
./hive/lib/libfb303.jar


I have Hadoop running from outside Hive, from directory, /home/training/hadoop-0.20.1 (because I had it installed previously).  Is that okay?


On Mon, Feb 22, 2010 at 2:15 PM, Ning Zhang <nz...@facebook.com>> wrote:
I ran into a different error when running Jdbc test on standalone mode. I'm looking into that issue. It seems your error is due to thrift connection. Can you double check if you have another version of libthrift.jar or libfb303.jar in your classpath? This could be true if you have these two jars in your hadoop's lib directory.

Thanks,
Ning

On Feb 22, 2010, at 12:50 PM, Something Something wrote:

I used this command:  svn co http://svn.apache.org/repos/asf/hadoop/hive/trunk hive
So, AFAIK I got it from trunk around Sat, Feb 20, 2010 at 4:00 PM PST.

I also tried http://svn.apache.org/repos/asf/hadoop/hive/tags/release-0.5.0-rc1/      yesterday (Sunday afternoon), but ran into the same issue.

I have HIVE_HOME set to /home/training/hive, so I am running both commands from hive's root (installation) directory.

I am not getting the error message that you are getting.  It could be because I made the changes suggested by Vidyasagar in this email thread:  http://www.mail-archive.com/hive-user@hadoop.apache.org/msg02535.html

Greatly appreciate your help with this.  If I can't access Hive from a Java program I can't really use Hive so I am stuck at this point (unless of course I fire up the IDE and start debugging the Thrift code).


On Mon, Feb 22, 2010 at 12:05 PM, Carl Steinbach <ca...@cloudera.com>> wrote:
Hey,

I tried running the test on trunk and ran into this issue: http://issues.apache.org/jira/browse/HIVE-1188

Since you appear to be getting a little farther along than this I doubt that you are actually running the test on trunk (though it's possible that you are running on an older copy of trunk). When was the last time you updated your svn workspace? Also, which directory were you in when you ran "bin/hive --service hiveserver" and "ant test -Dtestcase=TestJdbcDriver"?

Thanks.

Carl


On Sun, Feb 21, 2010 at 11:29 AM, Something Something <ma...@gmail.com>> wrote:
I am following instructions on 'Getting Started' (http://wiki.apache.org/hadoop/Hive/GettingStarted), so I am getting from the trunk.   No error messages in Hiveserver log.

This is what I see:

10/02/21 11:25:16 INFO ql.Driver: OK
10/02/21 11:25:16 INFO service.HiveServer: Running the query: drop table testHiveDriverTable
10/02/21 11:25:16 INFO ql.Driver: Starting command: drop table testHiveDriverTable
10/02/21 11:25:16 INFO parse.ParseDriver: Parsing command: drop table testHiveDriverTable
10/02/21 11:25:16 INFO parse.ParseDriver: Parse Completed
10/02/21 11:25:16 INFO ql.Driver: Semantic Analysis Completed
10/02/21 11:25:16 INFO metastore.HiveMetaStore: 9: drop_table : db=default tbl=testHiveDriverTable
10/02/21 11:25:16 INFO metastore.HiveMetaStore: 9: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
10/02/21 11:25:16 INFO metastore.ObjectStore: ObjectStore, initialize called
10/02/21 11:25:16 INFO metastore.ObjectStore: Initialized ObjectStore
10/02/21 11:25:16 INFO metastore.HiveMetaStore: 9: get_table : db=default tbl=testHiveDriverTable
10/02/21 11:25:16 INFO metastore.warehouse: deleting  hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
10/02/21 11:25:16 INFO metastore.warehouse: Deleted the diretory hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
OK
10/02/21 11:25:16 INFO ql.Driver: OK
10/02/21 11:25:16 INFO service.HiveServer: Running the query: create table testHiveDriverTable (key int, value string)
10/02/21 11:25:16 INFO ql.Driver: Starting command: create table testHiveDriverTable (key int, value string)
10/02/21 11:25:17 INFO parse.ParseDriver: Parsing command: create table testHiveDriverTable (key int, value string)
10/02/21 11:25:17 INFO parse.ParseDriver: Parse Completed
10/02/21 11:25:17 INFO parse.DDLSemanticAnalyzer: Creating tabletestHiveDriverTable
10/02/21 11:25:17 INFO ql.Driver: Semantic Analysis Completed
10/02/21 11:25:17 INFO exec.DDLTask: Default to LazySimpleSerDe for table testHiveDriverTable
10/02/21 11:25:17 INFO hive.log: DDL: struct testHiveDriverTable { i32 key, string value}
10/02/21 11:25:17 INFO metastore.HiveMetaStore: 9: create_table: db=default tbl=testHiveDriverTable
10/02/21 11:25:17 INFO metastore.HiveMetaStore: 9: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
10/02/21 11:25:17 INFO metastore.ObjectStore: ObjectStore, initialize called
10/02/21 11:25:17 INFO metastore.ObjectStore: Initialized ObjectStore
10/02/21 11:25:17 INFO metastore.HiveMetaStore: 9: get_table : db=default tbl=testHiveDriverTable
OK
10/02/21 11:25:17 INFO ql.Driver: OK

On Sat, Feb 20, 2010 at 11:01 PM, Carl Steinbach <ca...@cloudera.com>> wrote:
Which version of Hive are you using? Also, what does the log output of the HiveServer process look like?

Thanks.

Carl


On Sat, Feb 20, 2010 at 4:36 PM, Something Something <ma...@gmail.com>> wrote:
I started HiveServer for the first time using instructions from the following page:

http://wiki.apache.org/hadoop/Hive/HiveServer


1) bin/hive --service hiveserver
2)   ant test -Dtestcase=TestJdbcDriver -Dstandalone=true

Getting this error:

 org.apache.thrift.TApplicationException: Invalid method name: 'getThriftSchema'
    [junit]     at org.apache.thrift.TApplicationException.read(TApplicationException.java:107)
    [junit]     at org.apache.hadoop.hive.service.ThriftHive$Client.recv_getThriftSchema(ThriftHive.java:247)
    [junit]     at org.apache.hadoop.hive.service.ThriftHive$Client.getThriftSchema(ThriftHive.java:231)
    [junit]     at org.apache.hadoop.hive.jdbc.HiveResultSet.initDynamicSerde(HiveResultSet.java:90)
    [junit]     at org.apache.hadoop.hive.jdbc.HiveResultSet.<init>(HiveResultSet.java:77)
    [junit]     at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:201)
    [junit]     at org.apache.hadoop.hive.jdbc.TestJdbcDriver.setUp(TestJdbcDriver.java:81)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:125)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:118)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:208)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:203)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] Tests run: 9, Failures: 0, Errors: 9, Time elapsed: 3.867 sec
    [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED


I am looking into it, but if you know why this is happening please let me know.  Thanks.

Re: Starting HiveServer

Posted by Something Something <ma...@gmail.com>.
Thanks for looking into this issue.  There was a libthrift.jar under
hadoop/contrib, but I removed it.  The libfb303.jar wasn't in Hadoop.

Under Hive, though, it's in 6 different places:

training@training-vm:~$ find . -name 'libthrift.jar'
./hive/build/dist/lib/libthrift.jar
./hive/build/hadoopcore/hadoop-0.19.0/src/contrib/hive/lib/libthrift.jar
./hive/build/hadoopcore/hadoop-0.19.0/src/contrib/thriftfs/lib/libthrift.jar
./hive/build/hadoopcore/hadoop-0.19.0/contrib/hive/lib/libthrift.jar
./hive/build/hadoopcore/hadoop-0.20.0/src/contrib/thriftfs/lib/libthrift.jar
./hive/lib/libthrift.jar


and libfb303.jar is in 4 places:

training@training-vm:~$ find . -name 'libfb303.jar'
./hive/build/dist/lib/libfb303.jar
./hive/build/hadoopcore/hadoop-0.19.0/src/contrib/hive/lib/libfb303.jar
./hive/build/hadoopcore/hadoop-0.19.0/contrib/hive/lib/libfb303.jar
./hive/lib/libfb303.jar


I have Hadoop running from outside Hive, from directory,
/home/training/hadoop-0.20.1 (because I had it installed previously).  Is
that okay?


On Mon, Feb 22, 2010 at 2:15 PM, Ning Zhang <nz...@facebook.com> wrote:

> I ran into a different error when running Jdbc test on standalone mode. I'm
> looking into that issue. It seems your error is due to thrift connection.
> Can you double check if you have another version of libthrift.jar or
> libfb303.jar in your classpath? This could be true if you have these two
> jars in your hadoop's lib directory.
>
> Thanks,
> Ning
>
> On Feb 22, 2010, at 12:50 PM, Something Something wrote:
>
> I used this command:  svn co
> http://svn.apache.org/repos/asf/hadoop/hive/trunk hive
> So, AFAIK I got it from trunk around Sat, Feb 20, 2010 at 4:00 PM PST.
>
> I also tried
> http://svn.apache.org/repos/asf/hadoop/hive/tags/release-0.5.0-rc1/
>  yesterday (Sunday afternoon), but ran into the same issue.
>
> I have HIVE_HOME set to /home/training/hive, so I am running both commands
> from hive's root (installation) directory.
>
> I am not getting the error message that you are getting.  It could be
> because I made the changes suggested by Vidyasagar in this email thread:
> http://www.mail-archive.com/hive-user@hadoop.apache.org/msg02535.html
>
> Greatly appreciate your help with this.  If I can't access Hive from a Java
> program I can't really use Hive so I am stuck at this point (unless of
> course I fire up the IDE and start debugging the Thrift code).
>
>
> On Mon, Feb 22, 2010 at 12:05 PM, Carl Steinbach <ca...@cloudera.com>wrote:
>
>> Hey,
>>
>> I tried running the test on trunk and ran into this issue:
>> http://issues.apache.org/jira/browse/HIVE-1188
>>
>> Since you appear to be getting a little farther along than this I doubt
>> that you are actually running the test on trunk (though it's possible that
>> you are running on an older copy of trunk). When was the last time you
>> updated your svn workspace? Also, which directory were you in when you ran
>> "bin/hive --service hiveserver" and "ant test -Dtestcase=TestJdbcDriver"?
>>
>> Thanks.
>>
>> Carl
>>
>>
>> On Sun, Feb 21, 2010 at 11:29 AM, Something Something <
>> mailinglists19@gmail.com> wrote:
>>
>>> I am following instructions on 'Getting Started' (
>>> http://wiki.apache.org/hadoop/Hive/GettingStarted), so I am getting from
>>> the trunk.   No error messages in Hiveserver log.
>>>
>>> This is what I see:
>>>
>>> 10/02/21 11:25:16 INFO ql.Driver: OK
>>> 10/02/21 11:25:16 INFO service.HiveServer: Running the query: drop table
>>> testHiveDriverTable
>>> 10/02/21 11:25:16 INFO ql.Driver: Starting command: drop table
>>> testHiveDriverTable
>>> 10/02/21 11:25:16 INFO parse.ParseDriver: Parsing command: drop table

Re: Starting HiveServer

Posted by Ning Zhang <nz...@facebook.com>.
I ran into a different error when running the JDBC test in standalone mode; I'm looking into that issue. Your error looks like a Thrift-level problem: the client is calling a method ('getThriftSchema') that the server's Thrift interface does not recognize, which usually means mismatched Thrift libraries. Can you double-check whether another version of libthrift.jar or libfb303.jar is on your classpath? This can happen if those two jars are also in your Hadoop lib directory.

Thanks,
Ning
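Ning's classpath check can be done mechanically. The sketch below flags jar names that show up in more than one lib directory; the paths in the usage comment are hypothetical stand-ins, so point it at your real Hadoop lib and Hive lib directories.

```python
import os
from collections import defaultdict

def duplicate_jars(lib_dirs, prefixes=("libthrift", "libfb303")):
    """Return {jar_basename: [paths]} for jars present in 2+ directories."""
    seen = defaultdict(list)
    for d in lib_dirs:
        if not os.path.isdir(d):
            continue  # skip directories that don't exist on this machine
        for name in sorted(os.listdir(d)):
            if name.endswith(".jar") and name.startswith(prefixes):
                seen[name].append(os.path.join(d, name))
    return {jar: paths for jar, paths in seen.items() if len(paths) > 1}

# Hypothetical usage -- substitute your own install locations:
#   duplicate_jars(["/usr/lib/hadoop/lib",
#                   "/home/training/hive/build/dist/lib"])
# Any hit means two copies of the same Thrift jar can land on the
# classpath, with one shadowing the other.
```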




Re: Starting HiveServer

Posted by Something Something <ma...@gmail.com>.
I used this command:  svn co
http://svn.apache.org/repos/asf/hadoop/hive/trunk hive
So, AFAIK I got it from trunk around Sat, Feb 20, 2010 at 4:00 PM PST.

I also tried
http://svn.apache.org/repos/asf/hadoop/hive/tags/release-0.5.0-rc1/
 yesterday (Sunday afternoon), but ran into the same issue.

I have HIVE_HOME set to /home/training/hive, so I am running both commands
from Hive's root (installation) directory.

I am not getting the error message that you are getting.  It could be
because I made the changes suggested by Vidyasagar in this email thread:
http://www.mail-archive.com/hive-user@hadoop.apache.org/msg02535.html

Greatly appreciate your help with this.  If I can't access Hive from a Java
program I can't really use Hive, so I am stuck at this point (unless, of
course, I fire up the IDE and start debugging the Thrift code).
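Before reaching for the debugger, it is worth ruling out the most basic failure mode: nothing listening on the server port at all. Below is a minimal reachability sketch; the host and port are assumptions (10000 was the usual HiveServer default), and a successful connect only proves a listener exists, not that the client and server Thrift versions agree.

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds in time."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical usage against a local HiveServer:
#   port_open("localhost", 10000)
```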



Re: Starting HiveServer

Posted by Carl Steinbach <ca...@cloudera.com>.
Hey,

I tried running the test on trunk and ran into this issue:
http://issues.apache.org/jira/browse/HIVE-1188

Since you appear to be getting a little farther along than this, I doubt
that you are actually running the test on trunk (though it's possible that
you are running an older copy of trunk). When was the last time you updated
your svn workspace? Also, which directory were you in when you ran "bin/hive
--service hiveserver" and "ant test -Dtestcase=TestJdbcDriver"?

Thanks.

Carl


Re: Starting HiveServer

Posted by Something Something <ma...@gmail.com>.
I am following the instructions on 'Getting Started' (
http://wiki.apache.org/hadoop/Hive/GettingStarted), so I am building from
trunk.  There are no error messages in the HiveServer log.

This is what I see:

10/02/21 11:25:16 INFO ql.Driver: OK
10/02/21 11:25:16 INFO service.HiveServer: Running the query: drop table
testHiveDriverTable
10/02/21 11:25:16 INFO ql.Driver: Starting command: drop table
testHiveDriverTable
10/02/21 11:25:16 INFO parse.ParseDriver: Parsing command: drop table
testHiveDriverTable
10/02/21 11:25:16 INFO parse.ParseDriver: Parse Completed
10/02/21 11:25:16 INFO ql.Driver: Semantic Analysis Completed
10/02/21 11:25:16 INFO metastore.HiveMetaStore: 9: drop_table : db=default
tbl=testHiveDriverTable
10/02/21 11:25:16 INFO metastore.HiveMetaStore: 9: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
10/02/21 11:25:16 INFO metastore.ObjectStore: ObjectStore, initialize called
10/02/21 11:25:16 INFO metastore.ObjectStore: Initialized ObjectStore
10/02/21 11:25:16 INFO metastore.HiveMetaStore: 9: get_table : db=default
tbl=testHiveDriverTable
10/02/21 11:25:16 INFO metastore.warehouse: deleting
hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
10/02/21 11:25:16 INFO metastore.warehouse: Deleted the diretory
hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
OK
10/02/21 11:25:16 INFO ql.Driver: OK
10/02/21 11:25:16 INFO service.HiveServer: Running the query: create table
testHiveDriverTable (key int, value string)
10/02/21 11:25:16 INFO ql.Driver: Starting command: create table
testHiveDriverTable (key int, value string)
10/02/21 11:25:17 INFO parse.ParseDriver: Parsing command: create table
testHiveDriverTable (key int, value string)
10/02/21 11:25:17 INFO parse.ParseDriver: Parse Completed
10/02/21 11:25:17 INFO parse.DDLSemanticAnalyzer: Creating
tabletestHiveDriverTable
10/02/21 11:25:17 INFO ql.Driver: Semantic Analysis Completed
10/02/21 11:25:17 INFO exec.DDLTask: Default to LazySimpleSerDe for table
testHiveDriverTable
10/02/21 11:25:17 INFO hive.log: DDL: struct testHiveDriverTable { i32 key,
string value}
10/02/21 11:25:17 INFO metastore.HiveMetaStore: 9: create_table: db=default
tbl=testHiveDriverTable
10/02/21 11:25:17 INFO metastore.HiveMetaStore: 9: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
10/02/21 11:25:17 INFO metastore.ObjectStore: ObjectStore, initialize called
10/02/21 11:25:17 INFO metastore.ObjectStore: Initialized ObjectStore
10/02/21 11:25:17 INFO metastore.HiveMetaStore: 9: get_table : db=default
tbl=testHiveDriverTable
OK
10/02/21 11:25:17 INFO ql.Driver: OK





Re: Starting HiveServer

Posted by Carl Steinbach <ca...@cloudera.com>.
Which version of Hive are you using? Also, what does the log output of the
HiveServer process look like?

Thanks.

Carl

On Sat, Feb 20, 2010 at 4:36 PM, Something Something <
mailinglists19@gmail.com> wrote:

> I started HiveServer for the first time using instructions from the
> following page:
>
> http://wiki.apache.org/hadoop/Hive/HiveServer
>
>
> 1) bin/hive --service hiveserver
> 2)   ant test -Dtestcase=TestJdbcDriver -Dstandalone=true
>
> Getting this error:
>
>  org.apache.thrift.TApplicationException: Invalid method name:
> 'getThriftSchema'
>     [junit]     at
> org.apache.thrift.TApplicationException.read(TApplicationException.java:107)
>     [junit]     at
> org.apache.hadoop.hive.service.ThriftHive$Client.recv_getThriftSchema(ThriftHive.java:247)
>     [junit]     at
> org.apache.hadoop.hive.service.ThriftHive$Client.getThriftSchema(ThriftHive.java:231)
>     [junit]     at
> org.apache.hadoop.hive.jdbc.HiveResultSet.initDynamicSerde(HiveResultSet.java:90)
>     [junit]     at
> org.apache.hadoop.hive.jdbc.HiveResultSet.<init>(HiveResultSet.java:77)
>     [junit]     at
> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:201)
>     [junit]     at
> org.apache.hadoop.hive.jdbc.TestJdbcDriver.setUp(TestJdbcDriver.java:81)
>     [junit]     at junit.framework.TestCase.runBare(TestCase.java:125)
>     [junit]     at
> junit.framework.TestResult$1.protect(TestResult.java:106)
>     [junit]     at
> junit.framework.TestResult.runProtected(TestResult.java:124)
>     [junit]     at junit.framework.TestResult.run(TestResult.java:109)
>     [junit]     at junit.framework.TestCase.run(TestCase.java:118)
>     [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:208)
>     [junit]     at junit.framework.TestSuite.run(TestSuite.java:203)
>     [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
>     [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
>     [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
>     [junit] Tests run: 9, Failures: 0, Errors: 9, Time elapsed: 3.867 sec
>     [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED
>
>
> I am looking into it, but if you know why this is happening please let me
> know.  Thanks.
>
>