Posted to user@hive.apache.org by demian rosas <de...@gmail.com> on 2013/04/03 23:22:00 UTC

Re: Problem to get hive with remote mysql metastore working

Hi all,

Just to let you know.

The problem was that the metastore service was down.

I was starting it with

sudo service hive-metastore start

and even though there was a message saying that the service had started, it
was not actually running.

I have started the service with

hive --service metastore

and now I can at least execute the "show tables" command in the hive CLI and
get the "OK".

Thanks a lot to all of you guys.

Cheers,
Demian


On 3 April 2013 12:56, Mark Grover <mg...@cloudera.com> wrote:

> Adding back cdh-user
>
> Hi Demian,
> Can you check the status of your metastore service?
>
> sudo service hive-metastore status
>
> If it is down, can you post the logs from under
> /var/log/hive/hive-metastore.log?
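>
> For example, the tail of that log from the most recent start attempt usually
> shows why it went down:
>
> sudo tail -n 100 /var/log/hive/hive-metastore.log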
>
> Mark
>
> On Wed, Apr 3, 2013 at 12:47 PM, demian rosas <de...@gmail.com> wrote:
> > Hi Mark,
> >
> > Thanks for the hint.
> >
> > I have changed my hive-site.xml file; now it is:
> >
> > <property>
> >   <name>javax.jdo.option.ConnectionURL</name>
> >
> >
> <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
> >   <description>JDBC connect string for a JDBC metastore</description>
> > </property>
> >
> > <property>
> >   <name>javax.jdo.option.ConnectionDriverName</name>
> >   <value>com.mysql.jdbc.Driver</value>
> >   <description>Driver class name for a JDBC metastore</description>
> > </property>
> >
> > <property>
> >   <name>javax.jdo.option.ConnectionUserName</name>
> >   <value>drosash</value>
> > </property>
> >
> > <property>
> >   <name>javax.jdo.option.ConnectionPassword</name>
> >   <value>drosash</value>
> > </property>
> >
> > <property>
> >   <name>datanucleus.autoCreateSchema</name>
> >   <value>false</value>
> > </property>
> >
> > <property>
> >   <name>datanucleus.fixedDatastore</name>
> >   <value>true</value>
> > </property>
> >
> > <property>
> > exit
> >   <name>hive.metastore.uris</name>
> >   <value>thrift://10.240.81.72:9083</value>
> >   <description>IP address (or fully-qualified domain name) and port of
> the
> > metastore host</description>
> > </property>
> >
> > <property>
> >   <name>hive.metastore.warehouse.dir</name>
> >   <value>/user/hive/warehouse</value>
> > </property>
> >
> > <property>
> >   <name>hive.server.thrift.port</name>
> >    <value>10000</value>
> > </property>
> >
> >
> > Then I ran "show tables" in the hive CLI with debug messages. The output
> > is this:
> >
> > hive> show tables;
> > 13/04/03 12:41:53 INFO ql.Driver: <PERFLOG method=Driver.run>
> > 13/04/03 12:41:53 INFO ql.Driver: <PERFLOG method=TimeToSubmit>
> > 13/04/03 12:41:53 INFO ql.Driver: <PERFLOG method=compile>
> > 13/04/03 12:41:53 INFO parse.ParseDriver: Parsing command: show tables
> > 13/04/03 12:41:53 INFO parse.ParseDriver: Parse Completed
> > 13/04/03 12:41:53 INFO ql.Driver: Semantic Analysis Completed
> > 13/04/03 12:41:53 INFO exec.ListSinkOperator: Initializing Self 0 OP
> > 13/04/03 12:41:53 INFO exec.ListSinkOperator: Operator 0 OP initialized
> > 13/04/03 12:41:53 INFO exec.ListSinkOperator: Initialization Done 0 OP
> > 13/04/03 12:41:53 INFO ql.Driver: Returning Hive schema:
> > Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from
> > deserializer)], properties:null)
> > 13/04/03 12:41:53 INFO ql.Driver: </PERFLOG method=compile
> > start=1365018113270 end=1365018113666 duration=396>
> > 13/04/03 12:41:53 INFO ql.Driver: <PERFLOG method=Driver.execute>
> > 13/04/03 12:41:53 INFO ql.Driver: Starting command: show tables
> > 13/04/03 12:41:53 INFO ql.Driver: </PERFLOG method=TimeToSubmit
> > start=1365018113270 end=1365018113685 duration=415>
> > 13/04/03 12:41:53 INFO hive.metastore: Trying to connect to metastore
> with
> > URI thrift://10.240.81.72:9083
> > 13/04/03 12:41:53 WARN hive.metastore: Failed to connect to the MetaStore
> > Server...
> > 13/04/03 12:41:53 INFO hive.metastore: Waiting 1 seconds before next
> > connection attempt.
> > 13/04/03 12:41:54 INFO hive.metastore: Trying to connect to metastore
> with
> > URI thrift://10.240.81.72:9083
> > 13/04/03 12:41:54 WARN hive.metastore: Failed to connect to the MetaStore
> > Server...
> > 13/04/03 12:41:54 INFO hive.metastore: Waiting 1 seconds before next
> > connection attempt.
> > 13/04/03 12:41:55 INFO hive.metastore: Trying to connect to metastore
> with
> > URI thrift://10.240.81.72:9083
> > 13/04/03 12:41:55 WARN hive.metastore: Failed to connect to the MetaStore
> > Server...
> > 13/04/03 12:41:55 INFO hive.metastore: Waiting 1 seconds before next
> > connection attempt.
> > FAILED: Error in metadata: java.lang.RuntimeException: Unable to
> instantiate
> > org.apache.hadoop.hive.metastore.HiveMetaStoreClient
> > 13/04/03 12:41:56 ERROR exec.Task: FAILED: Error in metadata:
> > java.lang.RuntimeException: Unable to instantiate
> > org.apache.hadoop.hive.metastore.HiveMetaStoreClient
> > org.apache.hadoop.hive.ql.metadata.HiveException:
> > java.lang.RuntimeException: Unable to instantiate
> > org.apache.hadoop.hive.metastore.HiveMetaStoreClient
> >         at
> > org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1118)
> >         at
> > org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
> >         at
> > org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
> >         at
> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
> >         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
> >         at
> >
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
> >         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
> >         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
> >         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
> >         at
> > org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
> >         at
> > org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
> >         at
> > org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
> >         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
> >         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >         at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> > Caused by: java.lang.RuntimeException: Unable to instantiate
> > org.apache.hadoop.hive.metastore.HiveMetaStoreClient
> >         at
> >
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1084)
> >         at
> >
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
> >         at
> >
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
> >         at
> >
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2140)
> >         at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2151)
> >         at
> > org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
> >         ... 18 more
> > Caused by: java.lang.reflect.InvocationTargetException
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> > Method)
> >         at
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> >         at
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> >         at
> java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> >         at
> >
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1082)
> >         ... 23 more
> > Caused by: MetaException(message:Could not connect to meta store using
> any
> > of the URIs provided. Most recent failure:
> > org.apache.thrift.transport.TTransportException:
> java.net.ConnectException:
> > Connection refused
> >         at org.apache.thrift.transport.TSocket.open(TSocket.java:185)
> >         at
> >
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:277)
> >         at
> >
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:163)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> > Method)
> >         at
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> >         at
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> >         at
> java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> >         at
> >
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1082)
> >         at
> >
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
> >         at
> >
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
> >         at
> >
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2140)
> >         at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2151)
> >         at
> > org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
> >         at
> > org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
> >         at
> > org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
> >         at
> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
> >         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
> >         at
> >
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
> >         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
> >         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
> >         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
> >         at
> > org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
> >         at
> > org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
> >         at
> > org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
> >         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
> >         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >         at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> > Caused by: java.net.ConnectException: Connection refused
> >         at java.net.PlainSocketImpl.socketConnect(Native Method)
> >         at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
> >         at
> > java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
> >         at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
> >         at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
> >         at java.net.Socket.connect(Socket.java:529)
> >         at org.apache.thrift.transport.TSocket.open(TSocket.java:180)
> >         ... 30 more
> > )
> >         at
> >
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:323)
> >         at
> >
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:163)
> >         ... 28 more
> >
> > FAILED: Execution Error, return code 1 from
> > org.apache.hadoop.hive.ql.exec.DDLTask
> > 13/04/03 12:41:56 ERROR ql.Driver: FAILED: Execution Error, return code 1
> > from org.apache.hadoop.hive.ql.exec.DDLTask
> > 13/04/03 12:41:56 INFO ql.Driver: </PERFLOG method=Driver.execute
> > start=1365018113666 end=1365018116762 duration=3096>
> > 13/04/03 12:41:56 INFO ql.Driver: <PERFLOG method=releaseLocks>
> > 13/04/03 12:41:56 INFO ql.Driver: </PERFLOG method=releaseLocks
> > start=1365018116762 end=1365018116762 duration=0>
> > 13/04/03 12:41:56 INFO exec.ListSinkOperator: 0 finished. closing...
> > 13/04/03 12:41:56 INFO exec.ListSinkOperator: 0 forwarded 0 rows
> > 13/04/03 12:41:56 INFO ql.Driver: <PERFLOG method=releaseLocks>
> > 13/04/03 12:41:56 INFO ql.Driver: </PERFLOG method=releaseLocks
> > start=1365018116765 end=1365018116765 duration=0>
> > hive>
> >
> >
> > Now it is failing to connect to the metastore server:
> >
> > 13/04/03 12:41:53 INFO hive.metastore: Trying to connect to metastore
> with
> > URI thrift://10.240.81.72:9083
> > 13/04/03 12:41:53 WARN hive.metastore: Failed to connect to the MetaStore
> > Server...
> >
> > Any idea about this?
> >
> > Thanks a lot in advance,
> > Demian
> >
> >
> >
> > On 3 April 2013 11:07, Mark Grover <mg...@cloudera.com> wrote:
> >>
> >> Hi Demian,
> >> The port you are using for the hive.metastore.uris property seems a
> >> little off. Can you use the default 9083 port please?
> >>
> >> I understand that MySQL is running on 3306. However, that port number
> >> refers to the port that the Hive metastore process listens on. Right now,
> >> the process is trying to bind to 3306 but it can't because MySQL is
> >> bound to it.
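> >>
> >> As a quick sanity check - assuming the default ports and that netstat is
> >> available - you should see mysqld listening on 3306 and, once the
> >> metastore service is running, a separate listener on 9083:
> >>
> >> sudo netstat -tlnp | grep -E ':3306|:9083'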
> >>
> >> Can you change the port and try again please? Let us know how it goes.
> >>
> >> Mark
> >>
> >> On Tue, Apr 2, 2013 at 8:02 PM, demian rosas <de...@gmail.com>
> wrote:
> >> > Hi Harsh,
> >> >
> >> > Thanks for the info.
> >> >
> >> > What I want to set up is a remote metastore. As you said, I am using
> >> > the <name>hive.metastore.uris</name> property to indicate my metastore
> >> > host (in this case the IP address of my machine and port 3306).
> >> >
> >> > I have not tested the local metastore configuration yet.
> >> >
> >> > Thanks a lot,
> >> > Demian
> >> >
> >> >
> >> > On 2 April 2013 19:48, Harsh J <ha...@cloudera.com> wrote:
> >> >>
> >> >> Demian,
> >> >>
> >> >> What exactly are you looking for? Note that the wording around
> >> >> metastores can be a bit confusing. The words local/remote don't refer
> >> >> to the location of the metastore DB/service.
> >> >>
> >> >> Rather, a "local" metastore means that a Hive CLI will connect
> >> >> directly to a configured DB. A "remote" metastore means that a Hive
> >> >> CLI connects to a HiveMetaStore server that runs a thrift service
> >> >> (hive --service metastore).
> >> >>
> >> >> If the following is present, then Hive is being told to use remote
> >> >> metastores:
> >> >>   <name>hive.metastore.uris</name>
> >> >>
> >> >> If the above is removed/unset, then Hive looks for JDBC connection
> >> >> properties, in hopes of doing a local metastore connection.
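> >> >>
> >> >> For a remote setup - assuming the default port - the thrift server has
> >> >> to be running before any CLI can connect, e.g.:
> >> >>
> >> >> hive --service metastore    # serves thrift on port 9083 by default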
> >> >>
> >> >> Hope this helps.
> >> >>
> >> >> On Wed, Apr 3, 2013 at 8:00 AM, demian rosas <de...@gmail.com>
> >> >> wrote:
> >> >> > nothing happened.
> >> >> >
> >> >> > I added the files to my classpath, but the output is the same.
> >> >> >
> >> >> > Do you have any other idea I can try?
> >> >> >
> >> >> > Thanks !!!
> >> >> >
> >> >> >
> >> >> > On 2 April 2013 19:10, demian rosas <de...@gmail.com> wrote:
> >> >> >>
> >> >> >> Hi Stephen,
> >> >> >>
> >> >> >> Thanks a lot for your hints,
> >> >> >>
> >> >> >> Those entries in my.cnf are not present, so I guess I am fine
> there.
> >> >> >>
> >> >> >> So I suppose I should add all the jar files in $HADOOP_HOME to my
> >> >> >> CLASSPATH right?
> >> >> >>
> >> >> >> I am on it.
> >> >> >>
> >> >> >> Thanks
> >> >> >>
> >> >> >>
> >> >> >> On 2 April 2013 18:44, Stephen Boesch <ja...@gmail.com> wrote:
> >> >> >>>
> >> >> >>> check out your my.cnf to ensure that the bind-address entry is
> >> >> >>> commented
> >> >> >>> out - that will ensure your computer is listening on 0.0.0.0 for
> >> >> >>> all
> >> >> >>> addresses
> >> >> >>>
> >> >> >>> # Instead of skip-networking the default is now to listen only on
> >> >> >>> # localhost which is more compatible and is not less secure.
> >> >> >>> #bind-address           = 127.0.0.1
> >> >> >>> #bind-address           = 192.168.1.64
> >> >> >>>
> >> >> >>> You may also try binding MySQL to the actual IP address of your
> >> >> >>> computer as opposed to localhost.
> >> >> >>>
> >> >> >>> Your second issue is easier: the java app lacks the hadoop-*.jar
> on
> >> >> >>> the
> >> >> >>> classpath: simply add $HADOOP_HOME/* to that classpath.
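> >> >> >>>
> >> >> >>> For example - exact paths depend on your install, so treat this as
> >> >> >>> a sketch:
> >> >> >>>
> >> >> >>> grep -n 'bind-address' /etc/mysql/my.cnf
> >> >> >>> export CLASSPATH="$CLASSPATH:$HADOOP_HOME/*:$HADOOP_HOME/lib/*"
> >> >> >>> java -cp "$CLASSPATH:." Conn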
> >> >> >>>
> >> >> >>>
> >> >> >>>
> >> >> >>> 2013/4/2 Dtraveler <de...@gmail.com>
> >> >> >>>>
> >> >> >>>> Hi,
> >> >> >>>>
> >> >> >>>> I am trying to set up a hive installation using a remote mysql
> >> >> >>>> metastore. I am using CDH4.2 on a fresh installation. All of this
> >> >> >>>> is on a single machine, so I am using hadoop pseudo-distributed
> >> >> >>>> mode.
> >> >> >>>>
> >> >> >>>> So far I have Hadoop working fine. MySql is working and I can
> >> >> >>>> connect
> >> >> >>>> to
> >> >> >>>> it using jdbc from a java application. When I installed hive for
> >> >> >>>> the
> >> >> >>>> first
> >> >> >>>> time, it was using the embedded mode and I was able to define an
> >> >> >>>> external
> >> >> >>>> table pointing to an hdfs location and query the data with a
> hive
> >> >> >>>> query.
> >> >> >>>> Up to that point everything was fine.
> >> >> >>>>
> >> >> >>>> The problems started when I tried to set up the remote mysql
> >> >> >>>> metastore. I have followed the instructions provided here:
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> https://ccp.cloudera.com/display/CDH4DOC/Hive+Installation#HiveInstallation-ConfiguringtheHiveMetastore
> >> >> >>>>
> >> >> >>>> Right now I am using the hive-server 1 configuration. These are
> >> >> >>>> the properties in my hive-site.xml file:
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>javax.jdo.option.ConnectionURL</name>
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
> >> >> >>>>   <description>JDBC connect string for a JDBC
> >> >> >>>> metastore</description>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>javax.jdo.option.ConnectionDriverName</name>
> >> >> >>>>   <value>com.mysql.jdbc.Driver</value>
> >> >> >>>>   <description>Driver class name for a JDBC
> >> >> >>>> metastore</description>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>javax.jdo.option.ConnectionUserName</name>
> >> >> >>>>   <value>myuser</value>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>javax.jdo.option.ConnectionPassword</name>
> >> >> >>>>   <value>mypassword</value>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>datanucleus.autoCreateSchema</name>
> >> >> >>>>   <value>false</value>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>datanucleus.fixedDatastore</name>
> >> >> >>>>   <value>true</value>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>hive.metastore.uris</name>
> >> >> >>>>   <value>thrift://my_ipaddress:3306</value>
> >> >> >>>>   <description>IP address (or fully-qualified domain name) and
> >> >> >>>> port
> >> >> >>>> of
> >> >> >>>> the metastore host</description>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>hive.metastore.warehouse.dir</name>
> >> >> >>>>   <value>/user/hive/warehouse</value>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> <property>
> >> >> >>>>   <name>hive.server.thrift.port</name>
> >> >> >>>>    <value>10000</value>
> >> >> >>>> </property>
> >> >> >>>>
> >> >> >>>> I have started the hive-metastore service and the hive-server
> >> >> >>>> service.
> >> >> >>>> When I test using the hive console with debug messages, I get
> >> >> >>>> this:
> >> >> >>>>
> >> >> >>>> hive -hiveconf hive.root.logger=INFO,console
> >> >> >>>> Logging initialized using configuration in
> >> >> >>>> file:/etc/hive/conf.dist/hive-log4j.properties
> >> >> >>>> 13/04/02 17:10:22 INFO SessionState: Logging initialized using
> >> >> >>>> configuration in file:/etc/hive/conf.dist/hive-log4j.properties
> >> >> >>>> Hive history
> >> >> >>>>
> file=/tmp/drosash/hive_job_log_drosash_201304021710_1321648790.txt
> >> >> >>>> 13/04/02 17:10:22 INFO exec.HiveHistory: Hive history
> >> >> >>>>
> file=/tmp/drosash/hive_job_log_drosash_201304021710_1321648790.txt
> >> >> >>>> hive> show tables;
> >> >> >>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=Driver.run>
> >> >> >>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=TimeToSubmit>
> >> >> >>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=compile>
> >> >> >>>> 13/04/02 17:10:25 INFO parse.ParseDriver: Parsing command: show
> >> >> >>>> tables
> >> >> >>>> 13/04/02 17:10:25 INFO parse.ParseDriver: Parse Completed
> >> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: Semantic Analysis Completed
> >> >> >>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Initializing Self
> 0
> >> >> >>>> OP
> >> >> >>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Operator 0 OP
> >> >> >>>> initialized
> >> >> >>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Initialization
> Done
> >> >> >>>> 0
> >> >> >>>> OP
> >> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: Returning Hive schema:
> >> >> >>>> Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string,
> >> >> >>>> comment:from
> >> >> >>>> deserializer)], properties:null)
> >> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: </PERFLOG method=compile
> >> >> >>>> start=1364947825805 end=1364947826183 duration=378>
> >> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: <PERFLOG
> method=Driver.execute>
> >> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: Starting command: show tables
> >> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: </PERFLOG method=TimeToSubmit
> >> >> >>>> start=1364947825805 end=1364947826197 duration=392>
> >> >> >>>> 13/04/02 17:10:26 INFO hive.metastore: Trying to connect to
> >> >> >>>> metastore
> >> >> >>>> with URI thrift://10.240.81.72:3306
> >> >> >>>> 13/04/02 17:10:26 INFO hive.metastore: Waiting 1 seconds before
> >> >> >>>> next
> >> >> >>>> connection attempt.
> >> >> >>>> 13/04/02 17:10:27 INFO hive.metastore: Connected to metastore.
> >> >> >>>> 13/04/02 17:10:28 WARN metastore.RetryingMetaStoreClient:
> >> >> >>>> MetaStoreClient lost connection. Attempting to reconnect.
> >> >> >>>> org.apache.thrift.transport.TTransportException
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >> >> >>>>         at
> >> >> >>>>
> org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:354)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:412)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:399)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:736)
> >> >> >>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> >> >> >>>> Method)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >> >> >>>>         at java.lang.reflect.Method.invoke(Method.java:597)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
> >> >> >>>>         at $Proxy9.getDatabase(Unknown Source)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
> >> >> >>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
> >> >> >>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> >> >> >>>> Method)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >> >> >>>>         at java.lang.reflect.Method.invoke(Method.java:597)
> >> >> >>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> >> >> >>>> 13/04/02 17:10:29 INFO hive.metastore: Trying to connect to
> >> >> >>>> metastore
> >> >> >>>> with URI thrift://10.240.81.72:3306
> >> >> >>>> 13/04/02 17:10:29 INFO hive.metastore: Waiting 1 seconds before
> >> >> >>>> next
> >> >> >>>> connection attempt.
> >> >> >>>> 13/04/02 17:10:30 INFO hive.metastore: Connected to metastore.
> >> >> >>>> FAILED: Error in metadata:
> >> >> >>>> org.apache.thrift.transport.TTransportException
> >> >> >>>> 13/04/02 17:10:31 ERROR exec.Task: FAILED: Error in metadata:
> >> >> >>>> org.apache.thrift.transport.TTransportException
> >> >> >>>> org.apache.hadoop.hive.ql.metadata.HiveException:
> >> >> >>>> org.apache.thrift.transport.TTransportException
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1118)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
> >> >> >>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
> >> >> >>>>         at
> >> >> >>>> org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
> >> >> >>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> >> >> >>>> Method)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >> >> >>>>         at java.lang.reflect.Method.invoke(Method.java:597)
> >> >> >>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> >> >> >>>> Caused by: org.apache.thrift.transport.TTransportException
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >> >> >>>>         at
> >> >> >>>>
> org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:354)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:412)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:399)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:736)
> >> >> >>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> >> >> >>>> Method)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >> >> >>>>         at java.lang.reflect.Method.invoke(Method.java:597)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
> >> >> >>>>         at $Proxy9.getDatabase(Unknown Source)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
> >> >> >>>>         ... 18 more
> >> >> >>>>
> >> >> >>>> FAILED: Execution Error, return code 1 from
> >> >> >>>> org.apache.hadoop.hive.ql.exec.DDLTask
> >> >> >>>> 13/04/02 17:10:31 ERROR ql.Driver: FAILED: Execution Error,
> return
> >> >> >>>> code
> >> >> >>>> 1 from org.apache.hadoop.hive.ql.exec.DDLTask
> >> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG
> method=Driver.execute
> >> >> >>>> start=1364947826183 end=1364947831388 duration=5205>
> >> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: <PERFLOG method=releaseLocks>
> >> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG method=releaseLocks
> >> >> >>>> start=1364947831388 end=1364947831388 duration=0>
> >> >> >>>> 13/04/02 17:10:31 INFO exec.ListSinkOperator: 0 finished.
> >> >> >>>> closing...
> >> >> >>>> 13/04/02 17:10:31 INFO exec.ListSinkOperator: 0 forwarded 0 rows
> >> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: <PERFLOG method=releaseLocks>
> >> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG method=releaseLocks
> >> >> >>>> start=1364947831391 end=1364947831391 duration=0>
> >> >> >>>> hive>
> >> >> >>>>
> >> >> >>>>
> >> >> >>>> When I try to connect to the metastore using a java app I get
> >> >> >>>> this:
> >> >> >>>>
> >> >> >>>> log4j:ERROR Could not instantiate class
> >> >> >>>> [org.apache.hadoop.log.metrics.EventCounter].
> >> >> >>>> java.lang.ClassNotFoundException:
> >> >> >>>> org.apache.hadoop.log.metrics.EventCounter
> >> >> >>>>         at
> java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >> >> >>>>         at java.security.AccessController.doPrivileged(Native
> >> >> >>>> Method)
> >> >> >>>>         at
> >> >> >>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >> >> >>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >> >> >>>>         at
> >> >> >>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >> >> >>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >> >> >>>>         at java.lang.Class.forName0(Native Method)
> >> >> >>>>         at java.lang.Class.forName(Class.java:169)
> >> >> >>>>         at
> >> >> >>>> org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:785)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
> >> >> >>>>         at
> >> >> >>>> org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:73)
> >> >> >>>>         at
> >> >> >>>> org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:242)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.thrift.transport.TIOStreamTransport.<clinit>(TIOStreamTransport.java:38)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:110)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
> >> >> >>>>         at
> >> >> >>>> java.sql.DriverManager.getConnection(DriverManager.java:582)
> >> >> >>>>         at
> >> >> >>>> java.sql.DriverManager.getConnection(DriverManager.java:185)
> >> >> >>>>         at Conn.main(Conn.java:19)
> >> >> >>>> log4j:ERROR Could not instantiate appender named "EventCounter".
> >> >> >>>> Exception in thread "main" java.lang.NoClassDefFoundError:
> >> >> >>>> org/apache/hadoop/io/Writable
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:193)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:127)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:126)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:121)
> >> >> >>>>         at
> >> >> >>>>
> >> >> >>>>
> org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
> >> >> >>>>         at
> >> >> >>>> java.sql.DriverManager.getConnection(DriverManager.java:582)
> >> >> >>>>         at
> >> >> >>>> java.sql.DriverManager.getConnection(DriverManager.java:185)
> >> >> >>>>         at Conn.main(Conn.java:19)
> >> >> >>>> Caused by: java.lang.ClassNotFoundException:
> >> >> >>>> org.apache.hadoop.io.Writable
> >> >> >>>>         at
> java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >> >> >>>>         at java.security.AccessController.doPrivileged(Native
> >> >> >>>> Method)
> >> >> >>>>         at
> >> >> >>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >> >> >>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >> >> >>>>         at
> >> >> >>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >> >> >>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >> >> >>>>         ... 8 more
> >> >> >>>>
> >> >> >>>>
> >> >> >>>> The code of my java app is this:
> >> >> >>>>
> >> >> >>>> import java.sql.*;
> >> >> >>>>
> >> >> >>>>
> >> >> >>>> class Conn {
> >> >> >>>>   public static void main (String[] args) throws Exception
> >> >> >>>>   {
> >> >> >>>>    Class.forName ("org.apache.hadoop.hive.jdbc.HiveDriver");
> >> >> >>>>
> >> >> >>>>
> >> >> >>>> Connection conn = DriverManager.getConnection
> >> >> >>>>     ("jdbc:hive://localhost:10000/default", "", "");
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>    try {
> >> >> >>>>      Statement stmt = conn.createStatement();
> >> >> >>>>      try {
> >> >> >>>>        //ResultSet rset = stmt.executeQuery("select BANNER from SYS.V_$VERSION");
> >> >> >>>>        ResultSet rset = stmt.executeQuery("show tables");
> >> >> >>>>        try {
> >> >> >>>>          while (rset.next())
> >> >> >>>>            System.out.println (rset.getString(1));   // Print col 1
> >> >> >>>>        }
> >> >> >>>>        finally {
> >> >> >>>>           try { rset.close(); } catch (Exception ignore) {}
> >> >> >>>>        }
> >> >> >>>>      }
> >> >> >>>>      finally {
> >> >> >>>>        try { stmt.close(); } catch (Exception ignore) {}
> >> >> >>>>      }
> >> >> >>>>    }
> >> >> >>>>    finally {
> >> >> >>>>      try { conn.close(); } catch (Exception ignore) {}
> >> >> >>>>    }
> >> >> >>>>   }
> >> >> >>>> }
> >> >> >>>>
> >> >> >>>> Do you have any idea about what I am missing?
> >> >> >>>>
> >> >> >>>> Thank you very much in advance
> >> >> >>>>
> >> >> >>>> --
> >> >> >>>>
> >> >> >>>>
> >> >> >>>>
> >> >> >>>
> >> >> >>>
> >> >> >>> --
> >> >> >>>
> >> >> >>>
> >> >> >>>
> >> >> >>
> >> >> >>
> >> >> >
> >> >> > --
> >> >> >
> >> >> >
> >> >> >
> >> >>
> >> >>
> >> >>
> >> >> --
> >> >> Harsh J
> >> >>
> >> >> --
> >> >>
> >> >>
> >> >>
> >> >
> >> > --
> >> >
> >> >
> >> >
> >
> >
>

Re: Problem to get hive with remote mysql metastore working

Posted by Sanjay Subramanian <Sa...@wizecommerce.com>.
U owe all of us a Jamba Juice for not checking if the service was up  - LOL
Happy Hiving
sanjay
From: demian rosas <de...@gmail.com>
Reply-To: "user@hive.apache.org" <us...@hive.apache.org>
Date: Wednesday, April 3, 2013 2:22 PM
To: "cdh-user@cloudera.org" <cd...@cloudera.org>, "user@hive.apache.org" <us...@hive.apache.org>
Subject: Re: Problem to get hive with remote mysql metastore working

Hi all,

Just to let you know.

The problem was that the metastore service was down.

I was starting it with

sudo service hive-metastore start

and even when there was a message saying that the service was started, it was not.

I have started the service with

hive --service metastore

and now, at least can execute the "show tables" command in hive CLI and get the "OK".

Thanks a lot to all of you guys.

Cheers,
Demian


On 3 April 2013 12:56, Mark Grover <mg...@cloudera.com>> wrote:
Adding back cdh-user

Hi Demian,
Can you check the status of your metastore service?

sudo service hive-metastore status

If it is down, can you post the logs from under
/var/log/hive/hive-metastore.log?

Mark

On Wed, Apr 3, 2013 at 12:47 PM, demian rosas <de...@gmail.com>> wrote:
> Hi Mark,
>
> Thanks for the hint.
>
> Have changed my hive-site.xml file, now it is:
>
> <property>
>   <name>javax.jdo.option.ConnectionURL</name>
>
> <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
>   <description>JDBC connect string for a JDBC metastore</description>
> </property>
>
> <property>
>   <name>javax.jdo.option.ConnectionDriverName</name>
>   <value>com.mysql.jdbc.Driver</value>
>   <description>Driver class name for a JDBC metastore</description>
> </property>
>
> <property>
>   <name>javax.jdo.option.ConnectionUserName</name>
>   <value>drosash</value>
> </property>
>
> <property>
>   <name>javax.jdo.option.ConnectionPassword</name>
>   <value>drosash</value>
> </property>
>
> <property>
>   <name>datanucleus.autoCreateSchema</name>
>   <value>false</value>
> </property>
>
> <property>
>   <name>datanucleus.fixedDatastore</name>
>   <value>true</value>
> </property>
>
> <property>
> exit
>   <name>hive.metastore.uris</name>
>   <value>thrift://10.240.81.72:9083<http://10.240.81.72:9083></value>
>   <description>IP address (or fully-qualified domain name) and port of the
> metastore host</description>
> </property>
>
> <property>
>   <name>hive.metastore.warehouse.dir</name>
>   <value>/user/hive/warehouse</value>
> </property>
>
> <property>
>   <name>hive.server.thrift.port</name>
>    <value>10000</value>
> </property>
>
>
> Then I ran "show tables" in hive CLI with debug messages. Th output is this:
>
> hive> show tables;
> 13/04/03 12:41:53 INFO ql.Driver: <PERFLOG method=Driver.run>
> 13/04/03 12:41:53 INFO ql.Driver: <PERFLOG method=TimeToSubmit>
> 13/04/03 12:41:53 INFO ql.Driver: <PERFLOG method=compile>
> 13/04/03 12:41:53 INFO parse.ParseDriver: Parsing command: show tables
> 13/04/03 12:41:53 INFO parse.ParseDriver: Parse Completed
> 13/04/03 12:41:53 INFO ql.Driver: Semantic Analysis Completed
> 13/04/03 12:41:53 INFO exec.ListSinkOperator: Initializing Self 0 OP
> 13/04/03 12:41:53 INFO exec.ListSinkOperator: Operator 0 OP initialized
> 13/04/03 12:41:53 INFO exec.ListSinkOperator: Initialization Done 0 OP
> 13/04/03 12:41:53 INFO ql.Driver: Returning Hive schema:
> Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from
> deserializer)], properties:null)
> 13/04/03 12:41:53 INFO ql.Driver: </PERFLOG method=compile
> start=1365018113270 end=1365018113666 duration=396>
> 13/04/03 12:41:53 INFO ql.Driver: <PERFLOG method=Driver.execute>
> 13/04/03 12:41:53 INFO ql.Driver: Starting command: show tables
> 13/04/03 12:41:53 INFO ql.Driver: </PERFLOG method=TimeToSubmit
> start=1365018113270 end=1365018113685 duration=415>
> 13/04/03 12:41:53 INFO hive.metastore: Trying to connect to metastore with
> URI thrift://10.240.81.72:9083<http://10.240.81.72:9083>
> 13/04/03 12:41:53 WARN hive.metastore: Failed to connect to the MetaStore
> Server...
> 13/04/03 12:41:53 INFO hive.metastore: Waiting 1 seconds before next
> connection attempt.
> 13/04/03 12:41:54 INFO hive.metastore: Trying to connect to metastore with
> URI thrift://10.240.81.72:9083<http://10.240.81.72:9083>
> 13/04/03 12:41:54 WARN hive.metastore: Failed to connect to the MetaStore
> Server...
> 13/04/03 12:41:54 INFO hive.metastore: Waiting 1 seconds before next
> connection attempt.
> 13/04/03 12:41:55 INFO hive.metastore: Trying to connect to metastore with
> URI thrift://10.240.81.72:9083<http://10.240.81.72:9083>
> 13/04/03 12:41:55 WARN hive.metastore: Failed to connect to the MetaStore
> Server...
> 13/04/03 12:41:55 INFO hive.metastore: Waiting 1 seconds before next
> connection attempt.
> FAILED: Error in metadata: java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
> 13/04/03 12:41:56 ERROR exec.Task: FAILED: Error in metadata:
> java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
> org.apache.hadoop.hive.ql.metadata.HiveException:
> java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>         at
> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1118)
>         at
> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
>         at
> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
>         at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>         at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>         at
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> Caused by: java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>         at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1084)
>         at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
>         at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
>         at
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2140)
>         at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2151)
>         at
> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
>         ... 18 more
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>         at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1082)
>         ... 23 more
> Caused by: MetaException(message:Could not connect to meta store using any
> of the URIs provided. Most recent failure:
> org.apache.thrift.transport.TTransportException: java.net.ConnectException:
> Connection refused
>         at org.apache.thrift.transport.TSocket.open(TSocket.java:185)
>         at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:277)
>         at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:163)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>         at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1082)
>         at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
>         at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
>         at
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2140)
>         at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2151)
>         at
> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
>         at
> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
>         at
> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
>         at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>         at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>         at
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> Caused by: java.net.ConnectException: Connection refused
>         at java.net.PlainSocketImpl.socketConnect(Native Method)
>         at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
>         at
> java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
>         at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
>         at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
>         at java.net.Socket.connect(Socket.java:529)
>         at org.apache.thrift.transport.TSocket.open(TSocket.java:180)
>         ... 30 more
> )
>         at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:323)
>         at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:163)
>         ... 28 more
>
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
> 13/04/03 12:41:56 ERROR ql.Driver: FAILED: Execution Error, return code 1
> from org.apache.hadoop.hive.ql.exec.DDLTask
> 13/04/03 12:41:56 INFO ql.Driver: </PERFLOG method=Driver.execute
> start=1365018113666 end=1365018116762 duration=3096>
> 13/04/03 12:41:56 INFO ql.Driver: <PERFLOG method=releaseLocks>
> 13/04/03 12:41:56 INFO ql.Driver: </PERFLOG method=releaseLocks
> start=1365018116762 end=1365018116762 duration=0>
> 13/04/03 12:41:56 INFO exec.ListSinkOperator: 0 finished. closing...
> 13/04/03 12:41:56 INFO exec.ListSinkOperator: 0 forwarded 0 rows
> 13/04/03 12:41:56 INFO ql.Driver: <PERFLOG method=releaseLocks>
> 13/04/03 12:41:56 INFO ql.Driver: </PERFLOG method=releaseLocks
> start=1365018116765 end=1365018116765 duration=0>
> hive>
>
>
> Now it is not getting connected with the metastore server
>
> 13/04/03 12:41:53 INFO hive.metastore: Trying to connect to metastore with
> URI thrift://10.240.81.72:9083<http://10.240.81.72:9083>
> 13/04/03 12:41:53 WARN hive.metastore: Failed to connect to the MetaStore
> Server...
>
> Any idea about this?
>
> Thanks a lot in advanced,
> Demian
>
>
>
> On 3 April 2013 11:07, Mark Grover <mg...@cloudera.com>> wrote:
>>
>> Hi Demian,
>> The port you are using for the hive.metastore.uris property seems a
>> little off. Can you use the default 9083 port please?
>>
>> I understand that MySQL is running on 3306. However, that port number
>> refers to the port listened on by Hive metastore process. Right now,
>> the process is trying to bind to 3306 but it can't because MySQL is
>> bound to it.
>>
>> Can you change the port and try again please? Let us know how it goes.
>>
>> Mark
>>
>> On Tue, Apr 2, 2013 at 8:02 PM, demian rosas <de...@gmail.com>> wrote:
>> > Hi Harsh,
>> >
>> > Thanks for the info.
>> >
>> > What I want to set is a remote metastore. As you said, I am using
>> > <name>hive.metastore.uris</name> property for indicating the information
>> > of
>> > my metastore host (in this case the ip address of my machine and 3306
>> > port).
>> >
>> > I have not tested the local metastore configuration yet.
>> >
>> > Thanks a lot,
>> > Demian
>> >
>> >
>> > On 2 April 2013 19:48, Harsh J <ha...@cloudera.com>> wrote:
>> >>
>> >> Demian,
>> >>
>> >> What are you exactly looking for? Note that the wording of metastores
>> >> can be confusing a bit. The local/remote words don't mean location of
>> >> the metastore DB/service.
>> >>
>> >> Rather, a "local" metastore means that a Hive CLI will connect
>> >> directly to a configured DB. A "remote" metastore means that a Hive
>> >> CLI is to connect to a HiveMetaStore Server that runs a thrift service
>> >> (hive --service metastore).
>> >>
>> >> If the following is present, then Hive is being told to use remote
>> >> metastores:
>> >>   <name>hive.metastore.uris</name>
>> >>
>> >> If the above is removed/unset, then Hive looks for JDBC connection
>> >> properties, in hopes of doing a local metastore connection.
>> >>
>> >> Hope this helps.
>> >>
>> >> On Wed, Apr 3, 2013 at 8:00 AM, demian rosas <de...@gmail.com>>
>> >> wrote:
>> >> > nothing happened.
>> >> >
>> >> > I added the files to my classpath, but the output is the same.
>> >> >
>> >> > Do you have any other idea I can try?
>> >> >
>> >> > Thanks !!!
>> >> >
>> >> >
>> >> > On 2 April 2013 19:10, demian rosas <de...@gmail.com>> wrote:
>> >> >>
>> >> >> Hi Stephen,
>> >> >>
>> >> >> Thanks a lot for your hints,
>> >> >>
>> >> >> Those entries in my.cnf are not present, so I guess I am fine there.
>> >> >>
>> >> >> So I suppose I should add all the jar files in $HADOOP_HOME to my
>> >> >> CLASSPATH right?
>> >> >>
>> >> >> I am on it.
>> >> >>
>> >> >> Thanks
>> >> >>
>> >> >>
>> >> >> On 2 April 2013 18:44, Stephen Boesch <ja...@gmail.com>> wrote:
>> >> >>>
>> >> >>> check out your my.cnf to ensure that the bind-address entry is
>> >> >>> commented
>> >> >>> out - that will ensure your computer is listening on 0.0.0.0 for
>> >> >>> all
>> >> >>> addresses
>> >> >>>
>> >> >>> # Instead of skip-networking the default is now to listen only on
>> >> >>> # localhost which is more compatible and is not less secure.
>> >> >>> #bind-address           = 127.0.0.1
>> >> >>> #bind-address           = 192.168.1.64
>> >> >>>
>> >> >>> you may also try having bind to the actual ip address of your
>> >> >>> computer
>> >> >>> as
>> >> >>> opposed to localhost
>> >> >>>
>> >> >>> Your second issue is easier: the java app lacks the hadoop-*.jar on
>> >> >>> the
>> >> >>> classpath: simply add $HADOOP_HOME/* to that classpath.
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>> 2013/4/2 Dtraveler <de...@gmail.com>>
>> >> >>>>
>> >> >>>> Hi,
>> >> >>>>
>> >> >>>> I am trying to set a hive installation using a remote mysql
>> >> >>>> metastore. I
>> >> >>>> am using CDH4.2 on a fresh installation. All this in a single
>> >> >>>> machine, so I
>> >> >>>> am using hadoop pseudo distributed mode.
>> >> >>>>
>> >> >>>> So far Hadoop is working fine. MySQL is working and I can connect
>> >> >>>> to it using JDBC from a Java application. When I installed Hive for
>> >> >>>> the first time it used the embedded metastore, and I was able to
>> >> >>>> define an external table pointing to an HDFS location and query the
>> >> >>>> data with a Hive query. Up to this point everything was fine.
>> >> >>>>
>> >> >>>> The problems started when I tried to set up the remote MySQL
>> >> >>>> metastore. I followed the instructions provided here:
>> >> >>>>
>> >> >>>>
>> >> >>>> https://ccp.cloudera.com/display/CDH4DOC/Hive+Installation#HiveInstallation-ConfiguringtheHiveMetastore
>> >> >>>>
>> >> >>>> Right now I am using the HiveServer1 configuration. These are the
>> >> >>>> properties in my hive-site.xml file:
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>javax.jdo.option.ConnectionURL</name>
>> >> >>>>
>> >> >>>>
>> >> >>>>
>> >> >>>> <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
>> >> >>>>   <description>JDBC connect string for a JDBC
>> >> >>>> metastore</description>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>javax.jdo.option.ConnectionDriverName</name>
>> >> >>>>   <value>com.mysql.jdbc.Driver</value>
>> >> >>>>   <description>Driver class name for a JDBC
>> >> >>>> metastore</description>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>javax.jdo.option.ConnectionUserName</name>
>> >> >>>>   <value>myuser</value>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>javax.jdo.option.ConnectionPassword</name>
>> >> >>>>   <value>mypassword</value>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>datanucleus.autoCreateSchema</name>
>> >> >>>>   <value>false</value>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>datanucleus.fixedDatastore</name>
>> >> >>>>   <value>true</value>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>hive.metastore.uris</name>
>> >> >>>>   <value>thrift://my_ipaddress:3306</value>
>> >> >>>>   <description>IP address (or fully-qualified domain name) and
>> >> >>>> port
>> >> >>>> of
>> >> >>>> the metastore host</description>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>hive.metastore.warehouse.dir</name>
>> >> >>>>   <value>/user/hive/warehouse</value>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> <property>
>> >> >>>>   <name>hive.server.thrift.port</name>
>> >> >>>>    <value>10000</value>
>> >> >>>> </property>
>> >> >>>>
>> >> >>>> I have started the hive-metastore service and the hive-server
>> >> >>>> service.
>> >> >>>> When I test using the hive console with debug messages, I get
>> >> >>>> this:
>> >> >>>>
>> >> >>>> hive -hiveconf hive.root.logger=INFO,console
>> >> >>>> Logging initialized using configuration in
>> >> >>>> file:/etc/hive/conf.dist/hive-log4j.properties
>> >> >>>> 13/04/02 17:10:22 INFO SessionState: Logging initialized using
>> >> >>>> configuration in file:/etc/hive/conf.dist/hive-log4j.properties
>> >> >>>> Hive history
>> >> >>>> file=/tmp/drosash/hive_job_log_drosash_201304021710_1321648790.txt
>> >> >>>> 13/04/02 17:10:22 INFO exec.HiveHistory: Hive history
>> >> >>>> file=/tmp/drosash/hive_job_log_drosash_201304021710_1321648790.txt
>> >> >>>> hive> show tables;
>> >> >>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=Driver.run>
>> >> >>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=TimeToSubmit>
>> >> >>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=compile>
>> >> >>>> 13/04/02 17:10:25 INFO parse.ParseDriver: Parsing command: show
>> >> >>>> tables
>> >> >>>> 13/04/02 17:10:25 INFO parse.ParseDriver: Parse Completed
>> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: Semantic Analysis Completed
>> >> >>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Initializing Self 0
>> >> >>>> OP
>> >> >>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Operator 0 OP
>> >> >>>> initialized
>> >> >>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Initialization Done
>> >> >>>> 0
>> >> >>>> OP
>> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: Returning Hive schema:
>> >> >>>> Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string,
>> >> >>>> comment:from
>> >> >>>> deserializer)], properties:null)
>> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: </PERFLOG method=compile
>> >> >>>> start=1364947825805 end=1364947826183 duration=378>
>> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: <PERFLOG method=Driver.execute>
>> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: Starting command: show tables
>> >> >>>> 13/04/02 17:10:26 INFO ql.Driver: </PERFLOG method=TimeToSubmit
>> >> >>>> start=1364947825805 end=1364947826197 duration=392>
>> >> >>>> 13/04/02 17:10:26 INFO hive.metastore: Trying to connect to
>> >> >>>> metastore
>> >> >>>> with URI thrift://10.240.81.72:3306
>> >> >>>> 13/04/02 17:10:26 INFO hive.metastore: Waiting 1 seconds before
>> >> >>>> next
>> >> >>>> connection attempt.
>> >> >>>> 13/04/02 17:10:27 INFO hive.metastore: Connected to metastore.
>> >> >>>> 13/04/02 17:10:28 WARN metastore.RetryingMetaStoreClient:
>> >> >>>> MetaStoreClient lost connection. Attempting to reconnect.
>> >> >>>> org.apache.thrift.transport.TTransportException
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>> >> >>>>         at
>> >> >>>> org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:354)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:412)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:399)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:736)
>> >> >>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>> >> >>>> Method)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
>> >> >>>>         at $Proxy9.getDatabase(Unknown Source)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
>> >> >>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
>> >> >>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>> >> >>>> Method)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> >> >>>> 13/04/02 17:10:29 INFO hive.metastore: Trying to connect to
>> >> >>>> metastore
>> >> >>>> with URI thrift://10.240.81.72:3306
>> >> >>>> 13/04/02 17:10:29 INFO hive.metastore: Waiting 1 seconds before
>> >> >>>> next
>> >> >>>> connection attempt.
>> >> >>>> 13/04/02 17:10:30 INFO hive.metastore: Connected to metastore.
>> >> >>>> FAILED: Error in metadata:
>> >> >>>> org.apache.thrift.transport.TTransportException
>> >> >>>> 13/04/02 17:10:31 ERROR exec.Task: FAILED: Error in metadata:
>> >> >>>> org.apache.thrift.transport.TTransportException
>> >> >>>> org.apache.hadoop.hive.ql.metadata.HiveException:
>> >> >>>> org.apache.thrift.transport.TTransportException
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1118)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
>> >> >>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
>> >> >>>>         at
>> >> >>>> org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
>> >> >>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>> >> >>>> Method)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> >> >>>> Caused by: org.apache.thrift.transport.TTransportException
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>> >> >>>>         at
>> >> >>>> org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:354)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:412)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:399)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:736)
>> >> >>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>> >> >>>> Method)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
>> >> >>>>         at $Proxy9.getDatabase(Unknown Source)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
>> >> >>>>         ... 18 more
>> >> >>>>
>> >> >>>> FAILED: Execution Error, return code 1 from
>> >> >>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >>>> 13/04/02 17:10:31 ERROR ql.Driver: FAILED: Execution Error, return
>> >> >>>> code
>> >> >>>> 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG method=Driver.execute
>> >> >>>> start=1364947826183 end=1364947831388 duration=5205>
>> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: <PERFLOG method=releaseLocks>
>> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG method=releaseLocks
>> >> >>>> start=1364947831388 end=1364947831388 duration=0>
>> >> >>>> 13/04/02 17:10:31 INFO exec.ListSinkOperator: 0 finished.
>> >> >>>> closing...
>> >> >>>> 13/04/02 17:10:31 INFO exec.ListSinkOperator: 0 forwarded 0 rows
>> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: <PERFLOG method=releaseLocks>
>> >> >>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG method=releaseLocks
>> >> >>>> start=1364947831391 end=1364947831391 duration=0>
>> >> >>>> hive>
>> >> >>>>
>> >> >>>>
>> >> >>>> When I try to connect to the metastore using a Java app, I get
>> >> >>>> this:
>> >> >>>>
>> >> >>>> log4j:ERROR Could not instantiate class
>> >> >>>> [org.apache.hadoop.log.metrics.EventCounter].
>> >> >>>> java.lang.ClassNotFoundException:
>> >> >>>> org.apache.hadoop.log.metrics.EventCounter
>> >> >>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> >> >>>>         at java.security.AccessController.doPrivileged(Native
>> >> >>>> Method)
>> >> >>>>         at
>> >> >>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> >> >>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> >> >>>>         at
>> >> >>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> >> >>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> >> >>>>         at java.lang.Class.forName0(Native Method)
>> >> >>>>         at java.lang.Class.forName(Class.java:169)
>> >> >>>>         at
>> >> >>>> org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:785)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
>> >> >>>>         at
>> >> >>>> org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:73)
>> >> >>>>         at
>> >> >>>> org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:242)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.thrift.transport.TIOStreamTransport.<clinit>(TIOStreamTransport.java:38)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:110)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
>> >> >>>>         at
>> >> >>>> java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >> >>>>         at
>> >> >>>> java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >> >>>>         at Conn.main(Conn.java:19)
>> >> >>>> log4j:ERROR Could not instantiate appender named "EventCounter".
>> >> >>>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> >> >>>> org/apache/hadoop/io/Writable
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:193)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:127)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:126)
>> >> >>>>         at
>> >> >>>>
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:121)
>> >> >>>>         at
>> >> >>>>
>> >> >>>> org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
>> >> >>>>         at
>> >> >>>> java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >> >>>>         at
>> >> >>>> java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >> >>>>         at Conn.main(Conn.java:19)
>> >> >>>> Caused by: java.lang.ClassNotFoundException:
>> >> >>>> org.apache.hadoop.io.Writable
>> >> >>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> >> >>>>         at java.security.AccessController.doPrivileged(Native
>> >> >>>> Method)
>> >> >>>>         at
>> >> >>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> >> >>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> >> >>>>         at
>> >> >>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> >> >>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> >> >>>>         ... 8 more
>> >> >>>>
>> >> >>>>
>> >> >>>> The code of my java app is this:
>> >> >>>>
>> >> >>>> import java.sql.*;
>> >> >>>>
>> >> >>>>
>> >> >>>> class Conn {
>> >> >>>>   public static void main (String[] args) throws Exception
>> >> >>>>   {
>> >> >>>>    Class.forName ("org.apache.hadoop.hive.jdbc.HiveDriver");
>> >> >>>>
>> >> >>>>
>> >> >>>> Connection conn = DriverManager.getConnection
>> >> >>>>                   ("jdbc:hive://localhost:10000/default", "", "");
>> >> >>>>
>> >> >>>>
>> >> >>>>    try {
>> >> >>>>      Statement stmt = conn.createStatement();
>> >> >>>>      try {
>> >> >>>>        //ResultSet rset = stmt.executeQuery("select BANNER from
>> >> >>>> SYS.V_$VERSION");
>> >> >>>>        ResultSet rset = stmt.executeQuery("show tables");
>> >> >>>>        try {
>> >> >>>>          while (rset.next())
>> >> >>>>            System.out.println (rset.getString(1));   // Print col
>> >> >>>> 1
>> >> >>>>        }
>> >> >>>>        finally {
>> >> >>>>           try { rset.close(); } catch (Exception ignore) {}
>> >> >>>>        }
>> >> >>>>      }
>> >> >>>>      finally {
>> >> >>>>        try { stmt.close(); } catch (Exception ignore) {}
>> >> >>>>      }
>> >> >>>>    }
>> >> >>>>    finally {
>> >> >>>>      try { conn.close(); } catch (Exception ignore) {}
>> >> >>>>    }
>> >> >>>>   }
>> >> >>>> }
>> >> >>>>
>> >> >>>> Do you have any idea what I am missing?
>> >> >>>>
>> >> >>>> Thank you very much in advance
>> >> >>>>
>> >> >>>> --
>> >> >>>>
>> >> >>>>
>> >> >>>>
>> >> >>>
>> >> >>>
>> >> >>> --
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>
>> >> >>
>> >> >
>> >> > --
>> >> >
>> >> >
>> >> >
>> >>
>> >>
>> >>
>> >> --
>> >> Harsh J
>> >>
>> >> --
>> >>
>> >>
>> >>
>> >
>> > --
>> >
>> >
>> >
>
>

