Posted to dev@hive.apache.org by Zhou Shuaifeng <zh...@huawei.com> on 2010/05/26 11:08:24 UTC

Hive Develop problem for help

 
Hi All,

When using Hadoop and Hive on SUSE Linux, I ran into a problem:

hive> show tables;
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask

My hadoop version is 0.20.2 and hive version is 0.5.0.

Has anybody run into the same problem? How can it be solved?

Thanks a lot.


Shuaifeng(Frank) Zhou
Huawei Technologies Co., Ltd.
Tel: +86-29-81873251
Fax: +86-29-81873238
Mobile: +86-13572288349
E-mail: zhoushuaifeng@huawei.com
www.huawei.com


RE: Hive Develop problem for help

Posted by Ning Zhang <nz...@facebook.com>.
Can you rerun the installation steps (starting from ant package)? They are described at http://wiki.apache.org/hadoop/Hive/AdminManual/Installation
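
For reference, a minimal sketch of those steps, assuming a Hive 0.5.0 source tree and an existing Hadoop 0.20.2 install; the paths below are placeholders, not taken from this thread:

cd /opt/hive-0.5.0-src                          # hypothetical location of the Hive source tree
ant package                                     # builds the distribution under build/dist
cd build/dist
export HIVE_HOME=$(pwd)
export HADOOP_HOME=/opt/hadoop/hadoop-0.20.2    # hypothetical Hadoop 0.20.2 install path
export PATH=$HIVE_HOME/bin:$PATH
hive                                            # start the CLI against the freshly built install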

Ning
________________________________________
From: Zhou Shuaifeng [zhoushuaifeng@huawei.com]
Sent: Wednesday, May 26, 2010 8:17 PM
To: hive-dev@hadoop.apache.org
Cc: ac.pippo@huawei.com; zhzf1511@huawei.com
Subject: Re: Hive Develop problem for help

Hi Ning,

Thank you very much. There are still problems; details are below. Please help
check where the problem is, thanks a lot.

The running services:

linux-01:/opt/hadoop/db-derby-10.5.3.0-bin/data # jps
9133 HMaster
9072 HQuorumPeer
19910 NetworkServerControl
8699 SecondaryNameNode
8544 NameNode
8778 JobTracker
20016 Jps
19929 RunJar

The command and error info are below:

linux-01:/opt/hadoop/db-derby-10.5.3.0-bin/data # hive
Hive history file=/tmp/root/hive_job_log_root_201005271108_742407423.txt
hive> show tables;
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
hive>

The hive-site.xml settings are below:

<?xml version="1.0"?>
<configuration>
<property>
  <name>hive.metastore.local</name>
  <value>true</value>
  <description>controls whether to connect to a remote metastore server or
open a new metastore server in the Hive client JVM</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://2.1.37.110:1527/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.ClientDriver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
</configuration>

The logs are below:

2010-05-27 10:54:55,890 ERROR DataNucleus.Plugin
(Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.
eclipse.core.resources" but it cannot be resolved.
2010-05-27 10:54:55,890 ERROR DataNucleus.Plugin
(Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.
eclipse.core.resources" but it cannot be resolved.
2010-05-27 10:54:55,894 ERROR DataNucleus.Plugin
(Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.
eclipse.core.runtime" but it cannot be resolved.
2010-05-27 10:54:55,894 ERROR DataNucleus.Plugin
(Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.
eclipse.core.runtime" but it cannot be resolved.
2010-05-27 10:54:55,895 ERROR DataNucleus.Plugin
(Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.
eclipse.text" but it cannot be resolved.
2010-05-27 10:54:55,895 ERROR DataNucleus.Plugin
(Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.
eclipse.text" but it cannot be resolved.
2010-05-27 10:54:56,158 ERROR exec.DDLTask
(SessionState.java:printError(248)) - FAILED: Error in metadata:
javax.jdo.JDOFatalInternalException: Error creating transactional connection
factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
org.apache.hadoop.hive.ql.metadata.HiveException:
javax.jdo.JDOFatalInternalException: Error creating transactional connection
factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
        at
org.apache.hadoop.hive.ql.metadata.Hive.getTablesForDb(Hive.java:441)
        at
org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:423)
        at
org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:410)
        at
org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:404)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:159)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
        at
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:64)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:582)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:462)
        at org.apache.hadoop.hive.ql.Driver.runCommand(Driver.java:324)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:312)
        at
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
        at
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39
)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional
connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
        at
org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(Nucl
eusJDOHelper.java:395)
        at
org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPers
istenceManagerFactory.java:547)
        at
org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactor
y(JDOPersistenceManagerFactory.java:175)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39
)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
        at
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHe
lper.java:1159)
        at
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
        at
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
        at
org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:161)
        at
org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectSto
re.java:178)
        at
org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:122
)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.
java:101)
        at
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
        at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStor
e.java:134)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(Hi
veMetaStore.java:150)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore
.java:122)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaSto
re.java:104)
        at
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreCli
ent.java:75)
        at
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:828)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:838)
        at
org.apache.hadoop.hive.ql.metadata.Hive.getTablesForDb(Hive.java:439)
        ... 18 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAcces
sorImpl.java:39)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruc
torAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(No
nManagedPluginRegistry.java:576)
        at
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager
.java:324)
        at
org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:
190)
        at
org.datanucleus.store.mapped.MappedStoreManager.<init>(MappedStoreManager.ja
va:139)
        at
org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:265)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAcces
sorImpl.java:39)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruc
torAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(No
nManagedPluginRegistry.java:576)
        at
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager
.java:300)
        at
org.datanucleus.store.FederationManager.initialiseStoreManager(FederationMan
ager.java:106)
        at org.datanucleus.store.FederationManager.<init>(FederationManager.
java:68)
        at
org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManage
rFactoryImpl.java:152)
        at
org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPers
istenceManagerFactory.java:529)
        ... 43 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke
the "default" plugin to create a ConnectionPool gave an error : Invalid
datastore driver class "org.apache.derby.jdbc.ClientDriver" : maybe you
havent specified the JDBC driver JAR in your CLASSPATH, or the name of the
class is incorrect.
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(Connectio
nFactoryImpl.java:169)
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryIm
pl.java:91)
        ... 62 more
Caused by: org.datanucleus.exceptions.NucleusUserException: Invalid
datastore driver class "org.apache.derby.jdbc.ClientDriver" : maybe you
havent specified the JDBC driver JAR in your CLASSPATH, or the name of the
class is incorrect.
        at
org.datanucleus.store.rdbms.datasource.DriverManagerDataSource.<init>(Driver
ManagerDataSource.java:99)
        at
org.datanucleus.store.rdbms.datasource.DefaultDataSourceFactory.makePooledDa
taSource(DefaultDataSourceFactory.java:39)
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(Connectio
nFactoryImpl.java:148)
        ... 63 more
Caused by: org.datanucleus.exceptions.ClassNotResolvedException: Class "org.
apache.derby.jdbc.ClientDriver" was not found in the CLASSPATH. Please check
your specification and your CLASSPATH.
        at
org.datanucleus.JDOClassLoaderResolver.classForName(JDOClassLoaderResolver.j
ava:250)
        at
org.datanucleus.JDOClassLoaderResolver.classForName(JDOClassLoaderResolver.j
ava:415)
        at
org.datanucleus.store.rdbms.datasource.DriverManagerDataSource.<init>(Driver
ManagerDataSource.java:87)
        ... 65 more

Shuaifeng(Frank) Zhou
Huawei Technologies Co., Ltd.
Tel: +86-29-81873251
Fax: +86-29-81873238
Mobile: +86-13572288349
E-mail: zhoushuaifeng@huawei.com
www.huawei.com

-----Original Message-----
From: Ning Zhang [mailto:nzhang@facebook.com]
Sent: May 26, 2010 21:50
To: hive-dev@hadoop.apache.org
Subject: Re: Hive Develop problem for help

JDO is used for storing Java objects in the metastore. It may be a
configuration error. Can you double-check your metastore configuration in
hive-site.xml? A more detailed exception can be found in
/tmp/<userid>/hive.log.

On May 26, 2010, at 2:08 AM, Zhou Shuaifeng wrote:

>
> Hi All,
>
> When using Hadoop and Hive on SUSE Linux, I ran into a problem:
>
> hive> show tables;
> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
> creating transactional connection factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
>
> My hadoop version is 0.20.2 and hive version is 0.5.0.
>
> Has anybody run into the same problem? How can it be solved?
>
> Thanks a lot.
>
>
> Shuaifeng(Frank) Zhou
> Huawei Technologies Co., Ltd.
> Tel: +86-29-81873251
> Fax: +86-29-81873238
> Mobile: +86-13572288349
> E-mail: zhoushuaifeng@huawei.com
> www.huawei.com
>


Re: Hive Develop problem for help

Posted by Vinithra Varadharajan <vi...@cloudera.com>.
Hi,

It seems you don't have the JDBC driver for the Derby database
(org.apache.derby.jdbc.ClientDriver) on your classpath or in the Hive lib
directory. The required steps are described in
http://wiki.apache.org/hadoop/HiveDerbyServerMode -> Copy Derby Jar Files.
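
A minimal sketch of that "Copy Derby Jar Files" step, assuming HIVE_HOME points at your Hive 0.5.0 install and using the db-derby-10.5.3.0-bin path from your prompt (JAR names follow the standard Derby binary distribution):

cp /opt/hadoop/db-derby-10.5.3.0-bin/lib/derbyclient.jar $HIVE_HOME/lib/   # network client driver (org.apache.derby.jdbc.ClientDriver)
cp /opt/hadoop/db-derby-10.5.3.0-bin/lib/derbytools.jar  $HIVE_HOME/lib/   # ij and other client tools
# Restart the Hive CLI afterwards so the new JARs are on its classpath.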

HTH!
-Vinithra

2010/5/26 Zhou Shuaifeng <zh...@huawei.com>

> Hi Ning,
>
> Thank you very much. There are still problems; details are below. Please
> help check where the problem is, thanks a lot.
>
> The running services:
>
> linux-01:/opt/hadoop/db-derby-10.5.3.0-bin/data # jps
> 9133 HMaster
> 9072 HQuorumPeer
> 19910 NetworkServerControl
> 8699 SecondaryNameNode
> 8544 NameNode
> 8778 JobTracker
> 20016 Jps
> 19929 RunJar
>
> The command and error info are below:
>
> linux-01:/opt/hadoop/db-derby-10.5.3.0-bin/data # hive
> Hive history file=/tmp/root/hive_job_log_root_201005271108_742407423.txt
> hive> show tables;
> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
> creating transactional connection factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
> hive>
>
> The hive-site.xml settings are below:
>
> <?xml version="1.0"?>
> <configuration>
> <property>
>  <name>hive.metastore.local</name>
>  <value>true</value>
>  <description>controls whether to connect to a remote metastore server or
> open a new metastore server in the Hive client JVM</description>
> </property>
>
> <property>
>  <name>javax.jdo.option.ConnectionURL</name>
>  <value>jdbc:derby://2.1.37.110:1527/metastore_db;create=true</value>
>  <description>JDBC connect string for a JDBC metastore</description>
> </property>
>
> <property>
>  <name>javax.jdo.option.ConnectionDriverName</name>
>  <value>org.apache.derby.jdbc.ClientDriver</value>
>  <description>Driver class name for a JDBC metastore</description>
> </property>
> </configuration>
>
> The logs are below:
>
> 2010-05-27 10:54:55,890 ERROR DataNucleus.Plugin
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> "org.
> eclipse.core.resources" but it cannot be resolved.
> 2010-05-27 10:54:55,890 ERROR DataNucleus.Plugin
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> "org.
> eclipse.core.resources" but it cannot be resolved.
> 2010-05-27 10:54:55,894 ERROR DataNucleus.Plugin
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> "org.
> eclipse.core.runtime" but it cannot be resolved.
> 2010-05-27 10:54:55,894 ERROR DataNucleus.Plugin
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> "org.
> eclipse.core.runtime" but it cannot be resolved.
> 2010-05-27 10:54:55,895 ERROR DataNucleus.Plugin
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> "org.
> eclipse.text" but it cannot be resolved.
> 2010-05-27 10:54:55,895 ERROR DataNucleus.Plugin
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> "org.
> eclipse.text" but it cannot be resolved.
> 2010-05-27 10:54:56,158 ERROR exec.DDLTask
> (SessionState.java:printError(248)) - FAILED: Error in metadata:
> javax.jdo.JDOFatalInternalException: Error creating transactional
> connection
> factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> org.apache.hadoop.hive.ql.metadata.HiveException:
> javax.jdo.JDOFatalInternalException: Error creating transactional
> connection
> factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
>         at
> org.apache.hadoop.hive.ql.metadata.Hive.getTablesForDb(Hive.java:441)
>        at
> org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:423)
>        at
> org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:410)
>        at
> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:404)
>        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:159)
>        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
>        at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:64)
>        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:582)
>        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:462)
>        at org.apache.hadoop.hive.ql.Driver.runCommand(Driver.java:324)
>        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:312)
>        at
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
>        at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
>        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39
> )
>        at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
> .java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: javax.jdo.JDOFatalInternalException: Error creating
> transactional
> connection factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
>         at
>
> org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(Nucl
> eusJDOHelper.java:395)
>        at
>
> org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPers
> istenceManagerFactory.java:547)
>        at
>
> org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactor
> y(JDOPersistenceManagerFactory.java:175)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39
> )
>        at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
> .java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
>        at
>
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHe
> lper.java:1159)
>        at
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
>        at
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
>        at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:161)
>        at
>
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectSto
> re.java:178)
>        at
>
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:122
> )
>        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.
> java:101)
>        at
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>        at
>
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>        at
>
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStor
> e.java:134)
>        at
>
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(Hi
> veMetaStore.java:150)
>        at
>
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore
> .java:122)
>        at
>
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaSto
> re.java:104)
>        at
>
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreCli
> ent.java:75)
>        at
>
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:828)
>        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:838)
>        at
> org.apache.hadoop.hive.ql.metadata.Hive.getTablesForDb(Hive.java:439)
>        ... 18 more
> Caused by: java.lang.reflect.InvocationTargetException
>        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>        at
>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAcces
> sorImpl.java:39)
>        at
>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruc
> torAccessorImpl.java:27)
>        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>        at
>
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(No
> nManagedPluginRegistry.java:576)
>        at
>
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager
> .java:324)
>        at
>
> org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:
> 190)
>        at
>
> org.datanucleus.store.mapped.MappedStoreManager.<init>(MappedStoreManager.ja
> va:139)
>        at
> org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:265)
>        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>        at
>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAcces
> sorImpl.java:39)
>        at
>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruc
> torAccessorImpl.java:27)
>        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>        at
>
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(No
> nManagedPluginRegistry.java:576)
>        at
>
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager
> .java:300)
>        at
>
> org.datanucleus.store.FederationManager.initialiseStoreManager(FederationMan
> ager.java:106)
>        at org.datanucleus.store.FederationManager.<init>(FederationManager.
> java:68)
>        at
>
> org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManage
> rFactoryImpl.java:152)
>        at
>
> org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPers
> istenceManagerFactory.java:529)
>        ... 43 more
> Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke
> the "default" plugin to create a ConnectionPool gave an error : Invalid
> datastore driver class "org.apache.derby.jdbc.ClientDriver" : maybe you
> havent specified the JDBC driver JAR in your CLASSPATH, or the name of the
> class is incorrect.
>        at
>
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(Connectio
> nFactoryImpl.java:169)
>        at
>
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryIm
> pl.java:91)
>        ... 62 more
> Caused by: org.datanucleus.exceptions.NucleusUserException: Invalid
> datastore driver class "org.apache.derby.jdbc.ClientDriver" : maybe you
> havent specified the JDBC driver JAR in your CLASSPATH, or the name of the
> class is incorrect.
>        at
>
> org.datanucleus.store.rdbms.datasource.DriverManagerDataSource.<init>(Driver
> ManagerDataSource.java:99)
>        at
>
> org.datanucleus.store.rdbms.datasource.DefaultDataSourceFactory.makePooledDa
> taSource(DefaultDataSourceFactory.java:39)
>        at
>
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(Connectio
> nFactoryImpl.java:148)
>        ... 63 more
> Caused by: org.datanucleus.exceptions.ClassNotResolvedException: Class
> "org.
> apache.derby.jdbc.ClientDriver" was not found in the CLASSPATH. Please
> check
> your specification and your CLASSPATH.
>        at
>
> org.datanucleus.JDOClassLoaderResolver.classForName(JDOClassLoaderResolver.j
> ava:250)
>        at
>
> org.datanucleus.JDOClassLoaderResolver.classForName(JDOClassLoaderResolver.j
> ava:415)
>        at
>
> org.datanucleus.store.rdbms.datasource.DriverManagerDataSource.<init>(Driver
> ManagerDataSource.java:87)
>        ... 65 more
>
> Shuaifeng(Frank) Zhou
> Huawei Technologies Co., Ltd.
> Tel: +86-29-81873251
> Fax: +86-29-81873238
> Mobile: +86-13572288349
> E-mail: zhoushuaifeng@huawei.com
> www.huawei.com
>
>
> -----Original Message-----
> From: Ning Zhang [mailto:nzhang@facebook.com]
> Sent: May 26, 2010 21:50
> To: hive-dev@hadoop.apache.org
> Subject: Re: Hive Develop problem for help
>
> JDO is used for storing Java objects in the metastore. It may be a
> configuration error. Can you double-check your metastore configuration in
> hive-site.xml? A more detailed exception can be found in
> /tmp/<userid>/hive.log.
>
> On May 26, 2010, at 2:08 AM, Zhou Shuaifeng wrote:
>
> >
> > Hi All,
> >
> > When using Hadoop and Hive on SUSE Linux, I ran into a problem:
> >
> > hive> show tables;
> > FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
> > creating transactional connection factory
> > NestedThrowables:
> > java.lang.reflect.InvocationTargetException
> > FAILED: Execution Error, return code 1 from
> > org.apache.hadoop.hive.ql.exec.DDLTask
> >
> > My hadoop version is 0.20.2 and hive version is 0.5.0.
> >
> > Has anybody run into the same problem? How can it be solved?
> >
> > Thanks a lot.
> >
> >
> > Shuaifeng(Frank) Zhou
> > Huawei Technologies Co., Ltd.
> > Tel: +86-29-81873251
> > Fax: +86-29-81873238
> > Mobile: +86-13572288349
> > E-mail: zhoushuaifeng@huawei.com
> > www.huawei.com
> >
>
>

Re: Hive Develop problem for help

Posted by Zhou Shuaifeng <zh...@huawei.com>.
Hi Ning,

Thank you very much. There are still problems; details are below. Please help
check where the problem is, thanks a lot.

The running services:

linux-01:/opt/hadoop/db-derby-10.5.3.0-bin/data # jps
9133 HMaster
9072 HQuorumPeer
19910 NetworkServerControl
8699 SecondaryNameNode
8544 NameNode
8778 JobTracker
20016 Jps
19929 RunJar

The command and error info are below:

linux-01:/opt/hadoop/db-derby-10.5.3.0-bin/data # hive
Hive history file=/tmp/root/hive_job_log_root_201005271108_742407423.txt
hive> show tables;
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
hive>

The hive-site.xml settings are below:

<?xml version="1.0"?>
<configuration>
<property>
  <name>hive.metastore.local</name>
  <value>true</value>
  <description>controls whether to connect to a remote metastore server or
open a new metastore server in the Hive client JVM</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://2.1.37.110:1527/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.ClientDriver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
</configuration>
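
Two quick checks that help narrow this kind of failure down, sketched against the layout above (the host and port come from the ConnectionURL; the ij invocation assumes the standard Derby 10.5.3.0 binary distribution):

# Is the Derby client driver visible to Hive?
ls $HIVE_HOME/lib | grep -i derby

# Is the Derby network server reachable at all?
java -cp /opt/hadoop/db-derby-10.5.3.0-bin/lib/derbyclient.jar:/opt/hadoop/db-derby-10.5.3.0-bin/lib/derbytools.jar org.apache.derby.tools.ij
ij> connect 'jdbc:derby://2.1.37.110:1527/metastore_db;create=true';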

The logs are below:

2010-05-27 10:54:55,890 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2010-05-27 10:54:55,890 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2010-05-27 10:54:55,894 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2010-05-27 10:54:55,894 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2010-05-27 10:54:55,895 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2010-05-27 10:54:55,895 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2010-05-27 10:54:56,158 ERROR exec.DDLTask (SessionState.java:printError(248)) - FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.hive.ql.metadata.Hive.getTablesForDb(Hive.java:441)
        at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:423)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:410)
        at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:404)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:159)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:64)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:582)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:462)
        at org.apache.hadoop.hive.ql.Driver.runCommand(Driver.java:324)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:312)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
        at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:395)
        at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:547)
        at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:175)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:161)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:178)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:122)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:101)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:134)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:150)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:122)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:75)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:828)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:838)
        at org.apache.hadoop.hive.ql.metadata.Hive.getTablesForDb(Hive.java:439)
        ... 18 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:324)
        at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:190)
        at org.datanucleus.store.mapped.MappedStoreManager.<init>(MappedStoreManager.java:139)
        at org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:265)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
        at org.datanucleus.store.FederationManager.initialiseStoreManager(FederationManager.java:106)
        at org.datanucleus.store.FederationManager.<init>(FederationManager.java:68)
        at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:152)
        at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:529)
        ... 43 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "default" plugin to create a ConnectionPool gave an error : Invalid datastore driver class "org.apache.derby.jdbc.ClientDriver" : maybe you havent specified the JDBC driver JAR in your CLASSPATH, or the name of the class is incorrect.
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:169)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:91)
        ... 62 more
Caused by: org.datanucleus.exceptions.NucleusUserException: Invalid datastore driver class "org.apache.derby.jdbc.ClientDriver" : maybe you havent specified the JDBC driver JAR in your CLASSPATH, or the name of the class is incorrect.
        at org.datanucleus.store.rdbms.datasource.DriverManagerDataSource.<init>(DriverManagerDataSource.java:99)
        at org.datanucleus.store.rdbms.datasource.DefaultDataSourceFactory.makePooledDataSource(DefaultDataSourceFactory.java:39)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:148)
        ... 63 more
Caused by: org.datanucleus.exceptions.ClassNotResolvedException: Class "org.apache.derby.jdbc.ClientDriver" was not found in the CLASSPATH. Please check your specification and your CLASSPATH.
        at org.datanucleus.JDOClassLoaderResolver.classForName(JDOClassLoaderResolver.java:250)
        at org.datanucleus.JDOClassLoaderResolver.classForName(JDOClassLoaderResolver.java:415)
        at org.datanucleus.store.rdbms.datasource.DriverManagerDataSource.<init>(DriverManagerDataSource.java:87)
        ... 65 more

Shuaifeng(Frank) Zhou
Huawei Technologies Co., Ltd.
Tel: +86-29-81873251
Fax: +86-29-81873238
Mobile: +86-13572288349
E-mail: zhoushuaifeng@huawei.com
www.huawei.com

-----Original Message-----
From: Ning Zhang [mailto:nzhang@facebook.com]
Sent: May 26, 2010 21:50
To: hive-dev@hadoop.apache.org
Subject: Re: Hive Develop problem for help

JDO is used for storing Java objects in the metastore. It may be a
configuration error. Can you double-check your metastore configuration in
hive-site.xml? A more detailed exception can be found in /tmp/<userid>/hive.log.

On May 26, 2010, at 2:08 AM, Zhou Shuaifeng wrote:

>
> Hi All,
>
> When using Hadoop and Hive on SUSE Linux, I ran into a problem:
>
> hive> show tables;
> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
> creating transactional connection factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
>
> My hadoop version is 0.20.2 and hive version is 0.5.0.
>
> Has anybody run into the same problem? How can it be solved?
>
> Thanks a lot.
>
>
> Shuaifeng(Frank) Zhou
> Huawei Technologies Co., Ltd.
> Tel: +86-29-81873251
> Fax: +86-29-81873238
> Mobile: +86-13572288349
> E-mail: zhoushuaifeng@huawei.com
> www.huawei.com
>


Re: Hive Develop problem for help

Posted by Ning Zhang <nz...@facebook.com>.
JDO is used for storing Java objects in the metastore. It may be a configuration error. Can you double-check your metastore configuration in hive-site.xml? A more detailed exception can be found in /tmp/<userid>/hive.log.
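
For example, the full stack trace usually sits near the end of that log; a couple of one-liners for pulling it out, using the /tmp/<userid>/hive.log convention above with root as the user from the earlier session:

tail -n 200 /tmp/root/hive.log            # last entries, including the complete stack trace
grep -n "Caused by" /tmp/root/hive.log    # the innermost "Caused by" is usually the real problem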

On May 26, 2010, at 2:08 AM, Zhou Shuaifeng wrote:

> 
> Hi All,
> 
> When using Hadoop and Hive on SUSE Linux, I ran into a problem:
> 
> hive> show tables;
> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
> creating transactional connection factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
> 
> My hadoop version is 0.20.2 and hive version is 0.5.0.
> 
> Has anybody run into the same problem? How can it be solved?
> 
> Thanks a lot.
> 
> 
> Shuaifeng(Frank) Zhou
> Huawei Technologies Co., Ltd.
> Tel: +86-29-81873251
> Fax: +86-29-81873238
> Mobile: +86-13572288349
> E-mail: zhoushuaifeng@huawei.com
> www.huawei.com
>