Posted to issues@drill.apache.org by "Anton Gozhiy (Jira)" <ji...@apache.org> on 2020/03/04 18:01:00 UTC

[jira] [Updated] (DRILL-7624) When Hive plugin is enabled with default config, cannot execute any SQL query

     [ https://issues.apache.org/jira/browse/DRILL-7624?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Anton Gozhiy updated DRILL-7624:
--------------------------------
    Summary: When Hive plugin is enabled with default config, cannot execute any SQL query  (was: when enabled Hive plugin with default config, can not execute any SQL query)

> When Hive plugin is enabled with default config, cannot execute any SQL query
> -----------------------------------------------------------------------------
>
>                 Key: DRILL-7624
>                 URL: https://issues.apache.org/jira/browse/DRILL-7624
>             Project: Apache Drill
>          Issue Type: Bug
>    Affects Versions: 1.18.0
>            Reporter: Dmytro Kondriukov
>            Priority: Major
>
> *Preconditions:*
> Enable "hive" plugin, without editing configuration (default config)
> *Steps:*
> Run any valid query, e.g.:
> {code:sql}
> SELECT 100; 
> {code}
> *Expected result:* The query executes successfully.
> *Actual result:* "UserRemoteException : INTERNAL_ERROR ERROR: Failure setting up Hive metastore client."
> {noformat}
> org.apache.drill.common.exceptions.UserRemoteException: INTERNAL_ERROR ERROR: Failure setting up Hive metastore client. 
> Plugin name hive 
> Plugin class org.apache.drill.exec.store.hive.HiveStoragePlugin 
> Please, refer to logs for more information. 
> [Error Id: db44f5c3-5136-4fc6-8158-50b63d775fe0 ]
> {noformat}
>  
> {noformat}
>   (org.apache.drill.common.exceptions.ExecutionSetupException) Failure setting up Hive metastore client.
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():78
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (org.apache.hadoop.hive.metastore.api.MetaException) Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=../sample-data/drill_hive_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
> java.sql.SQLException: Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
> 	at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
> 	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
> 	at org.apache.derby.jdbc.Driver20.connect(Unknown Source)
> 	at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:208)
> 	at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
> 	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
> 	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
> 	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
> 	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:297)
> 	at sun.reflect.GeneratedConstructorAccessor111.newInstance(Unknown Source)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
> 	at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
> 	at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
> 	at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:422)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:817)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:334)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:213)
> 	at sun.reflect.GeneratedMethodAccessor47.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:519)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:548)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:403)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:340)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:301)
> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:624)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:590)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:584)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:427)
> 	at sun.reflect.GeneratedMethodAccessor48.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6900)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:129)
> 	at org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>(DrillHiveMetaStoreClient.java:54)
> 	at org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching(DrillHiveMetaStoreClientFactory.java:101)
> 	at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>(HiveSchemaFactory.java:76)
> 	at org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>(HiveStoragePlugin.java:77)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.drill.exec.store.ClassicConnectorLocator.create(ClassicConnectorLocator.java:274)
> 	at org.apache.drill.exec.store.ConnectorHandle.newInstance(ConnectorHandle.java:98)
> 	at org.apache.drill.exec.store.PluginHandle.plugin(PluginHandle.java:143)
> 	at org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next(StoragePluginRegistryImpl.java:616)
> 	at org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next(StoragePluginRegistryImpl.java:601)
> 	at org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules(SqlHandlerConfig.java:48)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:367)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:351)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:338)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel(DefaultSqlHandler.java:663)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert(DefaultSqlHandler.java:198)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:169)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan(DrillSqlWorker.java:283)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:163)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan(DrillSqlWorker.java:140)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:93)
> 	at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:590)
> 	at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:275)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.sql.SQLException: Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	... 89 more
> Caused by: java.sql.SQLException: Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
> 	... 86 more
> Caused by: ERROR XBM0H: Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
> 	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
> 	at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
> 	... 86 more
> ------
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>():83
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy():92
>     org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler():6900
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():164
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():129
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>():54
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching():101
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():76
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (org.apache.hadoop.hive.metastore.api.MetaException) Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=../sample-data/drill_hive_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
> java.sql.SQLException: Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
> 	at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
> 	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
> 	at org.apache.derby.jdbc.Driver20.connect(Unknown Source)
> 	at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:208)
> 	at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
> 	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
> 	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
> 	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
> 	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:297)
> 	at sun.reflect.GeneratedConstructorAccessor111.newInstance(Unknown Source)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
> 	at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
> 	at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
> 	at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:422)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:817)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:334)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:213)
> 	at sun.reflect.GeneratedMethodAccessor47.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:519)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:548)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:403)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:340)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:301)
> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:624)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:590)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:584)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:427)
> 	at sun.reflect.GeneratedMethodAccessor48.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6900)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:129)
> 	at org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>(DrillHiveMetaStoreClient.java:54)
> 	at org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching(DrillHiveMetaStoreClientFactory.java:101)
> 	at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>(HiveSchemaFactory.java:76)
> 	at org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>(HiveStoragePlugin.java:77)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.drill.exec.store.ClassicConnectorLocator.create(ClassicConnectorLocator.java:274)
> 	at org.apache.drill.exec.store.ConnectorHandle.newInstance(ConnectorHandle.java:98)
> 	at org.apache.drill.exec.store.PluginHandle.plugin(PluginHandle.java:143)
> 	at org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next(StoragePluginRegistryImpl.java:616)
> 	at org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next(StoragePluginRegistryImpl.java:601)
> 	at org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules(SqlHandlerConfig.java:48)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:367)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:351)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:338)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel(DefaultSqlHandler.java:663)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert(DefaultSqlHandler.java:198)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:169)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan(DrillSqlWorker.java:283)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:163)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan(DrillSqlWorker.java:140)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:93)
> 	at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:590)
> 	at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:275)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.sql.SQLException: Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	... 89 more
> Caused by: java.sql.SQLException: Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
> 	... 86 more
> Caused by: ERROR XBM0H: Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
> 	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
> 	at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
> 	... 86 more
> ------
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal():211
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke():107
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>():79
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy():92
>     org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler():6900
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():164
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():129
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>():54
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching():101
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():76
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (javax.jdo.JDOFatalDataStoreException) Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=../sample-data/drill_hive_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
> java.sql.SQLException: Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
> 	at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
> 	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
> 	at org.apache.derby.jdbc.Driver20.connect(Unknown Source)
> 	at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:208)
> 	at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
> 	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
> 	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
> 	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
> 	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:297)
> 	at sun.reflect.GeneratedConstructorAccessor111.newInstance(Unknown Source)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
> 	at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
> 	at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
> 	at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:422)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:817)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:334)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:213)
> 	at sun.reflect.GeneratedMethodAccessor47.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:519)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:548)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:403)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:340)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:301)
> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:624)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:590)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:584)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:427)
> 	at sun.reflect.GeneratedMethodAccessor48.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6900)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:129)
> 	at org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>(DrillHiveMetaStoreClient.java:54)
> 	at org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching(DrillHiveMetaStoreClientFactory.java:101)
> 	at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>(HiveSchemaFactory.java:76)
> 	at org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>(HiveStoragePlugin.java:77)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.drill.exec.store.ClassicConnectorLocator.create(ClassicConnectorLocator.java:274)
> 	at org.apache.drill.exec.store.ConnectorHandle.newInstance(ConnectorHandle.java:98)
> 	at org.apache.drill.exec.store.PluginHandle.plugin(PluginHandle.java:143)
> 	at org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next(StoragePluginRegistryImpl.java:616)
> 	at org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next(StoragePluginRegistryImpl.java:601)
> 	at org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules(SqlHandlerConfig.java:48)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:367)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:351)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:338)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel(DefaultSqlHandler.java:663)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert(DefaultSqlHandler.java:198)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:169)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan(DrillSqlWorker.java:283)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:163)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan(DrillSqlWorker.java:140)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:93)
> 	at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:590)
> 	at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:275)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.sql.SQLException: Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	... 89 more
> Caused by: java.sql.SQLException: Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
> 	... 86 more
> Caused by: ERROR XBM0H: Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
> 	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
> 	at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
> 	... 86 more
> ------
>     org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException():529
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration():830
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory():334
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory():213
>     sun.reflect.GeneratedMethodAccessor47.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     javax.jdo.JDOHelper$16.run():1965
>     java.security.AccessController.doPrivileged():-2
>     javax.jdo.JDOHelper.invoke():1960
>     javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation():1166
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():808
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():701
>     org.apache.hadoop.hive.metastore.ObjectStore.getPMF():519
>     org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager():548
>     org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper():403
>     org.apache.hadoop.hive.metastore.ObjectStore.initialize():340
>     org.apache.hadoop.hive.metastore.ObjectStore.setConf():301
>     org.apache.hadoop.util.ReflectionUtils.setConf():76
>     org.apache.hadoop.util.ReflectionUtils.newInstance():136
>     org.apache.hadoop.hive.metastore.RawStoreProxy.<init>():58
>     org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy():67
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf():624
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf():590
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS():584
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB():655
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init():427
>     sun.reflect.GeneratedMethodAccessor48.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal():148
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke():107
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>():79
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy():92
>     org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler():6900
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():164
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():129
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>():54
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching():101
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():76
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (java.sql.SQLException) Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=../sample-data/drill_hive_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
> java.sql.SQLException: Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
> 	at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
> 	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
> 	at org.apache.derby.jdbc.Driver20.connect(Unknown Source)
> 	at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:208)
> 	at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
> 	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
> 	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
> 	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
> 	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:297)
> 	at sun.reflect.GeneratedConstructorAccessor111.newInstance(Unknown Source)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
> 	at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
> 	at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
> 	at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:422)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:817)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:334)
> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:213)
> 	at sun.reflect.GeneratedMethodAccessor47.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:519)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:548)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:403)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:340)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:301)
> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:624)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:590)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:584)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:427)
> 	at sun.reflect.GeneratedMethodAccessor48.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6900)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:129)
> 	at org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>(DrillHiveMetaStoreClient.java:54)
> 	at org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching(DrillHiveMetaStoreClientFactory.java:101)
> 	at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>(HiveSchemaFactory.java:76)
> 	at org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>(HiveStoragePlugin.java:77)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.drill.exec.store.ClassicConnectorLocator.create(ClassicConnectorLocator.java:274)
> 	at org.apache.drill.exec.store.ConnectorHandle.newInstance(ConnectorHandle.java:98)
> 	at org.apache.drill.exec.store.PluginHandle.plugin(PluginHandle.java:143)
> 	at org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next(StoragePluginRegistryImpl.java:616)
> 	at org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next(StoragePluginRegistryImpl.java:601)
> 	at org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules(SqlHandlerConfig.java:48)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:367)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:351)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:338)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel(DefaultSqlHandler.java:663)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert(DefaultSqlHandler.java:198)
> 	at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:169)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan(DrillSqlWorker.java:283)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:163)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan(DrillSqlWorker.java:140)
> 	at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:93)
> 	at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:590)
> 	at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:275)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.sql.SQLException: Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	... 89 more
> Caused by: java.sql.SQLException: Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
> 	at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
> 	... 86 more
> Caused by: ERROR XBM0H: Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
> 	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
> 	at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
> 	at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
> 	... 86 more
> ------
>     sun.reflect.GeneratedConstructorAccessor113.newInstance():-1
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     com.jolbox.bonecp.PoolUtil.generateSQLException():192
>     com.jolbox.bonecp.BoneCP.<init>():422
>     com.jolbox.bonecp.BoneCPDataSource.getConnection():120
>     org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection():483
>     org.datanucleus.store.rdbms.RDBMSStoreManager.<init>():297
>     sun.reflect.GeneratedConstructorAccessor111.newInstance():-1
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension():606
>     org.datanucleus.plugin.PluginManager.createExecutableExtension():301
>     org.datanucleus.NucleusContextHelper.createStoreManagerForProperties():133
>     org.datanucleus.PersistenceNucleusContextImpl.initialise():422
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration():817
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory():334
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory():213
>     sun.reflect.GeneratedMethodAccessor47.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     javax.jdo.JDOHelper$16.run():1965
>     java.security.AccessController.doPrivileged():-2
>     javax.jdo.JDOHelper.invoke():1960
>     javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation():1166
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():808
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():701
>     org.apache.hadoop.hive.metastore.ObjectStore.getPMF():519
>     org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager():548
>     org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper():403
>     org.apache.hadoop.hive.metastore.ObjectStore.initialize():340
>     org.apache.hadoop.hive.metastore.ObjectStore.setConf():301
>     org.apache.hadoop.util.ReflectionUtils.setConf():76
>     org.apache.hadoop.util.ReflectionUtils.newInstance():136
>     org.apache.hadoop.hive.metastore.RawStoreProxy.<init>():58
>     org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy():67
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf():624
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf():590
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS():584
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB():655
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init():427
>     sun.reflect.GeneratedMethodAccessor48.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal():148
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke():107
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>():79
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy():92
>     org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler():6900
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():164
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():129
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>():54
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching():101
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():76
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (java.sql.SQLException) Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
>     org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException():-1
>     org.apache.derby.impl.jdbc.Util.newEmbedSQLException():-1
>     org.apache.derby.impl.jdbc.Util.seeNextException():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.createDatabase():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.<init>():-1
>     org.apache.derby.impl.jdbc.EmbedConnection40.<init>():-1
>     org.apache.derby.jdbc.Driver40.getNewEmbedConnection():-1
>     org.apache.derby.jdbc.InternalDriver.connect():-1
>     org.apache.derby.jdbc.Driver20.connect():-1
>     org.apache.derby.jdbc.AutoloadedDriver.connect():-1
>     java.sql.DriverManager.getConnection():664
>     java.sql.DriverManager.getConnection():208
>     com.jolbox.bonecp.BoneCP.obtainRawInternalConnection():361
>     com.jolbox.bonecp.BoneCP.<init>():416
>     com.jolbox.bonecp.BoneCPDataSource.getConnection():120
>     org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection():483
>     org.datanucleus.store.rdbms.RDBMSStoreManager.<init>():297
>     sun.reflect.GeneratedConstructorAccessor111.newInstance():-1
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension():606
>     org.datanucleus.plugin.PluginManager.createExecutableExtension():301
>     org.datanucleus.NucleusContextHelper.createStoreManagerForProperties():133
>     org.datanucleus.PersistenceNucleusContextImpl.initialise():422
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration():817
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory():334
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory():213
>     sun.reflect.GeneratedMethodAccessor47.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     javax.jdo.JDOHelper$16.run():1965
>     java.security.AccessController.doPrivileged():-2
>     javax.jdo.JDOHelper.invoke():1960
>     javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation():1166
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():808
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():701
>     org.apache.hadoop.hive.metastore.ObjectStore.getPMF():519
>     org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager():548
>     org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper():403
>     org.apache.hadoop.hive.metastore.ObjectStore.initialize():340
>     org.apache.hadoop.hive.metastore.ObjectStore.setConf():301
>     org.apache.hadoop.util.ReflectionUtils.setConf():76
>     org.apache.hadoop.util.ReflectionUtils.newInstance():136
>     org.apache.hadoop.hive.metastore.RawStoreProxy.<init>():58
>     org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy():67
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf():624
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf():590
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS():584
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB():655
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init():427
>     sun.reflect.GeneratedMethodAccessor48.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal():148
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke():107
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>():79
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy():92
>     org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler():6900
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():164
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():129
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>():54
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching():101
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():76
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (org.apache.derby.impl.jdbc.EmbedSQLException) Failed to create database '../sample-data/drill_hive_db', see the next exception for details.
>     org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException():-1
>     org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA():-1
>     org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException():-1
>     org.apache.derby.impl.jdbc.Util.newEmbedSQLException():-1
>     org.apache.derby.impl.jdbc.Util.seeNextException():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.createDatabase():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.<init>():-1
>     org.apache.derby.impl.jdbc.EmbedConnection40.<init>():-1
>     org.apache.derby.jdbc.Driver40.getNewEmbedConnection():-1
>     org.apache.derby.jdbc.InternalDriver.connect():-1
>     org.apache.derby.jdbc.Driver20.connect():-1
>     org.apache.derby.jdbc.AutoloadedDriver.connect():-1
>     java.sql.DriverManager.getConnection():664
>     java.sql.DriverManager.getConnection():208
>     com.jolbox.bonecp.BoneCP.obtainRawInternalConnection():361
>     com.jolbox.bonecp.BoneCP.<init>():416
>     com.jolbox.bonecp.BoneCPDataSource.getConnection():120
>     org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection():483
>     org.datanucleus.store.rdbms.RDBMSStoreManager.<init>():297
>     sun.reflect.GeneratedConstructorAccessor111.newInstance():-1
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension():606
>     org.datanucleus.plugin.PluginManager.createExecutableExtension():301
>     org.datanucleus.NucleusContextHelper.createStoreManagerForProperties():133
>     org.datanucleus.PersistenceNucleusContextImpl.initialise():422
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration():817
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory():334
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory():213
>     sun.reflect.GeneratedMethodAccessor47.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     javax.jdo.JDOHelper$16.run():1965
>     java.security.AccessController.doPrivileged():-2
>     javax.jdo.JDOHelper.invoke():1960
>     javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation():1166
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():808
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():701
>     org.apache.hadoop.hive.metastore.ObjectStore.getPMF():519
>     org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager():548
>     org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper():403
>     org.apache.hadoop.hive.metastore.ObjectStore.initialize():340
>     org.apache.hadoop.hive.metastore.ObjectStore.setConf():301
>     org.apache.hadoop.util.ReflectionUtils.setConf():76
>     org.apache.hadoop.util.ReflectionUtils.newInstance():136
>     org.apache.hadoop.hive.metastore.RawStoreProxy.<init>():58
>     org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy():67
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf():624
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf():590
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS():584
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB():655
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init():427
>     sun.reflect.GeneratedMethodAccessor48.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal():148
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke():107
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>():79
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy():92
>     org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler():6900
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():164
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():129
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>():54
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching():101
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():76
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (org.apache.derby.impl.jdbc.EmbedSQLException) Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
>     org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException():-1
>     org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA():-1
>     org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException():-1
>     org.apache.derby.impl.jdbc.Util.generateCsSQLException():-1
>     org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException():-1
>     org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.handleException():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.createDatabase():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.<init>():-1
>     org.apache.derby.impl.jdbc.EmbedConnection40.<init>():-1
>     org.apache.derby.jdbc.Driver40.getNewEmbedConnection():-1
>     org.apache.derby.jdbc.InternalDriver.connect():-1
>     org.apache.derby.jdbc.Driver20.connect():-1
>     org.apache.derby.jdbc.AutoloadedDriver.connect():-1
>     java.sql.DriverManager.getConnection():664
>     java.sql.DriverManager.getConnection():208
>     com.jolbox.bonecp.BoneCP.obtainRawInternalConnection():361
>     com.jolbox.bonecp.BoneCP.<init>():416
>     com.jolbox.bonecp.BoneCPDataSource.getConnection():120
>     org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection():483
>     org.datanucleus.store.rdbms.RDBMSStoreManager.<init>():297
>     sun.reflect.GeneratedConstructorAccessor111.newInstance():-1
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension():606
>     org.datanucleus.plugin.PluginManager.createExecutableExtension():301
>     org.datanucleus.NucleusContextHelper.createStoreManagerForProperties():133
>     org.datanucleus.PersistenceNucleusContextImpl.initialise():422
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration():817
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory():334
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory():213
>     sun.reflect.GeneratedMethodAccessor47.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     javax.jdo.JDOHelper$16.run():1965
>     java.security.AccessController.doPrivileged():-2
>     javax.jdo.JDOHelper.invoke():1960
>     javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation():1166
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():808
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():701
>     org.apache.hadoop.hive.metastore.ObjectStore.getPMF():519
>     org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager():548
>     org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper():403
>     org.apache.hadoop.hive.metastore.ObjectStore.initialize():340
>     org.apache.hadoop.hive.metastore.ObjectStore.setConf():301
>     org.apache.hadoop.util.ReflectionUtils.setConf():76
>     org.apache.hadoop.util.ReflectionUtils.newInstance():136
>     org.apache.hadoop.hive.metastore.RawStoreProxy.<init>():58
>     org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy():67
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf():624
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf():590
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS():584
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB():655
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init():427
>     sun.reflect.GeneratedMethodAccessor48.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal():148
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke():107
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>():79
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy():92
>     org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler():6900
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():164
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():129
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>():54
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching():101
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():76
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (org.apache.derby.iapi.error.StandardException) Directory /opt/mapr/drill/sample-data/drill_hive_db cannot be created.
>     org.apache.derby.iapi.error.StandardException.newException():-1
>     org.apache.derby.impl.services.monitor.StorageFactoryService$10.run():-1
>     java.security.AccessController.doPrivileged():-2
>     org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot():-1
>     org.apache.derby.impl.services.monitor.BaseMonitor.bootService():-1
>     org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService():-1
>     org.apache.derby.iapi.services.monitor.Monitor.createPersistentService():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.createDatabase():-1
>     org.apache.derby.impl.jdbc.EmbedConnection.<init>():-1
>     org.apache.derby.impl.jdbc.EmbedConnection40.<init>():-1
>     org.apache.derby.jdbc.Driver40.getNewEmbedConnection():-1
>     org.apache.derby.jdbc.InternalDriver.connect():-1
>     org.apache.derby.jdbc.Driver20.connect():-1
>     org.apache.derby.jdbc.AutoloadedDriver.connect():-1
>     java.sql.DriverManager.getConnection():664
>     java.sql.DriverManager.getConnection():208
>     com.jolbox.bonecp.BoneCP.obtainRawInternalConnection():361
>     com.jolbox.bonecp.BoneCP.<init>():416
>     com.jolbox.bonecp.BoneCPDataSource.getConnection():120
>     org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection():483
>     org.datanucleus.store.rdbms.RDBMSStoreManager.<init>():297
>     sun.reflect.GeneratedConstructorAccessor111.newInstance():-1
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension():606
>     org.datanucleus.plugin.PluginManager.createExecutableExtension():301
>     org.datanucleus.NucleusContextHelper.createStoreManagerForProperties():133
>     org.datanucleus.PersistenceNucleusContextImpl.initialise():422
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration():817
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory():334
>     org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory():213
>     sun.reflect.GeneratedMethodAccessor47.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     javax.jdo.JDOHelper$16.run():1965
>     java.security.AccessController.doPrivileged():-2
>     javax.jdo.JDOHelper.invoke():1960
>     javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation():1166
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():808
>     javax.jdo.JDOHelper.getPersistenceManagerFactory():701
>     org.apache.hadoop.hive.metastore.ObjectStore.getPMF():519
>     org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager():548
>     org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper():403
>     org.apache.hadoop.hive.metastore.ObjectStore.initialize():340
>     org.apache.hadoop.hive.metastore.ObjectStore.setConf():301
>     org.apache.hadoop.util.ReflectionUtils.setConf():76
>     org.apache.hadoop.util.ReflectionUtils.newInstance():136
>     org.apache.hadoop.hive.metastore.RawStoreProxy.<init>():58
>     org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy():67
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf():624
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf():590
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS():584
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB():655
>     org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init():427
>     sun.reflect.GeneratedMethodAccessor48.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal():148
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke():107
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>():79
>     org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy():92
>     org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler():6900
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():164
>     org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>():129
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClient.<init>():54
>     org.apache.drill.exec.store.hive.client.DrillHiveMetaStoreClientFactory.createCloseableClientWithCaching():101
>     org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.<init>():76
>     org.apache.drill.exec.store.hive.HiveStoragePlugin.<init>():77
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():62
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():423
>     org.apache.drill.exec.store.ClassicConnectorLocator.create():274
>     org.apache.drill.exec.store.ConnectorHandle.newInstance():98
>     org.apache.drill.exec.store.PluginHandle.plugin():143
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():616
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$PluginIterator.next():601
>     org.apache.drill.exec.planner.sql.handlers.SqlHandlerConfig.getRules():48
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():367
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():351
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform():338
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRel():663
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert():198
>     org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan():169
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():283
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan():163
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan():140
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():93
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():590
>     org.apache.drill.exec.work.foreman.Foreman.run():275
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
> {noformat}
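> The root cause visible at the bottom of the chain is Derby failing to create its database directory: the default plugin config points the embedded metastore at the relative path {{../sample-data/drill_hive_db}}, which resolves under the Drill installation directory ({{/opt/mapr/drill/sample-data/drill_hive_db}} here) and is typically not writable by the Drill process. As a possible workaround until the default is fixed, pointing {{javax.jdo.option.ConnectionURL}} at an absolute, writable location should avoid the failure. A sketch of the edited plugin config, based on the default template implied by the trace ({{/tmp/drill_hive_db}} is just an example path, not a recommendation):
> {code:json}
> {
>   "type": "hive",
>   "configProps": {
>     "hive.metastore.uris": "",
>     "javax.jdo.option.ConnectionURL": "jdbc:derby:;databaseName=/tmp/drill_hive_db;create=true",
>     "hive.metastore.warehouse.dir": "/tmp/drill_hive_wh",
>     "fs.default.name": "file:///",
>     "hive.metastore.sasl.enabled": "false"
>   },
>   "enabled": true
> }
> {code}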



--
This message was sent by Atlassian Jira
(v8.3.4#803005)