Posted to user@hive.apache.org by Ibrar Ahmed <ib...@gmail.com> on 2015/05/13 20:58:09 UTC

Hive/Hbase Integration issue

Hi,

I am creating a table using Hive and am getting this error:

[127.0.0.1:10000] hive> CREATE TABLE hbase_table_1(key int, value string)
                      > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
                      > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
                      > TBLPROPERTIES ("hbase.table.name" = "xyz");



[Hive Error]: Query returned non-zero code: 1, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:147)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:56)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:288)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:267)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:139)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:134)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:823)
    at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:601)
    at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:365)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:281)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:291)
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:554)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy7.createTable(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4194)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
    at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
    at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
)


Any help or clue would be appreciated.
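
(Not part of the original thread.) The "Can't get the locations" error means the Hive client could not look up HBase's meta regions, which starts with reaching the ZooKeeper quorum. A quick way to see whether anything is even listening there is ZooKeeper's four-letter-word interface; the host and port below are assumptions for a local setup:

```python
import socket

def zk_four_letter(host, port, cmd=b"ruok", timeout=2.0):
    """Send a ZooKeeper 'four letter word' command; return the reply, or None if unreachable."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(cmd)
            sock.shutdown(socket.SHUT_WR)
            reply = sock.recv(512)
            return reply.decode() if reply else None
    except OSError:
        # Connection refused / timed out: nothing (healthy) is listening there.
        return None

# A healthy ZooKeeper answers "imok" to "ruok"; None means it is unreachable.
print(zk_four_letter("localhost", 2181))
```

If this returns None for the quorum host/port your HBase config points at, the problem is below Hive entirely.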

Re: Hive/Hbase Integration issue

Posted by Ibrar Ahmed <ib...@gmail.com>.
My HBase is working fine now, but I am still getting the same error:


[127.0.0.1:10000] hive> CREATE TABLE hbase_table_1(key int, value string)
                      > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
                      > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
                      > TBLPROPERTIES ("hbase.table.name" = "xyz");



[Hive Error]: Query returned non-zero code: 1, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)

On Thu, May 14, 2015 at 1:18 AM, Ibrar Ahmed <ib...@gmail.com> wrote:

> It seems you are right. Sometimes I get this error while running the
> hbase shell command.
>
>
> ibrar@ibrar-virtual-machine:/usr/local/hbase/bin$ ./hbase shell
>
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/usr/local/hbase/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 2015-05-14 01:14:27,063 WARN  [main] util.NativeCodeLoader: Unable to load
> native-hadoop library for your platform... using builtin-java classes where
> applicable
> 2015-05-14 01:14:43,982 ERROR [main] zookeeper.RecoverableZooKeeper:
> ZooKeeper exists failed after 4 attempts
> 2015-05-14 01:14:43,983 WARN  [main] zookeeper.ZKUtil:
> hconnection-0x4d980c0x0, quorum=localhost:2181, baseZNode=/hbase Unable to
> set watcher on znode (/hbase/hbaseid)
> org.apache.zookeeper.KeeperException$ConnectionLossException:
> KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
>
>
> On Thu, May 14, 2015 at 1:11 AM, kulkarni.swarnim@gmail.com <
> kulkarni.swarnim@gmail.com> wrote:
>
>> Ibrar,
>>
>> This seems to be an issue with the cluster rather than the integration
>> itself. Can you make sure that HBase is happy and healthy and all RS are up
>> and running?
>>
>> --
>> Swarnim
>>
>
>
>
> --
>
>


-- 
Ibrar Ahmed

Re: Hive/Hbase Integration issue

Posted by Ibrar Ahmed <ib...@gmail.com>.
It seems you are right. Sometimes I get this error while running the hbase
shell command.


ibrar@ibrar-virtual-machine:/usr/local/hbase/bin$ ./hbase shell

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hbase/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2015-05-14 01:14:27,063 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-05-14 01:14:43,982 ERROR [main] zookeeper.RecoverableZooKeeper: ZooKeeper exists failed after 4 attempts
2015-05-14 01:14:43,983 WARN  [main] zookeeper.ZKUtil: hconnection-0x4d980c0x0, quorum=localhost:2181, baseZNode=/hbase Unable to set watcher on znode (/hbase/hbaseid)
org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
    at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1045)
    at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:222)
    at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:481)
    at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:86)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:833)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:623)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:450)
    at org.jruby.javasupport.JavaMethod.invokeStaticDirect(JavaMethod.java:362)
    at org.jruby.java.invokers.StaticMethodInvoker.call(StaticMethodInvoker.java:58)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:312)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:169)
    at org.jruby.ast.CallOneArgNode.interpret(CallOneArgNode.java:57)
    at org.jruby.ast.InstAsgnNode.interpret(InstAsgnNode.java:95)
    at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
    at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
    at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
    at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:169)
    at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:191)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:302)
    at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:144)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:148)
    at org.jruby.RubyClass.newInstance(RubyClass.java:822)
    at org.jruby.RubyClass$i$newInstance.call(RubyClass$i$newInstance.gen:65535)
    at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroOrNBlock.call(JavaMethod.java:249)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:292)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:135)
    at usr.local.hbase.bin.$_dot_dot_.bin.hirb.__file__(/usr/local/hbase/bin/../bin/hirb.rb:118)
    at usr.local.hbase.bin.$_dot_dot_.bin.hirb.load(/usr/local/hbase/bin/../bin/hirb.rb)
    at org.jruby.Ruby.runScript(Ruby.java:697)
    at org.jruby.Ruby.runScript(Ruby.java:690)
    at org.jruby.Ruby.runNormally(Ruby.java:597)
    at org.jruby.Ruby.runFromMain(Ruby.java:446)
    at org.jruby.Main.doRunFromMain(Main.java:369)
    at org.jruby.Main.internalRun(Main.java:258)
    at org.jruby.Main.run(Main.java:224)
    at org.jruby.Main.run(Main.java:208)
    at org.jruby.Main.main(Main.java:188)
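
(Not part of the original thread.) The ConnectionLoss for /hbase/hbaseid with quorum=localhost:2181 means no ZooKeeper was answering on that address. For a local pseudo-distributed install where HBase manages its own ZooKeeper, a minimal hbase-site.xml might look like this; the rootdir URL and port values are illustrative, not taken from the thread:

```xml
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>localhost</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
</configuration>
```

The Hive client must see the same quorum settings (e.g. the same hbase-site.xml on its classpath), or it will retry against the wrong address until RetriesExhaustedException.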


On Thu, May 14, 2015 at 1:11 AM, kulkarni.swarnim@gmail.com <
kulkarni.swarnim@gmail.com> wrote:

> Ibrar,
>
> This seems to be an issue with the cluster rather than the integration
> itself. Can you make sure that HBase is happy and healthy and all RS are up
> and running?
>
>
> --
> Swarnim
>



--

Re: Hive/Hbase Integration issue

Posted by "kulkarni.swarnim@gmail.com" <ku...@gmail.com>.
Ibrar,

This seems to be an issue with the cluster rather than the integration
itself. Can you make sure that HBase is happy and healthy and that all
region servers (RS) are up and running?
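
(Not part of the original thread.) One way to check "all RS are up" on a single node is the daemon list from `jps`. Below is a small hypothetical helper that flags missing HBase daemons; the daemon names assume HBase 1.x managing its own ZooKeeper (HQuorumPeer), so adjust for your deployment:

```python
# Daemons expected on a single-node HBase install; HQuorumPeer appears only
# when HBase manages its own ZooKeeper (otherwise look for QuorumPeerMain).
REQUIRED = ("HMaster", "HRegionServer", "HQuorumPeer")

def missing_daemons(jps_output, required=REQUIRED):
    """Given the text printed by `jps`, return the required daemons not running."""
    running = set()
    for line in jps_output.splitlines():
        parts = line.split(maxsplit=1)  # "<pid> <MainClass>"
        if len(parts) == 2:
            running.add(parts[1])
    return [name for name in required if name not in running]

# Usage against a live JVM list (requires a JDK):
#   import subprocess
#   print(missing_daemons(subprocess.check_output(["jps"], text=True)))
```

An empty list means every expected daemon showed up in the `jps` output.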

On Wed, May 13, 2015 at 1:58 PM, Ibrar Ahmed <ib...@gmail.com> wrote:

> Hi,
>
> I am creating a table using hive and getting this error.
>
> [127.0.0.1:10000] hive> CREATE TABLE hbase_table_1(key int, value string)
>                       > STORED BY
> 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>                       > WITH SERDEPROPERTIES ("hbase.columns.mapping" =
> ":key,cf1:val")
>                       > TBLPROPERTIES ("hbase.table.name" = "xyz");
>
>
>
> [Hive Error]: Query returned non-zero code: 1, cause: FAILED: Execution
> Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
> MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException:
> Can't get the locations
>
>
> Any help/clue can help.
>
>


-- 
Swarnim