Posted to dev@trafodion.apache.org by "Liu, Yuan (Yuan)" <yu...@esgyn.cn> on 2018/04/20 10:18:07 UTC
Load data from Hive table with only 1 replica got error
Hi Trafodioneers,
When I loaded data from a Hive table (with only 1 replica) into Trafodion, I got the error below:
>>load into TEST_LOAD_PERFORMANCE select * from hive.hive.TEST_LOAD_PERFORMANCE;
Task: LOAD Status: Started Object: TRAFODION.SEABASE.TEST_LOAD_PERFORMANCE
Task: CLEANUP Status: Started Time: 2018-04-20 17:00:30.694053
Task: CLEANUP Status: Ended Time: 2018-04-20 17:00:30.709068
Task: CLEANUP Status: Ended Elapsed Time: 00:00:00.015
Task: LOADING DATA Status: Started Time: 2018-04-20 17:00:30.709099
*** ERROR[8448] Unable to access Hbase interface. Call to ExpHbaseInterface::addToHFile returned error HBASE_ADD_TO_HFILE_ERROR(-713). Cause: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.NotReplicatedYetException): Not replicated yet: /user/trafodion/bulkload/TRAFODION.SEABASE.TEST_LOAD_PERFORMANCE/#1/hfile17_1524217739743
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.analyzeFileState(FSNamesystem.java:3467)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3256)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:677)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.addBlock(AuthorizationProviderProxyClientProtocol.java:213)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:485)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
Does this mean LOAD does not support loading data from a table with only 1 replica? I have 3 data nodes.
Best regards,
Yuan
RE: Load data from Hive table with only 1 replica got error
Posted by "Liu, Ming (Ming)" <mi...@esgyn.cn>.
There should be no such limitation on the Trafodion side.
For example, in Trafodion's development environment one can install a local Hadoop instance via a script; that instance is a single-node Hadoop with a replication factor of 1, and the bulk loader still works well there.
So this is probably due to some other issue.
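The NotReplicatedYetException in the trace is typically transient: the HDFS client asked the NameNode to allocate the next block before the previous block had reached its minimal replication, and the client's retries ran out. As a hedged sketch (not a confirmed fix for this thread, values are illustrative), these hdfs-site.xml settings are worth checking:

```xml
<!-- hdfs-site.xml (illustrative values, not from the original thread) -->
<!-- Replication factor for new files; 1 is legal on a single-node setup -->
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
<!-- How many times the client retries when the NameNode answers
     "Not replicated yet" while allocating the next block; the default
     is 5, and raising it can ride out slow or briefly overloaded
     DataNodes during a heavy bulk load -->
<property>
  <name>dfs.client.block.write.locateFollowingBlock.retries</name>
  <value>10</value>
</property>
```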
Ming
From: Liu, Yuan (Yuan) <yu...@esgyn.cn>
Sent: Friday, April 20, 2018 6:18 PM
To: dev@trafodion.apache.org; user@trafodion.apache.org
Subject: Load data from Hive table with only 1 replica got error