Posted to hdfs-dev@hadoop.apache.org by "wenbin lee (Jira)" <ji...@apache.org> on 2020/06/10 06:17:00 UTC
[jira] [Created] (HDFS-15405) DataXceiver error processing
READ_BLOCK operation src: /10.10.10.87:37424 dst: /10.10.10.87:50010
wenbin lee created HDFS-15405:
---------------------------------
Summary: DataXceiver error processing READ_BLOCK operation src: /10.10.10.87:37424 dst: /10.10.10.87:50010
Key: HDFS-15405
URL: https://issues.apache.org/jira/browse/HDFS-15405
Project: Hadoop HDFS
Issue Type: Bug
Components: datanode
Affects Versions: 2.5.0
Reporter: wenbin lee
After the datanode service was restarted, the datanode log file filled with many errors like the following:
ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: S10-870.server.baihe:50010:DataXceiver error processing READ_BLOCK operation src: /10.10.10.87:37424 dst: /10.10.10.87:50010
java.io.IOException: Replica gen stamp < block genstamp, block=BP-1354516653-10.10.10.33-1532503068514:blk_1080284948_6544482, replica=ReplicaWaitingToBeRecovered, blk_1080284948_6544202, RWR
  getNumBytes()     = 6127077
  getBytesOnDisk()  = 6127077
  getVisibleLength()= -1
  getVolume()       = /home/disk3/dfs/dn/current
  getBlockFile()    = /home/disk3/dfs/dn/current/BP-1354516653-10.10.10.33-1532503068514/current/rbw/blk_1080284948
  unlinked=false
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:240)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
    at java.lang.Thread.run(Thread.java:745)
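For context: the block name encodes a block ID and a generation stamp (blk_<blockId>_<genStamp>), so the client here asked for genstamp 6544482 while the on-disk replica (in RWR / ReplicaWaitingToBeRecovered state) still carries the older genstamp 6544202, and BlockSender refuses to serve the stale replica. A minimal sketch of that comparison, with illustrative names and structure (not the actual Hadoop source), is:

```java
import java.io.IOException;

// Hypothetical simplification of the check in BlockSender's constructor
// that produces the "Replica gen stamp < block genstamp" error above.
public class GenStampCheck {

    // blockGenStamp: generation stamp the client requested.
    // replicaGenStamp: generation stamp recorded for the on-disk replica.
    static void validate(long blockGenStamp, long replicaGenStamp)
            throws IOException {
        if (replicaGenStamp < blockGenStamp) {
            // The replica is stale (e.g. genstamp 6544202 on disk but
            // 6544482 requested), so the read is rejected.
            throw new IOException("Replica gen stamp < block genstamp,"
                    + " block genstamp=" + blockGenStamp
                    + ", replica genstamp=" + replicaGenStamp);
        }
    }
}
```

With the values from this log, validate(6544482L, 6544202L) throws, while a replica whose genstamp matches the requested block passes the check.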
--
This message was sent by Atlassian Jira
(v8.3.4#803005)