Posted to common-user@hadoop.apache.org by yaoxiaohua <ya...@outlook.com> on 2015/12/16 08:16:06 UTC

hadoop datanode read or write block error

Hi,
This is my first email asking a question about Hadoop.
Hadoop version: 2.3
JDK: IBM JDK 1.7
Issues:
I found a lot of errors like this in the DataNode process log:

2015-12-16 14:54:28,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-426197605-10.19.206.101-1406809206259:blk_1251972645_178255413 src: /172.19.206.118:58220 dest: /172.19.206.142:50011
2015-12-16 14:54:31,898 WARN org.apache.hadoop.util.Shell: Could not get disk usage information
java.io.IOException: Expecting a line not the end of stream
        at org.apache.hadoop.fs.DU.parseExecResult(DU.java:233)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:487)
        at org.apache.hadoop.util.Shell.run(Shell.java:418)
        at org.apache.hadoop.fs.DU.run(DU.java:190)
        at org.apache.hadoop.fs.DU$DURefreshThread.run(DU.java:119)
        at java.lang.Thread.run(Thread.java:809)
2015-12-16 14:54:32,189 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.io.IOException: Expecting a line not the end of stream
        at org.apache.hadoop.fs.DU.parseExecResult(DU.java:233)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:487)
        at org.apache.hadoop.util.Shell.run(Shell.java:418)
        at org.apache.hadoop.fs.DU.run(DU.java:190)
        at org.apache.hadoop.fs.DU$DURefreshThread.run(DU.java:119)
        at java.lang.Thread.run(Thread.java:809)
ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: px42pub:50011:DataXceiver error processing WRITE_BLOCK operation
 src: /172.19.206.138:47717 dest: /172.19.206.142:50011
java.lang.IllegalStateException: Current state = RESET, new state = FLUSHED
        at java.nio.charset.CharsetEncoder.throwIllegalStateException(CharsetEncoder.java:968)
        at java.nio.charset.CharsetEncoder.flush(CharsetEncoder.java:657)
        at java.nio.charset.CharsetEncoder.encode(CharsetEncoder.java:786)
        at org.apache.hadoop.io.Text.encode(Text.java:443)
        at org.apache.hadoop.io.Text.set(Text.java:198)
        at org.apache.hadoop.io.Text.<init>(Text.java:88)
        at org.apache.hadoop.hdfs.protocolPB.PBHelper.convert(PBHelper.java:714)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:124)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:229)
        at java.lang.Thread.run(Thread.java:809)

 

Could you give me any suggestions about this?

Thanks.

Best Regards,
Evan Yao

 


Re: hadoop datanode read or write block error

Posted by Chris Nauroth <cn...@hortonworks.com>.
Hello Evan,

It looks like there are two separate problems.

The errors that mention "disk usage" indicate that the DataNode forked to call "du -sk", but the command didn't return any output.  I recommend checking that this machine has du available on the path and that the du command is working as expected.
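
For a concrete picture of what the DataNode is doing here, below is a minimal sketch of the same check, assuming a Unix-like host.  It is not Hadoop's actual org.apache.hadoop.fs.DU class (the one in your stack trace), just an illustration of the same fork-and-parse-first-line pattern; the DuCheck class name and the fallback to the current directory are invented for the example.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

// Sketch of the disk-usage check: fork "du -sk <dir>" and parse the
// first line of its output. If du prints nothing at all, readLine()
// returns null, which is the "Expecting a line not the end of stream"
// condition the DataNode is logging.
public class DuCheck {
    public static void main(String[] args) throws IOException, InterruptedException {
        String dir = args.length > 0 ? args[0] : ".";
        Process p = new ProcessBuilder("du", "-sk", dir).start();
        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line = out.readLine();
            if (line == null) {
                // No output from du: same failure mode as the log warning.
                throw new IOException("Expecting a line not the end of stream");
            }
            // du -sk prints "<kilobytes><whitespace><path>"; take the first token.
            long usedKb = Long.parseLong(line.split("\\s+")[0]);
            System.out.println("du reports " + usedKb + " KB used under " + dir);
        }
        int rc = p.waitFor();
        if (rc != 0) {
            System.err.println("du exited with status " + rc);
        }
    }
}

Compiling this and running it as the same user the DataNode runs as, against one of your dfs.datanode.data.dir directories, should tell you quickly whether du is present and behaving on that machine.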

The error about the WRITE_BLOCK operation indicates a failure to fully deserialize a client request to write a block.  Interestingly, it was able to deserialize enough of the message to identify it as a WRITE_BLOCK operation, but then failed later while parsing the remaining payload.  Is it possible that the client that is trying to perform the write is running a very old version of the HDFS client code?
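
For background on the IllegalStateException itself: it comes from the state machine in java.nio.charset.CharsetEncoder, which only allows reset -> encode(..., endOfInput=true) -> flush, in that order.  Here is a minimal sketch that reproduces the same RESET-to-FLUSHED transition error (EncoderStates is just an illustrative name, and the exact exception message text can vary by JDK):

import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.CharsetEncoder;
import java.nio.charset.StandardCharsets;

// Demonstrates the CharsetEncoder state machine behind
// "Current state = RESET, new state = FLUSHED".
public class EncoderStates {
    public static void main(String[] args) {
        CharsetEncoder enc = StandardCharsets.UTF_8.newEncoder();

        // Legal order: encode the full input (endOfInput = true), then flush.
        ByteBuffer out = ByteBuffer.allocate(64);
        enc.encode(CharBuffer.wrap("ok"), out, true);
        enc.flush(out);
        System.out.println("legal sequence encoded " + out.position() + " bytes");

        // Illegal order: reset() returns the encoder to the RESET state,
        // and flushing directly from RESET is a forbidden transition, so
        // this throws IllegalStateException.
        enc.reset();
        try {
            enc.flush(ByteBuffer.allocate(64));
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e);
        }
    }
}

Note that your stack trace shows the failure inside the one-shot CharsetEncoder.encode(CharBuffer) method, which performs the reset/encode/flush sequence itself, so reaching flush while still in the RESET state there is unusual; that is part of why an incompatible client, or possibly a JDK-specific quirk (IBM JDK 1.7 in your case), is worth ruling out.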

--Chris Nauroth


