Posted to common-user@hadoop.apache.org by ba...@gmail.com on 2006/06/19 23:52:48 UTC

Can't close file copied to DFS

Hi All,

I recently upgraded from Hadoop 0.2 to Hadoop 0.3.2, and now I can't
get a file into the DFS. When the CRC file is uploaded (which happens
before the actual file is uploaded), closing the stream fails. The
error is 'java.io.IOException: failure closing block of file', caused
by a 'java.net.SocketTimeoutException: Read timed out'. I am not sure
why this would be happening: this is a test environment where
everything runs on a single local box.
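
To make the failure concrete, here is roughly what the upload looks
like on my side (a simplified sketch, not my exact code: the paths are
placeholders, and copyFromLocalFile stands in for however the file
actually gets copied):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class DfsUploadRepro {
        public static void main(String[] args) throws Exception {
            // Default configuration; NameNode, DataNode, and client all
            // run on the same local box in this test environment.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Copy a local file into the DFS; the checksummed file
            // system writes a companion .crc file before the data file
            // itself.
            fs.copyFromLocalFile(new Path("/tmp/sample.txt"),
                                 new Path("/user/test/sample.txt"));
        }
    }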

I can see through debugging that:

1) The FileSystem client asks the NameNode for a DataNode to write to
2) The file bytes are copied to the stream
3) CRC bytes are computed from the file bytes
4) The CRC bytes are copied to the stream
5) close() is called on the CRC stream and simply hangs for a long
   time, eventually resulting in the socket/IO exception above (see
   the sketch after this list)
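
Mapped onto the client API, the sequence above looks roughly like this
(again a sketch under my assumptions: the path is a placeholder, I use
the generic OutputStream type, and steps 3-4 happen inside the
checksummed FileSystem wrapper rather than in user code):

    import java.io.OutputStream;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class DfsWriteSteps {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Step 1: the FileSystem client asks the NameNode which
            // DataNode should receive the data.
            FileSystem fs = FileSystem.get(conf);
            OutputStream out =
                fs.create(new Path("/user/test/sample.txt"));

            // Step 2: the file bytes are copied to the stream.
            out.write("test payload".getBytes());

            // Steps 3-4: the checksummed file system computes CRC bytes
            // from the file bytes and writes them to a companion .crc
            // file.

            // Step 5: close() finalizes the write; this is the call
            // that hangs and eventually throws the
            // SocketTimeoutException.
            out.close();
        }
    }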

Any help would be much appreciated.