Posted to common-user@hadoop.apache.org by Phantom <gh...@gmail.com> on 2007/06/22 03:27:54 UTC

Remote writes into HDFS

Hi

Is it possible to keep the file system open for, say, an hour, continuously
write into it, and then close it? I consistently get the same error when I
try to close the file handle, even though I can see all the writes and
flushes happening without any problems.
This is the exception I get when I try to close the file. I am writing into
HDFS from a remote machine. What am I doing wrong?

Exception in thread "main" java.io.IOException: Filesystem closed
        at org.apache.hadoop.dfs.DFSClient.checkOpen(DFSClient.java:168)
        at org.apache.hadoop.dfs.DFSClient.access$200(DFSClient.java:48)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.write(DFSClient.java:1245)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:38)
        at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
        at java.io.DataOutputStream.write(DataOutputStream.java:90)
        at org.apache.hadoop.fs.ChecksumFileSystem$FSOutputSummer.write(ChecksumFileSystem.java:402)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:38)
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
        at java.io.DataOutputStream.flush(DataOutputStream.java:106)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:91)
Call to org/apache/hadoop/fs/FSDataOutputStream::close failed!
[Thu Jun 21 18:24:42 2007] "File closed -1"
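For what it's worth, the trace suggests that close() is failing while flushing
buffered bytes into a DFSClient that has already been closed (checkOpen is what
throws "Filesystem closed"). The same failure mode can be reproduced with plain
java.io streams, no Hadoop required; the ClosableSink class below is just a
hypothetical stand-in for the already-closed client:

```java
import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Reproduces the failure mode from the trace above: bytes sit in a
// buffer, the underlying sink gets closed, and then close() fails
// when it tries to flush the buffer into the closed sink.
public class ClosedSinkDemo {

    // Stand-in for DFSClient: rejects writes once closed.
    static class ClosableSink extends OutputStream {
        private boolean closed = false;
        private final ByteArrayOutputStream data = new ByteArrayOutputStream();

        @Override
        public void write(int b) throws IOException {
            if (closed) {
                // mirrors DFSClient.checkOpen()
                throw new IOException("Filesystem closed");
            }
            data.write(b);
        }

        @Override
        public void close() {
            closed = true;
        }
    }

    static String demo() {
        try {
            ClosableSink sink = new ClosableSink();
            BufferedOutputStream out = new BufferedOutputStream(sink);
            out.write("hello".getBytes()); // lands in the buffer, no I/O yet
            sink.close();                  // underlying "client" closed out from under us
            out.close();                   // close() flushes the buffer into the closed sink
            return "close succeeded";
        } catch (IOException e) {
            return "close failed: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

So if anything in the application (or another thread sharing the same
FileSystem instance) closes the file system before the output stream's
close() runs, the final flush would fail exactly like this.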

Thanks
Avinash