Posted to common-user@hadoop.apache.org by Phantom <gh...@gmail.com> on 2007/06/21 23:12:04 UTC

Native C HDFS API

Here is a problem that has been driving me insane; any help or pointer would be
greatly appreciated. I write into a file and everything works fine while I am
writing, but when I attempt to close the file by calling hdfsCloseFile() I
get a return value of 0, indicating a successful close, yet the underlying JNI
call fails with the following exception:

Exception in thread "main" java.io.IOException: Filesystem closed
        at org.apache.hadoop.dfs.DFSClient.checkOpen(DFSClient.java:168)
        at org.apache.hadoop.dfs.DFSClient.access$200(DFSClient.java:48)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.flush(DFSClient.java:1270)
        at java.io.FilterOutputStream.flush(FilterOutputStream.java:123)
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:124)
        at java.io.DataOutputStream.flush(DataOutputStream.java:106)
        at java.io.FilterOutputStream.flush(FilterOutputStream.java:123)
        at java.io.FilterOutputStream.flush(FilterOutputStream.java:123)
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:124)
        at java.io.DataOutputStream.flush(DataOutputStream.java:106)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:91)
Call to org/apache/hadoop/fs/FSDataOutputStream::close failed!
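
For reference, here is a minimal sketch of the sequence my code follows. The
hostname, port, and path are placeholders, and error handling is trimmed for
brevity:

#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include "hdfs.h"

int main(void) {
    /* Connect to the remote namenode (hostname/port are placeholders). */
    hdfsFS fs = hdfsConnect("namenode.example.com", 9000);
    if (!fs) {
        fprintf(stderr, "hdfsConnect failed\n");
        return 1;
    }

    /* Open a file for writing. */
    hdfsFile out = hdfsOpenFile(fs, "/tmp/test.txt", O_WRONLY | O_CREAT, 0, 0, 0);
    if (!out) {
        fprintf(stderr, "hdfsOpenFile failed\n");
        return 1;
    }

    /* Writes succeed without any error. */
    const char *msg = "hello, hdfs";
    hdfsWrite(fs, out, (void *)msg, strlen(msg));

    /* hdfsCloseFile() returns 0 here, yet the JNI layer reports
       "Filesystem closed" from FSDataOutputStream.close(). */
    int rc = hdfsCloseFile(fs, out);
    fprintf(stderr, "hdfsCloseFile returned %d\n", rc);

    hdfsDisconnect(fs);
    return 0;
}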

Could this be a permissions issue? I am running the client code as root, but
Hadoop is installed on a remote machine under a different userid. Could
someone please tell me what I am doing wrong?

Thanks
A