Posted to common-dev@hadoop.apache.org by "Raghu Angadi (JIRA)" <ji...@apache.org> on 2007/10/17 03:03:50 UTC
[jira] Commented: (HADOOP-2067) multiple close() failing in Hadoop 0.14
[ https://issues.apache.org/jira/browse/HADOOP-2067?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12535386 ]
Raghu Angadi commented on HADOOP-2067:
--------------------------------------
In 0.13.x, multiple close() calls were allowed unintentionally, I think: the second close was hidden from FSInputStream by the BufferedInputStream in between.
Many Java streams tolerate being closed more than once; maybe Hadoop should too.
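A minimal sketch (not Hadoop's actual code, and the class name is made up for illustration) of what tolerating repeated close() could look like: close() becomes idempotent, ignoring calls after the first, while reads on a closed stream still fail:

```java
import java.io.IOException;
import java.io.InputStream;

// Hypothetical stream with an idempotent close(), matching the leniency
// of many java.io streams such as BufferedInputStream.
public class IdempotentCloseStream extends InputStream {
    private boolean closed = false;

    @Override
    public int read() throws IOException {
        if (closed) {
            // Reading after close is still an error.
            throw new IOException("Stream closed");
        }
        return -1; // placeholder; a real stream would return data
    }

    @Override
    public void close() throws IOException {
        if (closed) {
            return; // second and later close() calls are silently ignored
        }
        closed = true;
        // release underlying resources here
    }
}
```

With this shape, the second close() from the reporter's use case would be a no-op instead of an IOException.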
> multiple close() failing in Hadoop 0.14
> ---------------------------------------
>
> Key: HADOOP-2067
> URL: https://issues.apache.org/jira/browse/HADOOP-2067
> Project: Hadoop
> Issue Type: Bug
> Components: dfs
> Reporter: lohit vijayarenu
> Attachments: stack_trace_13_and_14.txt
>
>
> It looks like multiple close() calls while reading files from DFS are failing in Hadoop 0.14. This was somehow not caught in Hadoop 0.13.
> The use case was to open a file on DFS as shown below:
> <code>
> FSDataInputStream fSDataInputStream = fileSystem.open(new Path(propertyFileName));
> Properties subProperties = new Properties();
> subProperties.loadFromXML(fSDataInputStream);
> fSDataInputStream.close();
> </code>
> This failed with an IOException:
> <exception>
> java.io.IOException: Stream closed
> </exception>
> The stack trace shows the stream is being closed twice. This used to work in Hadoop 0.13, which hid the second close.
> Attached to this JIRA is a text file with the stack traces for both Hadoop 0.13 and Hadoop 0.14.
> How should this be handled from a user's point of view?
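For what it's worth, the Javadoc for java.util.Properties.loadFromXML states that the supplied stream is closed after the method returns, which explains why the explicit close() in the snippet above is the second close. A user-side workaround (a sketch under that assumption, not an official recommendation) is simply to drop the explicit close():

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.util.Properties;

public class LoadXmlExample {
    public static void main(String[] args) throws Exception {
        // Inline XML stands in for the DFS file in the reporter's use case.
        String xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
            + "<!DOCTYPE properties SYSTEM \"http://java.sun.com/dtd/properties.dtd\">"
            + "<properties><entry key=\"name\">value</entry></properties>";
        InputStream in = new ByteArrayInputStream(xml.getBytes("UTF-8"));
        Properties props = new Properties();
        // loadFromXML closes the stream after it returns (per its Javadoc),
        // so no explicit close() is needed here; calling close() again is
        // what triggers the "Stream closed" IOException on 0.14 streams.
        props.loadFromXML(in);
        System.out.println(props.getProperty("name"));
    }
}
```

This sidesteps the double close, though it arguably papers over the underlying question of whether Hadoop's streams should tolerate it.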
> Thanks
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.