Posted to common-dev@hadoop.apache.org by "lohit vijayarenu (JIRA)" <ji...@apache.org> on 2007/10/17 02:52:50 UTC

[jira] Created: (HADOOP-2067) multiple close() failing in Hadoop 0.14

multiple close() failing in Hadoop 0.14
---------------------------------------

                 Key: HADOOP-2067
                 URL: https://issues.apache.org/jira/browse/HADOOP-2067
             Project: Hadoop
          Issue Type: Bug
          Components: dfs
            Reporter: lohit vijayarenu


It looks like calling close() more than once while reading files from DFS fails in Hadoop 0.14. This was somehow not caught in Hadoop 0.13.
The use case was to open a file on DFS as shown below:
<code>
FSDataInputStream fSDataInputStream =
    fileSystem.open(new Path(propertyFileName));
Properties subProperties = new Properties();
subProperties.loadFromXML(fSDataInputStream);
fSDataInputStream.close();
</code>

This failed with an IOException:
<exception>
EXCEPTION RAISED, which is java.io.IOException: Stream closed
java.io.IOException: Stream closed
</exception>

The stack trace shows the stream is being closed twice. This used to work in Hadoop 0.13, which hid the second close.
Attached with this JIRA is a text file which has stack trace for both hadoop 0.13 and hadoop 0.14.
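For context, java.util.Properties.loadFromXML is documented to close the stream it is given before returning, which explains the two close() calls in the trace: loadFromXML issues the first, the explicit close() above is the second. A minimal sketch with plain java.io (ByteArrayInputStream stands in for FSDataInputStream, which is why the second close is harmless here):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Properties;

public class LoadFromXmlClose {
    public static void main(String[] args) throws IOException {
        String xml =
            "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
            + "<!DOCTYPE properties SYSTEM \"http://java.sun.com/dtd/properties.dtd\">"
            + "<properties><entry key=\"name\">value</entry></properties>";
        ByteArrayInputStream in = new ByteArrayInputStream(xml.getBytes("UTF-8"));
        Properties props = new Properties();
        props.loadFromXML(in); // loadFromXML closes 'in' before returning
        in.close();            // second close: harmless on ByteArrayInputStream,
                               // but this is the call that fails on the DFS stream
        System.out.println(props.getProperty("name"));
    }
}
```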

How should this be handled from a user's point of view?

Thanks


-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


[jira] Updated: (HADOOP-2067) multiple close() failing in Hadoop 0.14

Posted by "Raghu Angadi (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HADOOP-2067?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Raghu Angadi updated HADOOP-2067:
---------------------------------

    Affects Version/s: 0.14.3



[jira] Updated: (HADOOP-2067) multiple close() failing in Hadoop 0.14

Posted by "lohit vijayarenu (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HADOOP-2067?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

lohit vijayarenu updated HADOOP-2067:
-------------------------------------

    Attachment: stack_trace_13_and_14.txt



[jira] Commented: (HADOOP-2067) multiple close() failing in Hadoop 0.14

Posted by "Raghu Angadi (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HADOOP-2067?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12535386 ] 

Raghu Angadi commented on HADOOP-2067:
--------------------------------------

In 0.13.x, multiple close() calls were allowed unintentionally, I think; the second close was hidden from FSInputStream by the BufferedInputStream in between.

Many Java streams seem to tolerate being closed more than once; maybe Hadoop should too.
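A sketch of what such a tolerant close() could look like (a hypothetical wrapper class, not Hadoop code): every call after the first becomes a no-op instead of raising "Stream closed".

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical sketch: a stream whose close() is safe to call repeatedly.
class IdempotentCloseStream extends FilterInputStream {
    private boolean closed = false;

    IdempotentCloseStream(InputStream in) {
        super(in);
    }

    @Override
    public void close() throws IOException {
        if (closed) {
            return; // already closed: ignore instead of throwing "Stream closed"
        }
        closed = true;
        super.close();
    }
}
```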



[jira] Commented: (HADOOP-2067) multiple close() failing in Hadoop 0.14

Posted by "Raghu Angadi (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HADOOP-2067?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12535388 ] 

Raghu Angadi commented on HADOOP-2067:
--------------------------------------

A workaround for user code affected by this is to wrap the stream in a {{BufferedInputStream(fsDataInputStream)}}.
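The principle behind this workaround can be shown with plain java.io: the JDK's BufferedInputStream turns a repeated close() into a no-op, so the second close never reaches the wrapped stream (here a counting wrapper over ByteArrayInputStream stands in for the DFS stream):

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

public class CloseTwiceDemo {
    // Counts how many times close() actually reaches the wrapped stream.
    static class CloseCountingStream extends FilterInputStream {
        int closeCount = 0;

        CloseCountingStream(InputStream in) {
            super(in);
        }

        @Override
        public void close() throws IOException {
            closeCount++;
            super.close();
        }
    }

    public static void main(String[] args) throws IOException {
        CloseCountingStream raw =
            new CloseCountingStream(new ByteArrayInputStream(new byte[] {1, 2, 3}));
        BufferedInputStream in = new BufferedInputStream(raw);
        in.close(); // first close: propagates to the wrapped stream
        in.close(); // second close: BufferedInputStream silently ignores it
        System.out.println(raw.closeCount); // the wrapped stream saw exactly one close
    }
}
```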
