Posted to common-dev@hadoop.apache.org by "Michael Bieniosek (JIRA)" <ji...@apache.org> on 2008/09/09 19:59:44 UTC
[jira] Created: (HADOOP-4134) "Exception in createBlockOutputStream" shouldn't delete exception stack trace
"Exception in createBlockOutputStream" shouldn't delete exception stack trace
-----------------------------------------------------------------------------
Key: HADOOP-4134
URL: https://issues.apache.org/jira/browse/HADOOP-4134
Project: Hadoop Core
Issue Type: Bug
Affects Versions: 0.18.0
Reporter: Michael Bieniosek
I'm occasionally (1/5000 times) getting this error after upgrading everything to hadoop-0.18:
08/09/09 03:28:36 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Could not read from stream
08/09/09 03:28:36 INFO dfs.DFSClient: Abandoning block blk_624229997631234952_8205908
DFSClient contains the logging code:
LOG.info("Exception in createBlockOutputStream " + ie);
This would be better written with ie passed as the second argument to LOG.info, so that the stack trace is preserved. As it is, there is no trace to start debugging from.
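To illustrate the suggested fix: commons-logging (the framework DFSClient uses) logs an exception's full stack trace when the Throwable is passed as a separate argument, whereas string concatenation keeps only the toString() of the exception. The sketch below is not Hadoop's code; it uses only the JDK to show the difference between the two styles:

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.io.StringWriter;

// With commons-logging, the suggested change would be:
//   LOG.info("Exception in createBlockOutputStream " + ie);  // trace lost
//   LOG.info("Exception in createBlockOutputStream", ie);    // trace kept
// This demo class mimics what each form renders.
public class StackTraceDemo {
    // Mimics LOG.info("..." + ie): only class name + message survive.
    static String concatenated(Throwable t) {
        return "Exception in createBlockOutputStream " + t;
    }

    // Mimics LOG.info("...", ie): the full stack trace is rendered.
    static String withTrace(Throwable t) {
        StringWriter sw = new StringWriter();
        t.printStackTrace(new PrintWriter(sw, true));
        return "Exception in createBlockOutputStream\n" + sw;
    }

    public static void main(String[] args) {
        IOException ie = new IOException("Could not read from stream");
        System.out.println(concatenated(ie)); // one line, no frames
        System.out.println(withTrace(ie));    // includes "at ..." frames
    }
}
```

The concatenated form is exactly the log line in the report above: you can see that an IOException occurred, but not which read in the pipeline threw it.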
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
[jira] Commented: (HADOOP-4134) "Exception in createBlockOutputStream" shouldn't delete exception stack trace
Posted by "Cagdas (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HADOOP-4134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12658780#action_12658780 ]
Cagdas commented on HADOOP-4134:
--------------------------------
Hi,
Was there a resolution for this issue? I tried the patch in HADOOP-4533, but it did not help.
Thank you!
C.