Posted to common-issues@hadoop.apache.org by "ayushtkn (via GitHub)" <gi...@apache.org> on 2023/02/12 10:10:33 UTC

[GitHub] [hadoop] ayushtkn commented on a diff in pull request #5329: HDFS-16897. fix abundant Broken pipe exception in BlockSender

ayushtkn commented on code in PR #5329:
URL: https://github.com/apache/hadoop/pull/5329#discussion_r1103772620


##########
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/BlockSender.java:
##########
@@ -655,7 +655,7 @@ private int sendPacket(ByteBuffer pkt, int maxChunks, OutputStream out,
           if (ioem.startsWith(EIO_ERROR)) {
             throw new DiskFileCorruptException("A disk IO error occurred", e);
           }
-          if (!ioem.startsWith("Broken pipe")
+          if (!ioem.contains("Broken pipe")
               && !ioem.startsWith("Connection reset")) {

Review Comment:
   If I understand it right, in your use case the cause was an exception whose message started with Broken pipe?
   I would say extract the cause if it is not null, then apply this entire check to it as well. With your present approach, I doubt ``ioem.startsWith("Connection reset")`` will work properly if that message is also inside the cause. Second, using a contains doesn't look very safe to me either.
   
   Check if something like this can work
   ```java
             String causeMessage = e.getCause() != null ? e.getCause().getMessage() : "";
             if (!ioem.startsWith("Broken pipe") && !ioem.startsWith("Connection reset") &&
                 !causeMessage.startsWith("Broken pipe") && !causeMessage.startsWith("Connection reset")) {
   ```
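   To illustrate the point standalone (outside of Hadoop), here is a minimal sketch of the suggested approach: check both the exception's own message and its cause's message with `startsWith`, so a wrapped "Broken pipe" is still recognized without resorting to a looser `contains` match. The class and helper names (`CauseMessageCheck`, `isClientDisconnect`) are hypothetical, not Hadoop code.
   ```java
   import java.io.IOException;

   public class CauseMessageCheck {
       // Returns true if the exception, or its immediate cause, carries a
       // message that starts with a known client-disconnect prefix.
       public static boolean isClientDisconnect(IOException e) {
           String ioem = e.getMessage() != null ? e.getMessage() : "";
           String causeMessage = (e.getCause() != null && e.getCause().getMessage() != null)
               ? e.getCause().getMessage() : "";
           return ioem.startsWith("Broken pipe") || ioem.startsWith("Connection reset")
               || causeMessage.startsWith("Broken pipe") || causeMessage.startsWith("Connection reset");
       }

       public static void main(String[] args) {
           // Top-level message matches directly.
           System.out.println(isClientDisconnect(new IOException("Broken pipe")));
           // "Broken pipe" is only in the cause; startsWith on the top-level
           // message alone would miss this case.
           IOException wrapped = new IOException("Failed to write packet",
               new IOException("Broken pipe"));
           System.out.println(isClientDisconnect(wrapped));
           // A genuine disk error is not treated as a client disconnect.
           System.out.println(isClientDisconnect(new IOException("Input/output error")));
       }
   }
   ```
   Checking the cause explicitly keeps the prefix match strict while still covering the wrapped-exception case that motivated the `contains` change.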



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: common-issues-help@hadoop.apache.org