Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/12/13 06:54:09 UTC

[GitHub] [spark] pan3793 commented on a change in pull request #34831: [SPARK-37574][CORE][SHUFFLE] Simplify fetchBlocks w/o retry

pan3793 commented on a change in pull request #34831:
URL: https://github.com/apache/spark/pull/34831#discussion_r767450865



##########
File path: core/src/main/scala/org/apache/spark/network/netty/NettyBlockTransferService.scala
##########
@@ -139,14 +139,7 @@ private[spark] class NettyBlockTransferService(
           }
         }
       }
-
-      if (maxRetries > 0) {
-        // Note this Fetcher will correctly handle maxRetries == 0; we avoid it just in case there's
-        // a bug in this code. We should remove the if statement once we're sure of the stability.

Review comment:
       There are several places that overwrite the conf to `0`; please search for `("spark.shuffle.io.maxRetries", "0")` in `ExternalShuffleIntegrationSuite` and `BlockManagerSuite`.

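As a side note, a minimal, hypothetical sketch (the names and structure are illustrative only, not the actual `NettyBlockTransferService` or fetcher code) of why dropping the `if (maxRetries > 0)` guard is safe even though those suites force `spark.shuffle.io.maxRetries` to `0`: a retry wrapper with zero retries degenerates to exactly one attempt.

```scala
// Hypothetical sketch, not the actual Spark fetch path: illustrates that a
// retry wrapper naturally collapses to a single attempt when maxRetries == 0.
object FetchWithoutRetrySketch {

  // One initial attempt plus up to `maxRetries` retries on failure.
  // With maxRetries == 0 the first failure propagates immediately.
  def fetchWithRetries[T](maxRetries: Int)(attempt: () => T): T = {
    try {
      attempt()
    } catch {
      case _: Exception if maxRetries > 0 =>
        fetchWithRetries(maxRetries - 1)(attempt)
    }
  }

  def main(args: Array[String]): Unit = {
    // Suites such as ExternalShuffleIntegrationSuite override the conf with
    // ("spark.shuffle.io.maxRetries", "0"), so the no-retry path must still work.
    val maxRetries = "0".toInt
    println(fetchWithRetries(maxRetries)(() => "block-data")) // exactly one attempt
  }
}
```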


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


