Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/22 10:32:44 UTC

[GitHub] [spark] f-thiele commented on pull request #34043: [SPARK-36782][CORE] Avoid blocking dispatcher-BlockManagerMaster during UpdateBlockInfo

f-thiele commented on pull request #34043:
URL: https://github.com/apache/spark/pull/34043#issuecomment-924801194


   > I am still testing locally, and making sure there are no issues - but the gist is:
   > 
   >    * updateBlockInfo returns Future[Boolean]
   >    * When blockId is a shuffle block - return a Future { } around the existing blockId match
   >    * For all other return in that method, use Future.successful to complete immediately.
   >    * As with the current PR, chain the future to return response.
   > 
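   If I understand the suggestion correctly, it could be sketched roughly like this (the block id types and the bookkeeping bodies below are placeholders, not the actual BlockManagerMasterEndpoint code):

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

// Placeholder block id types; the real ones live in org.apache.spark.storage.
sealed trait BlockId
case class ShuffleBlockId(shuffleId: Int) extends BlockId
case class RDDBlockId(rddId: Int) extends BlockId

def updateBlockInfo(blockId: BlockId)(implicit ec: ExecutionContext): Future[Boolean] =
  blockId match {
    case _: ShuffleBlockId =>
      // Potentially slow shuffle bookkeeping: run it off the dispatcher thread.
      Future {
        // ... the existing shuffle-block handling would go here ...
        true
      }
    case _ =>
      // All other cases are cheap: complete immediately, no thread hop.
      Future.successful(true)
  }

implicit val ec: ExecutionContext = ExecutionContext.global
println(Await.result(updateBlockInfo(ShuffleBlockId(0)), 5.seconds))
println(Await.result(updateBlockInfo(RDDBlockId(1)), 5.seconds))
```

   The key point being that only the shuffle-block branch pays the cost of scheduling onto another thread, while every other return path stays synchronous via `Future.successful`.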
   
   Thanks @mridulm. This was very helpful. I agree a different architectural approach would be better than the extra locks presented here. I tried out your suggestions in 306fa172c32a389faecd4fb1aeca9f9e2e44d884 and at least my tests pass. The change is now much more localized, with no additional locking.
   
   I don't know whether the standard procedure for this repo is to close this PR and open a new one with your suggestions, or to adapt the existing one. I pushed my changes here assuming the latter, but let me know if you'd prefer a new PR.
   
   With the proposed changes I suppose we don't actually delegate the work to the `DAGScheduler` (as suggested by @Ngone51) but to the `block-manager-ask-thread-pool`. If that is undesired, I believe a more fundamental change would be needed?
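   To make the thread-pool point concrete: where the body of a `Future { ... }` runs is determined entirely by the implicit `ExecutionContext` in scope at the call site. A small self-contained sketch (the pool name and size below are illustrative, not how Spark actually constructs its ask pool):

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

// Hypothetical stand-in for the endpoint's ask thread pool; the real pool in
// Spark is created elsewhere and sized via configuration.
val askPool = Executors.newFixedThreadPool(4, (r: Runnable) => {
  val t = new Thread(r, "block-manager-ask-thread-pool-sketch")
  t.setDaemon(true)
  t
})
implicit val ec: ExecutionContext = ExecutionContext.fromExecutor(askPool)

// The Future body executes on askPool, not on the calling (dispatcher) thread.
val threadName: Future[String] = Future { Thread.currentThread().getName }
println(Await.result(threadName, 5.seconds))
```

   So whichever `ExecutionContext` the endpoint has in scope is where the shuffle-block branch ends up running.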


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



For additional commands, e-mail: reviews-help@spark.apache.org