Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/03 13:59:37 UTC

[GitHub] [spark] tgravescs commented on pull request #37268: [SPARK-39853][CORE] Support stage level task resource profile for standalone cluster when dynamic allocation disabled

tgravescs commented on PR #37268:
URL: https://github.com/apache/spark/pull/37268#issuecomment-1203990274

   thanks for adding more details.
   
   ```scala
   /**
    * Return resource profile IDs of executors where tasks can be assigned.
    */
   def compatibleExecutorRpIds(rpMgr: ResourceProfileManager): Set[Int]
   ```
   
   It seems a little odd to ask a ResourceProfile to give you the other ResourceProfiles it is compatible with. This feels like it belongs in the ResourceProfileManager, which knows about all the ResourceProfiles; I guess that is why you pass in the ResourceProfileManager here? Is the intention that the user could explicitly set which ResourceProfiles it's compatible with? If so, I would definitely want a way to avoid having to specify it.
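   A minimal sketch of the alternative I'm describing, where the compatibility check lives in the manager rather than in the profile itself. All class shapes and the method body here are illustrative stand-ins, not the actual Spark API; the real classes carry much more state:

   ```scala
   // Hypothetical stand-ins for the real Spark classes, just to show placement.
   case class TaskReq(resource: String, amount: Double)

   sealed trait Profile { def id: Int }
   // A full profile: declares both executor resources and task requests.
   case class ResourceProfile(id: Int,
                              execResources: Map[String, Double]) extends Profile
   // A task-only profile: carries task requests and reuses executors
   // created for other profiles.
   case class TaskResourceProfile(id: Int, taskReqs: Seq[TaskReq]) extends Profile

   class ResourceProfileManager {
     private var profiles = Map.empty[Int, Profile]

     def register(p: Profile): Unit = profiles += (p.id -> p)

     // The manager, which knows every registered profile, decides which
     // executor profiles can host the given task-only profile: every task
     // request must fit within the executor profile's resources.
     def compatibleExecutorRpIds(trp: TaskResourceProfile): Set[Int] =
       profiles.values.collect {
         case rp: ResourceProfile
           if trp.taskReqs.forall(r =>
             rp.execResources.getOrElse(r.resource, 0.0) >= r.amount) =>
             rp.id
       }.toSet
   }
   ```

   With this shape, nothing has to be specified by the user: compatibility is derived by the manager from what each profile already declares.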
   
   The other issue that was raised but not addressed is the reuse policy. I guess in this case we are limiting the executor profile to one because we don't have dynamic allocation, so one could argue that if you use a task resource request with that, you know what you get. I am fine with that, but we need to be clear that it may very well waste resources.
   
   Also, if the intent is to not support TaskResourceProfile with dynamic allocation, I think we should throw an exception if anyone uses it with the dynamic allocation config on.
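   Something along these lines is what I have in mind. The config key is the real `spark.dynamicAllocation.enabled`, but the guard function, its parameters, and the use of `IllegalArgumentException` (rather than whatever exception type the PR settles on) are purely illustrative:

   ```scala
   // Hypothetical fail-fast guard: reject a task-only profile outright
   // when dynamic allocation is enabled, instead of silently misbehaving.
   def validateTaskResourceProfile(isTaskOnlyProfile: Boolean,
                                   dynamicAllocationEnabled: Boolean): Unit = {
     if (isTaskOnlyProfile && dynamicAllocationEnabled) {
       throw new IllegalArgumentException(
         "TaskResourceProfile is not supported when dynamic allocation is " +
         "enabled (spark.dynamicAllocation.enabled=true); use a full " +
         "ResourceProfile instead.")
     }
   }
   ```

   Throwing at registration time gives the user an immediate, actionable error instead of a profile that is quietly ignored.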


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

