Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/02/27 15:52:29 UTC

[GitHub] [spark] Ngone51 edited a comment on issue #27722: [SPARK-30969][CORE] Remove resource coordination support from Standalone

URL: https://github.com/apache/spark/pull/27722#issuecomment-592031322
 
 
   > Its just running discovery script and assuming this owns all of them, correct?
   
   Yeah, I think this is the only difference. And it really breaks local-cluster mode. Though, I haven't made changes in `LocalClusterSpark` to handle it and have only updated one related test to work around it. And there are no more local-cluster mode tests that rely on **coordination**.
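   
   To make that concrete, here is a minimal, hypothetical sketch of a discovery script (in practice usually a shell script; written in Scala here only for illustration). All Spark requires is an executable that prints a `ResourceInformation`-style JSON object to stdout, and with coordination removed every driver/worker that runs it simply assumes it owns all of the returned addresses:
   
   ```scala
   // Hypothetical discovery "script": prints this host's resources as JSON.
   // Spark parses the stdout as {"name": ..., "addresses": [...]}; nothing here
   // (or anywhere else now) checks whether another process on the same host has
   // already claimed these addresses.
   object GetGpusResources extends App {
     val addresses = Seq("0", "1") // pretend this host has two GPUs
     val json = addresses.map(a => "\"" + a + "\"").mkString(", ")
     println(s"""{"name": "gpu", "addresses": [$json]}""")
   }
   ```
   
   With, say, `spark.worker.resource.gpu.discoveryScript` pointing at such an executable, every process that runs it (and, in local-cluster mode, every Worker on the same host) claims GPUs 0 and 1 for itself, which is why the local-cluster test had to be adjusted.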
   
   And, of course, users should still be aware of resource conflicts between multiple drivers, so I still keep the warning in the doc. Running the driver and a worker on the same node is actually not recommended when the driver is submitted in client mode. Ideally, the driver should run on an edge node outside the cluster and is assumed not to be on the same host as a worker.
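   
   For reference, here is a rough sketch of the setup that warning covers (the config keys are the standard Spark 3.0 resource properties; the master URL, path, and amounts are made up): a client-mode driver that declares its own GPU while running on a host that also runs a Worker.
   
   ```scala
   import org.apache.spark.SparkConf
   
   // Hypothetical client-mode driver colocated with a Worker. The driver runs
   // the discovery script itself and assumes it owns whatever comes back, so
   // nothing prevents the Worker (or a second driver) on the same host from
   // handing the same GPU to its executors.
   object ClientModeGpuApp extends App {
     val conf = new SparkConf()
       .setMaster("spark://master-host:7077") // assumed standalone master URL
       .setAppName("client-mode-gpu-app")
       .set("spark.driver.resource.gpu.amount", "1")
       .set("spark.driver.resource.gpu.discoveryScript", "/opt/spark/getGpusResources.sh")
       .set("spark.executor.resource.gpu.amount", "1")
       .set("spark.task.resource.gpu.amount", "1")
     // ... create a SparkContext from `conf` as usual
   }
   ```
   
   Putting the driver on a dedicated edge node instead avoids the conflict entirely, which is the recommendation kept in the doc.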
   
