Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/09/16 13:18:26 UTC

[GitHub] [spark] tgravescs commented on pull request #29770: [SPARK-32899][CORE] Support submit application with user-defined cluster manager

tgravescs commented on pull request #29770:
URL: https://github.com/apache/spark/pull/29770#issuecomment-693399056


   This is an interesting issue. One of the problems is how spark-submit can properly know which arguments a given cluster manager supports, and similarly which deploy modes it supports. There is a lot of cluster-manager-specific logic in here; this may work for most of your cases, but I would be surprised if it worked for everything.
   
   Did you test this with both spark-submit and the interactive shells (spark-shell, pyspark, etc.)? I'm not sure whether your cluster manager supports full cluster mode or only running the driver locally.
   
   I think if we want to support this officially we need something else: some parts of spark-submit would need to be pluggable. That is going to be a whole lot more change, though.
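   To illustrate the pluggability idea mentioned above, a user-defined cluster manager could implement an interface declaring which deploy modes and arguments it supports, so spark-submit could validate a submission without hard-coded per-manager logic. This is only a hypothetical sketch in Java — the interface and all names below are invented for illustration and are not part of Spark's actual API:

```java
import java.util.Set;

// Hypothetical SPI that a user-defined cluster manager could implement
// so spark-submit can validate arguments and deploy modes generically.
interface ClusterManagerPlugin {
    String name();
    Set<String> supportedDeployModes();       // e.g. "client", "cluster"
    boolean supportsArgument(String argName); // e.g. "--master"
}

// Toy implementation of the hypothetical interface: a cluster manager
// that only runs the driver locally (client mode).
class MyClusterManager implements ClusterManagerPlugin {
    public String name() { return "my-cm"; }
    public Set<String> supportedDeployModes() { return Set.of("client"); }
    public boolean supportsArgument(String argName) {
        return Set.of("--master", "--deploy-mode", "--class").contains(argName);
    }
}

// Sketch of the up-front validation spark-submit would run before launching:
// reject unsupported deploy modes and arguments with a clear error.
class SubmitValidator {
    static void validate(ClusterManagerPlugin cm, String deployMode, String... args) {
        if (!cm.supportedDeployModes().contains(deployMode)) {
            throw new IllegalArgumentException(
                cm.name() + " does not support deploy mode: " + deployMode);
        }
        for (String arg : args) {
            if (!cm.supportsArgument(arg)) {
                throw new IllegalArgumentException(
                    cm.name() + " does not support argument: " + arg);
            }
        }
    }
}
```

   With something like this, the cluster-manager-specific checks in SparkSubmit could be delegated to the plugin rather than being hard-coded, which is the larger change hinted at above.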


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org