Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/03/07 00:57:02 UTC

[GitHub] [spark] holdenk commented on issue #26687: [SPARK-30055][k8s] Allow configuration of restart policy for Kubernetes pods

URL: https://github.com/apache/spark/pull/26687#issuecomment-596024105
 
 
   Thanks for working on this @khogeland. I like the idea of allowing people to configure the restart policy. I'm curious how this would work with Spark's dynamic scale-down: when Spark cleans up executors, it does so by asking them to exit. Would it make sense to warn people about this and suggest they choose between `Never` and `OnFailure` in the config description? A rough sketch of the interaction is below.
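   For illustration only (this is not the PR's actual code, and the config key it adds is not shown here), a minimal sketch using the fabric8 client that Spark's Kubernetes backend builds pods with, showing where a configurable `restartPolicy` would land on the executor pod spec and why `Always` would conflict with a clean exit requested by scale-down:

```scala
// Sketch under assumptions: pod/container names and image are placeholders,
// and the way the policy value reaches this builder is hypothetical.
import io.fabric8.kubernetes.api.model.{Pod, PodBuilder}

def buildExecutorPod(restartPolicy: String): Pod = {
  // "Never":     the pod is never restarted; an exited executor is just cleaned up.
  // "OnFailure": the pod restarts only on a non-zero exit, so a clean exit
  //              triggered by dynamic scale-down is left alone.
  // "Always":    the container restarts even after a clean exit, which would
  //              fight scale-down -- hence the suggestion to warn against it.
  new PodBuilder()
    .withNewMetadata()
      .withName("spark-executor")
    .endMetadata()
    .withNewSpec()
      .withRestartPolicy(restartPolicy)
      .addNewContainer()
        .withName("spark-kubernetes-executor")
        .withImage("spark:latest")
      .endContainer()
    .endSpec()
    .build()
}
```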
