Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/12/13 09:30:25 UTC

[GitHub] [spark] antonipp commented on pull request #38376: [SPARK-40817] [Kubernetes] Do not discard remote user-specified files when launching Spark jobs on Kubernetes

antonipp commented on PR #38376:
URL: https://github.com/apache/spark/pull/38376#issuecomment-1348043914

   @holdenk thank you for the review! 
   I realised that the integration test I wrote didn't work properly in CI. I believe this was because the SparkPi job used in the test finished too quickly and the Executor pods were cleaned up too quickly as well, so the `doBasicExecutorPodCheck` check wasn't succeeding. I believe I fixed it with https://github.com/apache/spark/pull/38376/commits/3facb0d4318f2d8c9f7df689422434ad36aac3f6, and all tests are green now!
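   For context, here is a minimal sketch (in Scala, not the actual change in the linked commit) of the kind of retry logic that avoids this race: instead of asserting on executor pods once after the job ends, the test polls while the job is still running so that short-lived pods are still observable. The helper name `waitForExecutorPods`, the `listExecutorPods` callback, and the timeouts are assumptions for illustration, not names from Spark's Kubernetes test framework.

   ```scala
   import scala.annotation.tailrec

   object ExecutorPodCheckSketch {

     /** Poll `listExecutorPods` until at least one pod is seen or the deadline passes. */
     @tailrec
     def waitForExecutorPods(
         listExecutorPods: () => Seq[String],
         deadlineMillis: Long,
         intervalMillis: Long = 1000L): Seq[String] = {
       val pods = listExecutorPods()
       if (pods.nonEmpty || System.currentTimeMillis() >= deadlineMillis) {
         pods
       } else {
         Thread.sleep(intervalMillis)
         waitForExecutorPods(listExecutorPods, deadlineMillis, intervalMillis)
       }
     }

     def main(args: Array[String]): Unit = {
       // Simulated pod listing: executor pods only become visible after a short delay,
       // mimicking a fast-finishing SparkPi job whose executors are cleaned up quickly.
       val start = System.currentTimeMillis()
       val fakeList = () =>
         if (System.currentTimeMillis() - start > 2000L) Seq("exec-1") else Seq.empty[String]

       val pods = waitForExecutorPods(fakeList, deadlineMillis = start + 30000L)
       assert(pods.nonEmpty, "No executor pods observed before the job finished")
       println(s"Observed executor pods: $pods")
     }
   }
   ```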
   I think we should be good to merge? 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

