Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/02/17 15:53:52 UTC

[GitHub] [spark] Yikun opened a new pull request #35557: [SPARK-38244][K8S] Upgrade kubernetes-client to 5.12.1

Yikun opened a new pull request #35557:
URL: https://github.com/apache/spark/pull/35557


   ### What changes were proposed in this pull request?
   Upgrade kubernetes-client to 5.12.1:
   https://github.com/fabric8io/kubernetes-client/releases/tag/v5.12.1
   
   ### Why are the changes needed?
   The next kubernetes client version would be 6.x with breaking changes: https://github.com/fabric8io/kubernetes-client/blob/master/CHANGELOG.md#note-breaking-changes-in-the-api .
   
   We'd better upgrade to the latest 5.x release first to reduce the future upgrade cost.
   
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   - CI
   - integration test
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045990765


   @dongjoon-hyun Thanks for the investigation.
   
   Yes, this might introduce some behavior changes (especially when an autoConfigure context is specified; did you set it? Otherwise, it effectively just calls `autoConfigure(config, null)` twice).
   
   https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L236-L245
   
   ```java
   // v5.12.0
   public static Config autoConfigure(String context) {
     config.autoConfigure = Boolean.TRUE;
     autoConfigure(config, null);
     return autoConfigure(config, context);
   }
   
   // v5.12.1
   public static Config autoConfigure(String context) {
     return autoConfigure(config, context);
   }
   ```
   
   I saw that the change has been reverted, so we have time to take a deeper look.




[GitHub] [spark] Yikun commented on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045990765


   @dongjoon-hyun Thanks for the investigation.
   
   Yes, this might introduce some behavior changes (especially when `spark.kubernetes.context` is specified).
   
   https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L236-L245
   
   ```java
   // v5.12.0
   public static Config autoConfigure(String context) {
     config.autoConfigure = Boolean.TRUE;
     autoConfigure(config, null);
     return autoConfigure(config, context);
   }
   
   
   // v5.12.1
   public static Config autoConfigure(String context) {
     return autoConfigure(config, context);
   }
   ```
   




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045990765


   @dongjoon-hyun Thanks for the investigation.
   
   Yes, this might introduce some behavior changes (especially when `spark.kubernetes.context` is specified; did you set it? Otherwise, it effectively just calls `autoConfigure(config, null)` twice).
   
   https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L236-L245
   
   ```java
   // v5.12.0
   public static Config autoConfigure(String context) {
     config.autoConfigure = Boolean.TRUE;
     autoConfigure(config, null);
     return autoConfigure(config, context);
   }
   
   // v5.12.1
   public static Config autoConfigure(String context) {
     return autoConfigure(config, context);
   }
   ```




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1046609601


   @dongjoon-hyun I think I understand why this error only happened in `docker-desktop` after the 5.12.1 upgrade; there appear to be two bugs involved.
   
   1. Before K8S client v5.12.1 (Spark uses 5.12.0), `autoConfigure` with `null` would overwrite the result of `autoConfigure` with the specified context. https://github.com/fabric8io/kubernetes-client/commit/81c11ca5c2044eac2cbebe5107945ec95011e8f4
   2. The `docker-desktop` backend integration test used the wrong context name. https://github.com/apache/spark/pull/35595
   
   These two bugs canceled each other out ("two negatives make a positive"): the docker-backend integration test never actually used `docker-for-desktop` as the context input, because the current context from the environment was used instead.
   
   See [here](https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L591-L596) for reference: once `autoConfigure(config, null)` has been called, a subsequent `autoConfigure(config, context)` has no effect.
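   To make that precedence behavior concrete, here is a hypothetical, simplified sketch (the class, field, and method names are my assumptions for illustration, not the actual fabric8 code):
   
   ```java
   // Hypothetical, simplified sketch of the precedence issue; the guard and
   // field names are illustrative assumptions, not the real fabric8 code.
   public class AutoConfigureSketch {
       static class Config {
           String currentContext;     // the context actually applied
           boolean alreadyConfigured; // guard: later calls become no-ops
       }
   
       // Stand-in for the internal autoConfigure(config, context): once the
       // config has been populated, later calls do not change the context.
       static Config autoConfigure(Config config, String context) {
           if (!config.alreadyConfigured) {
               config.currentContext = (context != null) ? context : "env-current-context";
               config.alreadyConfigured = true;
           }
           return config;
       }
   
       // v5.12.0-style behavior: an extra autoConfigure(config, null) ran
       // first, so the user-specified context was silently ignored.
       static String v5120(String context) {
           Config config = new Config();
           autoConfigure(config, null);
           return autoConfigure(config, context).currentContext;
       }
   
       // v5.12.1-style behavior: only the requested context is applied.
       static String v5121(String context) {
           Config config = new Config();
           return autoConfigure(config, context).currentContext;
       }
   
       public static void main(String[] args) {
           // Under 5.12.0 semantics the environment's current context wins;
           // under 5.12.1 semantics the requested context wins.
           System.out.println(v5120("docker-for-desktop"));
           System.out.println(v5121("docker-for-desktop"));
       }
   }
   ```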
   
   
   




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1046609601


   @dongjoon-hyun I think I understand why this error only happened in `docker-desktop` after the 5.12.1 upgrade; there appear to be two bugs involved.
   
   1. https://github.com/fabric8io/kubernetes-client/commit/81c11ca5c2044eac2cbebe5107945ec95011e8f4 Before K8S client v5.12.1 (Spark uses 5.12.0), `autoConfigure` with `null` would overwrite the result of `autoConfigure` with the specified context.
   Also see [here](https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L591-L596) for reference: once `autoConfigure(config, null)` has been called, a subsequent `autoConfigure(config, context)` has no effect.
   
   2. https://github.com/apache/spark/pull/35595 The `docker-desktop` backend integration test used the wrong context name.
   
   These two bugs canceled each other out ("two negatives make a positive"):
   - With 5.12.0, the docker-backend integration test never actually used `docker-for-desktop` as the context input, because it was accidentally overwritten by the user's current environment context, so the test passed.
   - With 5.12.1, the user-specified context is no longer overwritten, so the docker-backend integration test used the wrong `docker-for-desktop` context and failed.
   
   
   
   
   




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1046609601


   @dongjoon-hyun I think I understand why this error only happened in `docker-desktop` after the 5.12.1 upgrade; there appear to be two bugs involved.
   
   1. https://github.com/fabric8io/kubernetes-client/commit/81c11ca5c2044eac2cbebe5107945ec95011e8f4 Before K8S client v5.12.1 (Spark uses 5.12.0), `autoConfigure` with `null` would overwrite the result of `autoConfigure` with the specified context.
   Also see [here](https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L591-L596) for reference: once `autoConfigure(config, null)` has been called, a subsequent `autoConfigure(config, context)` has no effect.
   
   2. https://github.com/apache/spark/pull/35595 The `docker-desktop` backend integration test used the wrong context name.
   
   These two bugs canceled each other out ("two negatives make a positive"):
   - With 5.12.0, the docker-backend integration test never actually used `docker-for-desktop` as the context input, because the current environment context was used instead, so the test passed.
   - With 5.12.1, the docker-backend integration test used the wrong `docker-for-desktop` context, so it failed.
   
   
   
   
   




[GitHub] [spark] dongjoon-hyun commented on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045927712


   Unfortunately, this seems to break `docker-for-desktop` IT testing.
   ```
   [info] KubernetesSuite:
   [info] org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite *** ABORTED *** (343 milliseconds)
   [info]   io.fabric8.kubernetes.client.KubernetesClientException: An error has occurred.
   [info]   at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:103)
   [info]   at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:97)
   [info]   at io.fabric8.kubernetes.client.dsl.base.CreateOnlyResourceOperation.create(CreateOnlyResourceOperation.java:63)
   [info]   at org.apache.spark.deploy.k8s.integrationtest.KubernetesTestComponents.createNamespace(KubernetesTestComponents.scala:50)
   [info]   at org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite.setUpTest(KubernetesSuite.scala:194)
   [info]   at org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite.$anonfun$new$1(KubernetesSuite.scala:199)
   [info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
   [info]   at org.scalatest.BeforeAndAfter.runTest(BeforeAndAfter.scala:210)
   [info]   at org.scalatest.BeforeAndAfter.runTest$(BeforeAndAfter.scala:203)
   [info]   at org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite.runTest(KubernetesSuite.scala:43)
   [info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:233)
   [info]   at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
   [info]   at scala.collection.immutable.List.foreach(List.scala:431)
   [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
   [info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
   [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
   [info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:233)
   [info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:232)
   [info]   at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
   [info]   at org.scalatest.Suite.run(Suite.scala:1112)
   [info]   at org.scalatest.Suite.run$(Suite.scala:1094)
   [info]   at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
   [info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:237)
   [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
   [info]   at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:237)
   [info]   at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:236)
   [info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:64)
   [info]   at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
   [info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
   [info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
   [info]   at org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite.org$scalatest$BeforeAndAfter$$super$run(KubernetesSuite.scala:43)
   [info]   at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:273)
   [info]   at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:271)
   [info]   at org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite.run(KubernetesSuite.scala:43)
   [info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318)
   [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513)
   [info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
   [info]   at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
   [info]   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
   [info]   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
   [info]   at java.base/java.lang.Thread.run(Thread.java:833)
   [info]   Cause: java.net.UnknownHostException: kubernetes.default.svc: nodename nor servname provided, or not known
   ```
   
   Could you check Docker Desktop, @Yikun?




[GitHub] [spark] dongjoon-hyun commented on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045929717


   It seems that the following hidden commit (which is not listed in the release notes) could be the usual suspect.
   - https://github.com/fabric8io/kubernetes-client/commit/81c11ca5c2044eac2cbebe5107945ec95011e8f4




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045990765


   @dongjoon-hyun Thanks for the investigation.
   
   Yes, this might introduce some behavior changes ~(especially when specified autoConfigure context, did you set it? otherwise, it just only call `autoConfigure(config, null)` twice)~ when the autoConfigure context parameter is specified.
   
   https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L236-L245
   
   ```java
   // v5.12.0
   public static Config autoConfigure(String context) {
     config.autoConfigure = Boolean.TRUE;
     autoConfigure(config, null);
     return autoConfigure(config, context);
   }
   
   // v5.12.1
   public static Config autoConfigure(String context) {
     return autoConfigure(config, context);
   }
   ```
   
   I saw that the change has been reverted, so we have time to take a deeper look.




[GitHub] [spark] dongjoon-hyun commented on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045936316


   Spark uses `autoConfigure` here.
   https://github.com/apache/spark/blob/8a70aeccf96de04cc122a30f50ab752b9b9c85ed/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/SparkKubernetesClientFactory.scala#L91
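   As a rough sketch of that call path (hypothetical helper, not the actual `SparkKubernetesClientFactory` code), the optional `spark.kubernetes.context` value typically ends up as the string argument to `Config.autoConfigure`, with `null` meaning auto-detect from the environment:
   
   ```java
   import java.util.Optional;
   
   // Hypothetical illustration of threading an optional context setting into
   // an autoConfigure-style call; not the real Spark factory code.
   public class ClientFactorySketch {
       // Returns the context string to pass to Config.autoConfigure: the
       // user-specified context if present and non-empty, otherwise null.
       static String resolveContext(Optional<String> kubeContext) {
           return kubeContext.filter(c -> !c.isEmpty()).orElse(null);
       }
   
       public static void main(String[] args) {
           System.out.println(resolveContext(Optional.of("docker-desktop")));
           System.out.println(resolveContext(Optional.empty()));
       }
   }
   ```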




[GitHub] [spark] Yikun commented on pull request #35557: [SPARK-38244][K8S] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1043116171


   cc @dongjoon-hyun 




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045990765


   @dongjoon-hyun Thanks for the investigation.
   
   Yes, this might introduce some behavior changes (especially when `spark.kubernetes.context` is specified; did you set it? Otherwise, it effectively just calls `autoConfigure(config, null)` twice).
   
   https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L236-L245
   
   ```java
   // v5.12.0
   public static Config autoConfigure(String context) {
     config.autoConfigure = Boolean.TRUE;
     autoConfigure(config, null);
     return autoConfigure(config, context);
   }
   
   // v5.12.1
   public static Config autoConfigure(String context) {
     return autoConfigure(config, context);
   }
   ```
   
   I saw that you reverted this change, so we have time to take a deeper look.




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1046609601


   @dongjoon-hyun I think I understand why this error only happened in `docker-desktop`; there appear to be two bugs involved.
   
   1. Before K8S client v5.12.1 (Spark uses 5.12.0), `autoConfigure` with `null` would overwrite the specified context; this is fixed in v5.12.1.
   2. The `docker-desktop` backend integration test used the wrong context name. https://github.com/apache/spark/pull/35595
   
   These two bugs canceled each other out ("two negatives make a positive"): the docker-backend integration test never actually used `docker-for-desktop` as the context input, because the current context from the environment was used instead.
   
   See [here](https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L591-L596) for reference: once `autoConfigure(config, null)` has been called, a subsequent `autoConfigure(config, context)` has no effect.
   
   
   




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1046609601


   @dongjoon-hyun I think I understand why this error only happened in `docker-desktop` after the 5.12.1 upgrade; there appear to be two bugs involved.
   
   1. https://github.com/fabric8io/kubernetes-client/commit/81c11ca5c2044eac2cbebe5107945ec95011e8f4 Before K8S client v5.12.1 (Spark uses 5.12.0), `autoConfigure` with `null` would overwrite the result of `autoConfigure` with the specified context.
   Also see [here](https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L591-L596) for reference: once `autoConfigure(config, null)` has been called, a subsequent `autoConfigure(config, context)` has no effect.
   
   2. https://github.com/apache/spark/pull/35595 The `docker-desktop` backend integration test used the wrong context name.
   
   These two bugs canceled each other out ("two negatives make a positive"):
   - With 5.12.0, the docker-backend integration test never actually used `docker-for-desktop` as the context input, because it was accidentally overwritten by the user's current environment context, so the test passed.
   - With 5.12.1, the user-specified (wrong) context is no longer overwritten, so the docker-backend integration test used the wrong `docker-for-desktop` context and failed.
   
   
   
   
   




[GitHub] [spark] Yikun commented on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1046624864


   @dongjoon-hyun I don't have a high-performance environment to run the full K8s integration test suite on `docker-desktop` yet, but the basic tests pass. It would be great if you could double-check! Thanks!




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1046609601


   @dongjoon-hyun I think I understand why this error only happened in `docker-desktop` after the 5.12.1 upgrade; there appear to be two bugs involved.
   
   1. https://github.com/fabric8io/kubernetes-client/commit/81c11ca5c2044eac2cbebe5107945ec95011e8f4 Before K8S client v5.12.1 (Spark uses 5.12.0), `autoConfigure` with `null` would overwrite the result of `autoConfigure` with the specified context.
   Also see [here](https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L591-L596) for reference: once `autoConfigure(config, null)` has been called, a subsequent `autoConfigure(config, context)` has no effect.
   
   2. https://github.com/apache/spark/pull/35595 The `docker-desktop` backend integration test used the wrong context name.
   
   These two bugs canceled each other out ("two negatives make a positive"):
   - With 5.12.0, the docker-backend integration test never actually used `docker-for-desktop` as the context input, because the current environment context was used instead, so the test passed.
   - With 5.12.1, the docker-backend integration test used the wrong `docker-for-desktop` context, so it failed.
   
   
   
   
   




[GitHub] [spark] dongjoon-hyun commented on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1047179557


   Thank you for the investigation, @Yikun .




[GitHub] [spark] dongjoon-hyun closed pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun closed pull request #35557:
URL: https://github.com/apache/spark/pull/35557


   




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045990765


   @dongjoon-hyun Thanks for the investigation.
   
   Yes, this might have some behavior changes (especially when `spark.kubernetes.context` is specified; did you set it? Otherwise it only calls `autoConfigure(config, null)` twice).
   
   https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L236-L245
   
   ```java
   // v5.12.0
   public static Config autoConfigure(String context) {
     config.autoConfigure = Boolean.TRUE;
     autoConfigure(config, null);
     return autoConfigure(config, context);
   }
   
   // v5.12.1
   public static Config autoConfigure(String context) {
     return autoConfigure(config, context);
   }
   ```
   
   I saw the change has been reverted, so we have time to take a deeper look.




[GitHub] [spark] Yikun edited a comment on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun edited a comment on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1046609601


   @dongjoon-hyun I think I understand why this error only happened on `docker-desktop` after the 5.12.1 upgrade; there appear to be two bugs in the current Spark repo.
   
   1. https://github.com/fabric8io/kubernetes-client/commit/81c11ca5c2044eac2cbebe5107945ec95011e8f4 Before K8S client v5.12.1 (Spark uses 5.12.0), `autoConfigure` with `null` would overwrite a subsequent `autoConfigure` with the specified context.
   Also see [here](https://github.com/fabric8io/kubernetes-client/blob/677884ca88f3911f9b41ec919db152d441ad2cdd/kubernetes-client-api/src/main/java/io/fabric8/kubernetes/client/Config.java#L591-L596) for reference: once `autoConfigure(config, null)` has been called, a subsequent `autoConfigure(config, context)` has no effect.
   
   2. https://github.com/apache/spark/pull/35595 Wrong context name in the `docker-desktop` backend of the Kubernetes integration tests.
   
   These two bugs made "two negatives make a positive":
   - With 5.12.0, the docker-backend integration test never actually used `docker-for-desktop` as the context input, because it was accidentally overwritten by the user's current env context (null), so the test passed.
   - With 5.12.1, the wrong user-specified context is no longer overwritten, so the docker-backend integration test used the wrong `docker-for-desktop` context and failed.
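   Not part of the original thread: a minimal, self-contained sketch of the masking behavior described above. The `MiniConfig` class and field names here are hypothetical stand-ins, not the fabric8 `Config` API; only the control flow (an "already auto-configured" guard that skips a later explicit context) mirrors the pre-5.12.1 behavior.
   
   ```java
   // Hypothetical simulation of the masking bug; NOT the fabric8 Config class.
   public class AutoConfigureSketch {
       public static class MiniConfig {
           public String currentContext;    // context the client would actually use
           public boolean autoConfigured;   // set once the first autoConfigure ran
       }
   
       // Mirrors the pre-5.12.1 guard: once auto-configured, a later call with an
       // explicit context is skipped, so the env's current context silently wins.
       public static MiniConfig autoConfigure(MiniConfig config, String context) {
           if (config.autoConfigured) {
               return config; // explicit context ignored
           }
           config.currentContext = (context != null) ? context : "env-current-context";
           config.autoConfigured = true;
           return config;
       }
   
       public static void main(String[] args) {
           // v5.12.0-style call sequence: null first, then the user's context.
           MiniConfig v5120 = new MiniConfig();
           autoConfigure(v5120, null);
           autoConfigure(v5120, "docker-for-desktop");
           System.out.println(v5120.currentContext); // env-current-context
   
           // v5.12.1-style call sequence: only the user's context.
           MiniConfig v5121 = new MiniConfig();
           autoConfigure(v5121, "docker-for-desktop");
           System.out.println(v5121.currentContext); // docker-for-desktop
       }
   }
   ```
   
   With the guard in place, a wrong context name like `docker-for-desktop` is harmlessly ignored (the test passes); remove the first `null` call, as 5.12.1 effectively did, and the wrong name takes effect.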
   
   
   
   
   




[GitHub] [spark] Yikun commented on pull request #35557: [SPARK-38244][K8S] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
Yikun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1043110400


   Integrations test passed:
   ```
   [info] KubernetesSuite:
   [info] - Run SparkPi with no resources (12 seconds, 129 milliseconds)
   [info] - Run SparkPi with no resources & statefulset allocation (11 seconds, 672 milliseconds)
   [info] - Run SparkPi with a very long application name. (10 seconds, 643 milliseconds)
   [info] - Use SparkLauncher.NO_RESOURCE (10 seconds, 588 milliseconds)
   [info] - Run SparkPi with a master URL without a scheme. (10 seconds, 554 milliseconds)
   [info] - Run SparkPi with an argument. (11 seconds, 571 milliseconds)
   [info] - Run SparkPi with custom labels, annotations, and environment variables. (10 seconds, 611 milliseconds)
   [info] - All pods have the same service account by default (10 seconds, 532 milliseconds)
   [info] - Run extraJVMOptions check on driver (5 seconds, 528 milliseconds)
   [info] - Run SparkRemoteFileTest using a remote data file (11 seconds, 728 milliseconds)
   [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (18 seconds, 19 milliseconds)
   [info] - Run SparkPi with env and mount secrets. (19 seconds, 567 milliseconds)
   [info] - Run PySpark on simple pi.py example (12 seconds, 546 milliseconds)
   [info] - Run PySpark to test a pyfiles example (14 seconds, 668 milliseconds)
   [info] - Run PySpark with memory customization (11 seconds, 532 milliseconds)
   [info] - Run in client mode. (9 seconds, 195 milliseconds)
   [info] - Start pod creation from template (11 seconds, 609 milliseconds)
   [info] - Test basic decommissioning (45 seconds, 983 milliseconds)
   [info] - Test basic decommissioning with shuffle cleanup (46 seconds, 179 milliseconds)
   [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 45 seconds)
   [info] - Test decommissioning timeouts (45 seconds, 264 milliseconds)
   [info] - SPARK-37576: Rolling decommissioning (1 minute, 8 seconds)
   [info] VolcanoSuite:
   [info] - Run SparkPi with no resources (11 seconds, 556 milliseconds)
   [info] - Run SparkPi with no resources & statefulset allocation (12 seconds, 690 milliseconds)
   [info] - Run SparkPi with a very long application name. (12 seconds, 538 milliseconds)
   [info] - Use SparkLauncher.NO_RESOURCE (11 seconds, 527 milliseconds)
   [info] - Run SparkPi with a master URL without a scheme. (11 seconds, 551 milliseconds)
   [info] - Run SparkPi with an argument. (12 seconds, 497 milliseconds)
   [info] - Run SparkPi with custom labels, annotations, and environment variables. (11 seconds, 551 milliseconds)
   [info] - All pods have the same service account by default (11 seconds, 532 milliseconds)
   [info] - Run extraJVMOptions check on driver (6 seconds, 552 milliseconds)
   [info] - Run SparkRemoteFileTest using a remote data file (13 seconds, 546 milliseconds)
   [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (20 seconds, 99 milliseconds)
   [info] - Run SparkPi with env and mount secrets. (20 seconds, 245 milliseconds)
   [info] - Run PySpark on simple pi.py example (13 seconds, 631 milliseconds)
   [info] - Run PySpark to test a pyfiles example (16 seconds, 633 milliseconds)
   [info] - Run PySpark with memory customization (12 seconds, 558 milliseconds)
   [info] - Run in client mode. (9 seconds, 119 milliseconds)
   [info] - Start pod creation from template (11 seconds, 624 milliseconds)
   [info] - Test basic decommissioning (47 seconds, 931 milliseconds)
   [info] - Test basic decommissioning with shuffle cleanup (46 seconds, 962 milliseconds)
   [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 46 seconds)
   [info] - Test decommissioning timeouts (46 seconds, 419 milliseconds)
   [info] - SPARK-37576: Rolling decommissioning (1 minute, 8 seconds)
   [info] - Run SparkPi with volcano scheduler (11 seconds, 567 milliseconds)
   ```




[GitHub] [spark] dongjoon-hyun commented on pull request #35557: [SPARK-38244][K8S][BUILD] Upgrade kubernetes-client to 5.12.1

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun commented on pull request #35557:
URL: https://github.com/apache/spark/pull/35557#issuecomment-1045934759


   I verified that it's recovered after reverting.
   ```
   [info] KubernetesSuite:
   [info] - Run SparkPi with no resources (10 seconds, 137 milliseconds)
   ```
   
   Let me revert this inevitably.



