Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/09 23:19:05 UTC

[GitHub] [spark] dongjoon-hyun commented on a change in pull request #33949: [SPARK-36334][K8S][FOLLOWUP] Allow equal resource version to update snapshot

dongjoon-hyun commented on a change in pull request #33949:
URL: https://github.com/apache/spark/pull/33949#discussion_r705784682



##########
File path: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/scheduler/cluster/k8s/ExecutorPodsPollingSnapshotSource.scala
##########
@@ -67,9 +67,9 @@ private[spark] class ExecutorPodsPollingSnapshotSource(
       if (conf.get(KUBERNETES_EXECUTOR_API_POLLING_WITH_RESOURCE_VERSION)) {
         val list = pods.list(new ListOptionsBuilder().withResourceVersion("0").build())
         val newResourceVersion = UnsignedLong.valueOf(list.getMetadata.getResourceVersion())
-        // Replace only when we receive a monotonically increased resourceVersion
+        // Replace only when we receive a monotonically increased or equal resourceVersion
         // because some K8s API servers may return old(smaller) cached versions in case of HA setup.
-        if (resourceVersion == null || newResourceVersion.compareTo(resourceVersion) > 0) {
+        if (resourceVersion == null || newResourceVersion.compareTo(resourceVersion) >= 0) {

Review comment:
       The resource will be the same. We simply invoke `snapshotsStore.replaceSnapshot(list.getItems.asScala.toSeq)` again to make sure that the driver works in the same way as Spark 3.2 and older versions.
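
       For illustration, a minimal, self-contained Scala sketch of that reasoning (assuming Guava's `UnsignedLong` is on the classpath, as it is in Spark). This is not the real `ExecutorPodsPollingSnapshotSource`; the `FakeSnapshotsStore`, the hard-coded resource versions, and the pod name are assumptions made up for this example. It only shows that, with `>=`, an equal resourceVersion re-applies the same snapshot, while an older cached version is still ignored.

```scala
// Standalone sketch only; FakeSnapshotsStore and the versions below are
// illustrative stand-ins, not Spark's ExecutorPodsSnapshotsStore.
import com.google.common.primitives.UnsignedLong

object EqualResourceVersionSketch {
  // Hypothetical stand-in that just counts how often the snapshot is replaced.
  final class FakeSnapshotsStore {
    var replacements = 0
    def replaceSnapshot(pods: Seq[String]): Unit = replacements += 1
  }

  def main(args: Array[String]): Unit = {
    val store = new FakeSnapshotsStore
    var resourceVersion: UnsignedLong = null

    // Three simulated polling rounds: a first version, the same version again
    // (e.g. served by another API server replica in an HA setup), and an
    // older cached version.
    val polled = Seq("100", "100", "99").map(v => UnsignedLong.valueOf(v))

    polled.foreach { newResourceVersion =>
      // ">=" accepts the equal version, so replaceSnapshot runs again with the
      // same resource; an older (smaller) cached version is still skipped.
      if (resourceVersion == null || newResourceVersion.compareTo(resourceVersion) >= 0) {
        store.replaceSnapshot(Seq("executor-pod-1"))
        resourceVersion = newResourceVersion
      }
    }

    assert(store.replacements == 2) // "100" is applied twice, "99" is dropped
    println(s"replaceSnapshot invoked ${store.replacements} times")
  }
}
```

       With `>` instead of `>=`, the second round would be skipped, so the driver would not refresh the snapshot until the resourceVersion actually grows; accepting the equal case keeps the refresh-on-every-poll behavior described in the comment above.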




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


