Posted to reviews@spark.apache.org by "LuciferYang (via GitHub)" <gi...@apache.org> on 2023/05/11 03:14:59 UTC

[GitHub] [spark] LuciferYang commented on a diff in pull request #41105: [SPARK-43403][UI] Ensure old SparkUI in HistoryServer has been detached before loading new one

LuciferYang commented on code in PR #41105:
URL: https://github.com/apache/spark/pull/41105#discussion_r1190592755


##########
core/src/main/scala/org/apache/spark/deploy/history/ApplicationCache.scala:
##########
@@ -48,11 +49,28 @@ private[history] class ApplicationCache(
     val retainedApplications: Int,
     val clock: Clock) extends Logging {
 
+  /**
+   * Keep track of SparkUIs in [[ApplicationCache#appCache]] and SparkUIs removed from
+   * [[ApplicationCache#appCache]] but not detached yet.
+   */
+  private val loadedApps = new ConcurrentHashMap[CacheKey, CountDownLatch]()
+
   private val appLoader = new CacheLoader[CacheKey, CacheEntry] {
 
     /** the cache key doesn't match a cached entry, or the entry is out-of-date, so load it. */
     override def load(key: CacheKey): CacheEntry = {
-      loadApplicationEntry(key.appId, key.attemptId)
+      // Ensure old SparkUI has been detached before loading new one.
+      val removalLatch = loadedApps.get(key)
+      if (removalLatch != null) {
+        // Old SparkUI is in the middle of detaching.
+        // Waiting 10 seconds should be enough since detaching usually takes less than 1 second.
+        if (!removalLatch.await(10, TimeUnit.SECONDS)) {
+          throw new TimeoutException("Timed out waiting for old SparkUI to be detached")

Review Comment:
   Most places in Spark use `java.util.concurrent.TimeoutException` directly rather than `scala.concurrent.TimeoutException` (`s.c.TimeoutException` is just a type alias for `j.u.c.TimeoutException`, so catching either matches the same class).
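   The wait-for-detach idea in the diff above can be sketched in plain `java.util.concurrent` terms. This is a hypothetical standalone sketch, not the PR's actual code: the class `DetachWaitSketch`, its method names, and the string keys are invented for illustration, and it uses `java.util.concurrent.TimeoutException` directly, as suggested.

   ```java
   import java.util.concurrent.ConcurrentHashMap;
   import java.util.concurrent.CountDownLatch;
   import java.util.concurrent.TimeUnit;
   import java.util.concurrent.TimeoutException;

   // Hypothetical sketch of the pattern: before loading a new entry for a key,
   // wait (bounded) on a latch that the in-flight removal counts down when the
   // old SparkUI has actually been detached.
   public class DetachWaitSketch {
       private final ConcurrentHashMap<String, CountDownLatch> loadedApps =
           new ConcurrentHashMap<>();

       // Called on the loading path; blocks until any pending detach finishes.
       void load(String key, long timeoutSeconds)
               throws InterruptedException, TimeoutException {
           CountDownLatch removalLatch = loadedApps.get(key);
           if (removalLatch != null
                   && !removalLatch.await(timeoutSeconds, TimeUnit.SECONDS)) {
               throw new TimeoutException(
                   "Timed out waiting for old SparkUI to be detached");
           }
           // ... proceed to load the new entry here ...
       }

       // Called by the removal path once the old UI is fully detached.
       void detachComplete(String key) {
           CountDownLatch latch = loadedApps.remove(key);
           if (latch != null) {
               latch.countDown();
           }
       }

       public static void main(String[] args) throws Exception {
           DetachWaitSketch s = new DetachWaitSketch();
           s.loadedApps.put("app-1", new CountDownLatch(1));
           // Simulate the detach completing on another thread shortly after.
           new Thread(() -> {
               try {
                   Thread.sleep(100);
               } catch (InterruptedException ignored) {
               }
               s.detachComplete("app-1");
           }).start();
           s.load("app-1", 10); // returns once the latch is counted down
           System.out.println("loaded after detach");
       }
   }
   ```

   A second call to `load` for the same key would return immediately, since `detachComplete` removed the latch from the map.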
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org