Posted to reviews@spark.apache.org by "LuciferYang (via GitHub)" <gi...@apache.org> on 2023/05/10 07:29:49 UTC

[GitHub] [spark] LuciferYang commented on a diff in pull request #41105: [SPARK-43403] Ensure old SparkUI in HistoryServer has been detached before loading new one

LuciferYang commented on code in PR #41105:
URL: https://github.com/apache/spark/pull/41105#discussion_r1189470862


##########
core/src/main/scala/org/apache/spark/deploy/history/ApplicationCache.scala:
##########
@@ -48,11 +49,28 @@ private[history] class ApplicationCache(
     val retainedApplications: Int,
     val clock: Clock) extends Logging {
 
+  /**
+   * Keep track of SparkUIs in [[ApplicationCache#appCache]] and SparkUIs removed from
+   * [[ApplicationCache#appCache]] but not detached yet.
+   */
+  private val loadedApps = new ConcurrentHashMap[CacheKey, CountDownLatch]()
+
   private val appLoader = new CacheLoader[CacheKey, CacheEntry] {
 
     /** the cache key doesn't match a cached entry, or the entry is out-of-date, so load it. */
     override def load(key: CacheKey): CacheEntry = {
-      loadApplicationEntry(key.appId, key.attemptId)
+      // Ensure old SparkUI has been detached before loading new one.

Review Comment:
   I have some questions about this. Shouldn't the `CacheLoader#load` method be executed only once for accesses to the same key? In what scenario does the same key enter the `load` method more than once?
   
   Is this a bug in Guava 14.0.1, or do I have an incorrect understanding of how `CacheLoader` works? Please correct me if I'm wrong.
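   
   To make the expectation concrete, here is a minimal, standalone sketch of the behaviour I had in mind (illustrative only, not code from this PR): while an entry stays in the cache, `get` for the same key should call `load` only once, and only eviction or invalidation should make the same key enter `load` again.
   
   ```scala
   import java.util.concurrent.atomic.AtomicInteger
   import com.google.common.cache.{CacheBuilder, CacheLoader, LoadingCache}
   
   object CacheLoaderSemantics {
     def main(args: Array[String]): Unit = {
       val loads = new AtomicInteger(0)
       val cache: LoadingCache[String, String] =
         CacheBuilder.newBuilder()
           .maximumSize(2L) // small size so eviction can trigger a reload
           .build(new CacheLoader[String, String] {
             override def load(key: String): String = {
               loads.incrementAndGet()
               s"ui-for-$key"
             }
           })
   
       cache.get("app-1")
       cache.get("app-1")        // served from the cache, load is not called again
       println(loads.get())      // 1: load runs once while the entry is cached
   
       cache.invalidate("app-1") // eviction or invalidation drops the entry ...
       cache.get("app-1")
       println(loads.get())      // 2: ... so a later get for the same key loads again
     }
   }
   ```
   
   If that reading is right, the same key should only re-enter `load` after the old entry has already been removed from the cache, which seems to be exactly the window this PR is trying to guard.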
   
    In addition, from the Jira description, does this issue only exist in Spark 3.1.2?
   
   Also cc @steveloughran FYI, since he is the author of `ApplicationCache`.
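   
   For completeness, here is a rough sketch of how I read the latch-based coordination in the diff. The names `DetachTracker`, `awaitDetach`, and `markDetached` are hypothetical and only for illustration; they are not the actual methods in this PR.
   
   ```scala
   import java.util.concurrent.{ConcurrentHashMap, CountDownLatch, TimeUnit}
   
   // Hypothetical illustration of the coordination pattern: the loader waits on a
   // latch associated with the previous entry for the same key, and the detach
   // path counts that latch down once the old SparkUI has actually been removed.
   class DetachTracker[K] {
     private val pending = new ConcurrentHashMap[K, CountDownLatch]()
   
     /** Called by the loader before building a new entry for `key`. */
     def awaitDetach(key: K, timeoutSeconds: Long): Unit = {
       val latch = pending.get(key)
       if (latch != null) {
         // Block until the old UI is detached (or the timeout expires).
         latch.await(timeoutSeconds, TimeUnit.SECONDS)
       }
       pending.put(key, new CountDownLatch(1)) // register the new entry
     }
   
     /** Called by the removal/detach path once the old UI is gone. */
     def markDetached(key: K): Unit = {
       val latch = pending.remove(key)
       if (latch != null) latch.countDown()
     }
   }
   ```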



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org