Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2022/12/16 22:00:00 UTC
[jira] [Updated] (SPARK-41447) Reduce the number of doMergeApplicationListing invocations
[ https://issues.apache.org/jira/browse/SPARK-41447?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-41447:
----------------------------------
Summary: Reduce the number of doMergeApplicationListing invocations (was: clean up expired event log files that don't exist in listing db)
> Reduce the number of doMergeApplicationListing invocations
> ----------------------------------------------------------
>
> Key: SPARK-41447
> URL: https://issues.apache.org/jira/browse/SPARK-41447
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 3.4.0
> Reporter: shuyouZZ
> Assignee: shuyouZZ
> Priority: Major
> Fix For: 3.4.0
>
>
> When the history server restarts, the previous logic runs {{checkForLogs}} first, which parses even the expired event log files, and then runs {{checkAndCleanLog}} to delete the parsed info, which is unnecessary work. In the history server log we can see many {{INFO FsHistoryProvider: Finished parsing application_xxx}} lines each followed by {{INFO FsHistoryProvider: Deleting expired event log for application_xxx}}. If the log directory contains a large number of expired log files, this slows down replay.
> To avoid this, we can run {{cleanLogs}} before {{checkForLogs}}.
> In addition, since {{cleanLogs}} now runs before {{checkForLogs}}, the expired log info may not yet exist in the listing db while the history server is starting, so we also need to clean up those log files in {{cleanLogs}}.
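The reordering described above can be sketched as follows. This is a minimal, hypothetical illustration, not the actual FsHistoryProvider code: only the names {{cleanLogs}} and {{checkForLogs}} come from the issue, and the {{PollerSketch}} object, the {{cleanerEnabled}} flag, and the trace list are scaffolding invented for the example.

```scala
// Hedged sketch of the proposed polling order: delete expired event logs
// first, so checkForLogs never wastes time parsing files that are about
// to be removed. All names except cleanLogs/checkForLogs are hypothetical.
object PollerSketch {
  // Returns the order in which the two maintenance tasks run.
  def poll(cleanerEnabled: Boolean): List[String] = {
    var trace = List.empty[String]
    def cleanLogs(): Unit = trace = trace :+ "cleanLogs"        // remove expired logs first
    def checkForLogs(): Unit = trace = trace :+ "checkForLogs"  // then parse only what remains

    if (cleanerEnabled) cleanLogs() // the old order ran checkForLogs first
    checkForLogs()
    trace
  }
}
```

With the cleaner enabled, expired files are gone before parsing starts; with it disabled, behavior is unchanged and only {{checkForLogs}} runs.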
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org