Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2021/07/27 17:19:00 UTC
[jira] [Updated] (SPARK-36242) Ensure spill file closed before set
success to true in ExternalSorter.spillMemoryIteratorToDisk method
[ https://issues.apache.org/jira/browse/SPARK-36242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-36242:
----------------------------------
Fix Version/s: 3.1.3
> Ensure spill file closed before set success to true in ExternalSorter.spillMemoryIteratorToDisk method
> ------------------------------------------------------------------------------------------------------
>
> Key: SPARK-36242
> URL: https://issues.apache.org/jira/browse/SPARK-36242
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 3.3.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
> Fix For: 3.2.0, 3.1.3, 3.3.0
>
>
> The processing in ExternalSorter.spillMemoryIteratorToDisk and ExternalAppendOnlyMap.spillMemoryIteratorToDisk is similar, but the two methods differ in where `success = true` is set.
>
> The relevant code in ExternalSorter.spillMemoryIteratorToDisk is as follows:
>
> {code:java}
>   if (objectsWritten > 0) {
>     flush()
>   } else {
>     writer.revertPartialWritesAndClose()
>   }
>   success = true
> } finally {
>   if (success) {
>     writer.close()
>   } else {
>     ...
>   }
> }
> {code}
> The corresponding code in ExternalAppendOnlyMap.spillMemoryIteratorToDisk is as follows:
> {code:java}
>   if (objectsWritten > 0) {
>     flush()
>     writer.close()
>   } else {
>     writer.revertPartialWritesAndClose()
>   }
>   success = true
> } finally {
>   if (!success) {
>     ...
>   }
> }
> {code}
> The handling in ExternalAppendOnlyMap.spillMemoryIteratorToDisk seems more reasonable: we should make sure `success = true` is only set after the spill file has been closed.
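>
> Below is a minimal, self-contained sketch of the pattern the second snippet follows, and that ExternalSorter could adopt. FakeWriter and the spill method here are hypothetical stand-ins for illustration, not Spark's DiskBlockObjectWriter or the actual method; the point is only that the spill file is closed before the success flag is set, so a failure while closing still reaches the cleanup path in the finally block.
> {code:java}
> import java.io.File
>
> object SpillSuccessFlagSketch {
>   // Hypothetical stand-in for the disk writer used during a spill;
>   // only the calls shown in the snippets above are modelled.
>   class FakeWriter(val file: File) {
>     def flush(): Unit = ()
>     def close(): Unit = ()                        // closing can itself fail
>     def revertPartialWritesAndClose(): Unit = ()
>   }
>
>   def spill(writer: FakeWriter, objectsWritten: Int): Unit = {
>     var success = false
>     try {
>       if (objectsWritten > 0) {
>         writer.flush()
>         writer.close()   // close the spill file first ...
>       } else {
>         writer.revertPartialWritesAndClose()
>       }
>       success = true     // ... and only then mark the spill as successful
>     } finally {
>       if (!success) {
>         // Only reached if an exception was thrown before success was set,
>         // including a failure while closing: revert and delete the file.
>         writer.revertPartialWritesAndClose()
>         if (writer.file.exists() && !writer.file.delete()) {
>           println(s"Error deleting ${writer.file}")
>         }
>       }
>     }
>   }
>
>   def main(args: Array[String]): Unit = {
>     spill(new FakeWriter(File.createTempFile("spill", ".tmp")), objectsWritten = 1)
>   }
> }
> {code}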
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org