Posted to issues@spark.apache.org by "Eyal Farago (JIRA)" <ji...@apache.org> on 2018/08/13 16:04:00 UTC
[jira] [Created] (SPARK-25103) CompletionIterator may delay GC of completed resources
Eyal Farago created SPARK-25103:
-----------------------------------
Summary: CompletionIterator may delay GC of completed resources
Key: SPARK-25103
URL: https://issues.apache.org/jira/browse/SPARK-25103
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 2.3.0, 2.2.0, 2.1.0, 2.0.1
Reporter: Eyal Farago
While working on SPARK-22713, I found (and partially fixed) a scenario in which an iterator is already exhausted but still holds a reference to resources that could otherwise be GCed at this point.
However, these resources cannot be GCed because of this reference.
The specific fix applied in SPARK-22713 was to wrap the iterator with a CompletionIterator that cleans up when exhausted. The catch is that it's quite easy to get this wrong by closing over local variables or the _this_ reference in the cleanup function itself, as in the sketch below.
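For illustration, here is roughly how Spark-internal code could get this wrong (everything except CompletionIterator is made up for this example):
{code:java}
import org.apache.spark.util.CompletionIterator

// hypothetical resource and reader, purely for illustration
class LargeBuffer { def release(): Unit = () }

class BlockReader(buffer: LargeBuffer) {
  private def releaseBuffer(): Unit = buffer.release()

  def read(): Iterator[Int] = {
    val rows = Iterator(1, 2, 3)
    // referring to a private method from the cleanup closure silently
    // captures `this`, so the whole BlockReader (buffer included) stays
    // reachable for as long as the CompletionIterator itself does
    CompletionIterator[Int, Iterator[Int]](rows, releaseBuffer())
  }
}
{code}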
I propose solving this by modifying CompletionIterator to discard its references to the wrapped iterator and the cleanup function once exhausted, along the lines of the sketch below.
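A minimal sketch of the idea (not the actual Spark class; the mutable fields and the null-ing behavior are the proposed change):
{code:java}
private[spark] class CompletionIterator[A](
    private[this] var sub: Iterator[A],
    private[this] var completionFunction: () => Unit)
  extends Iterator[A] {

  def hasNext: Boolean = {
    if (sub == null) {
      false
    } else if (sub.hasNext) {
      true
    } else {
      completionFunction()
      // drop both references so the wrapped iterator and the closure
      // (and anything they capture) become eligible for GC even if this
      // CompletionIterator itself stays referenced for a while longer
      sub = null
      completionFunction = null
      false
    }
  }

  def next(): A = {
    if (sub == null) throw new NoSuchElementException("next on exhausted iterator")
    sub.next()
  }
}
{code}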
* A dive into the code showed that most CompletionIterators are eventually used by
{code:java}
org.apache.spark.scheduler.ShuffleMapTask#runTask{code}
which does:
{code:java}
writer.write(rdd.iterator(partition, context).asInstanceOf[Iterator[_ <: Product2[Any, Any]]]){code}
Looking at
{code:java}
org.apache.spark.shuffle.ShuffleWriter#write{code}
implementations, it seems all of them first exhaust the iterator and then perform some kind of post-processing, e.g. merging spills, sorting, or writing partition files and then concatenating them into a single file. Bottom line: the iterator may actually be 'sitting' around for some time after being exhausted, as in the schematic below.
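Schematically (this is not actual Spark code; the helper names are placeholders for the post-processing steps mentioned above):
{code:java}
// schematic shape of a ShuffleWriter.write implementation, with made-up
// helper names standing in for the real post-processing steps
def write(records: Iterator[Product2[Any, Any]]): Unit = {
  while (records.hasNext) {      // the iterator is fully exhausted here...
    insertIntoBuffer(records.next())
  }
  mergeSpills()                  // ...but `records` is still referenced on
  writePartitionedFile()         // the stack while this post-processing
  commitShuffleOutput()          // runs, delaying GC of whatever it holds
}
{code}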