Posted to issues@spark.apache.org by "Andrew Milkowski (JIRA)" <ji...@apache.org> on 2017/01/25 17:52:26 UTC
[jira] [Created] (SPARK-19364) Some Blocks in Storage Persists Forever
Andrew Milkowski created SPARK-19364:
----------------------------------------
Summary: Some Blocks in Storage Persists Forever
Key: SPARK-19364
URL: https://issues.apache.org/jira/browse/SPARK-19364
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 2.0.2
Environment: ubuntu unix
Spark 2.0.2
application is java
Reporter: Andrew Milkowski
Running standard Kinesis stream ingestion with a Java Spark app and creating a DStream. After running for some time, some stream blocks seem to persist forever and are never cleaned up, and this eventually leads to memory depletion on the workers.
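For context, the ingestion path described above looks roughly like the following. This is a minimal sketch, not the reporter's actual application: the app name, stream name, endpoint, region, batch interval, and the filtering step are all assumptions. It requires the spark-streaming-kinesis-asl artifact and AWS credentials, so it is illustrative rather than directly runnable here.

```java
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream;
import org.apache.spark.SparkConf;
import org.apache.spark.storage.StorageLevel;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kinesis.KinesisUtils;

import java.nio.charset.StandardCharsets;

public class KinesisIngest {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("kinesis-ingest"); // name is an assumption
        JavaStreamingContext ssc = new JavaStreamingContext(conf, new Duration(2000));

        // Receiver-based Kinesis DStream; each received block appears in the
        // Storage tab as "input-<streamId>-<timestamp>", like the one reported below.
        JavaReceiverInputDStream<byte[]> stream = KinesisUtils.createStream(
                ssc,
                "my-kcl-app",                                // KCL app name (checkpoint table) - assumption
                "my-stream",                                 // Kinesis stream name - assumption
                "https://kinesis.us-east-1.amazonaws.com",   // endpoint - assumption
                "us-east-1",                                 // region - assumption
                InitialPositionInStream.LATEST,
                new Duration(2000),                          // checkpoint interval
                StorageLevel.MEMORY_AND_DISK_2());

        JavaDStream<String> filtered = stream
                .map(bytes -> new String(bytes, StandardCharsets.UTF_8))
                .filter(line -> !line.isEmpty());            // placeholder for the real filtering

        filtered.print();
        ssc.start();
        ssc.awaitTermination();
    }
}
```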
We even tried cleaning RDDs with the following:
// reach into Spark's internal ContextCleaner (a Scala Option, hence .get())
cleaner = ssc.sparkContext().sc().cleaner().get();

filtered.foreachRDD(new VoidFunction<JavaRDD<String>>() {
    @Override
    public void call(JavaRDD<String> rdd) throws Exception {
        // force a blocking cleanup of this batch's RDD
        cleaner.doCleanupRDD(rdd.id(), true);
    }
});
Despite the above, the blocks still persist; this can be seen in the Spark admin UI. For instance:
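For what it's worth, the public-API way to drop a cached batch RDD is JavaRDD.unpersist() rather than reaching into the internal ContextCleaner. A hedged sketch of that variant (the count() is a placeholder for the real per-batch processing, which is not shown in the report):

```java
filtered.foreachRDD(rdd -> {
    long count = rdd.count();   // placeholder for the real per-batch processing
    rdd.unpersist(false);       // non-blocking removal of this batch's cached blocks
});
```

Note, though, that the "input-*" blocks reported here belong to the receiver, not to an explicitly cached RDD, so unpersisting the batch RDD may not remove them either; this sketch is offered only as the supported alternative to doCleanupRDD.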
input-0-1485362233945 1 ip-<>:34245 Memory Serialized 1442.5 KB
The block above stays and is never cleaned up.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)