Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/04/11 21:33:12 UTC
[jira] [Commented] (SPARK-6844) Memory leak occurs when register temp table with cache table on
[ https://issues.apache.org/jira/browse/SPARK-6844?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14491166#comment-14491166 ]
Apache Spark commented on SPARK-6844:
-------------------------------------
User 'viirya' has created a pull request for this issue:
https://github.com/apache/spark/pull/5475
> Memory leak occurs when register temp table with cache table on
> ---------------------------------------------------------------
>
> Key: SPARK-6844
> URL: https://issues.apache.org/jira/browse/SPARK-6844
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.3.0
> Reporter: Jack Hu
> Labels: Memory, SQL
>
> There is a memory leak when registering a temp table with caching enabled.
> The following simple code reproduces the issue:
> {code}
> val sparkConf = new SparkConf().setAppName("LeakTest")
> val sparkContext = new SparkContext(sparkConf)
> val sqlContext = new SQLContext(sparkContext)
> val tableName = "tmp"
> val jsonrdd = sparkContext.textFile("""sample.json""")
> var loopCount = 1L
> while (true) {
>   sqlContext.jsonRDD(jsonrdd).registerTempTable(tableName)
>   sqlContext.cacheTable(tableName)
>   println("L: " + loopCount + " R: " + sqlContext.sql("""select count(*) from tmp""").count())
>   sqlContext.uncacheTable(tableName)
>   loopCount += 1
> }
> {code}
> The cause is that {{InMemoryRelation}} and {{InMemoryColumnarTableScan}} use accumulators ({{InMemoryRelation.batchStats}}, {{InMemoryColumnarTableScan.readPartitions}}, {{InMemoryColumnarTableScan.readBatches}}) to collect information from partitions or for tests. Each of these accumulators registers itself in a static map, {{Accumulators.originals}}, and is never removed from it, so every loop iteration leaks the accumulators it creates.
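The leak pattern described above can be sketched outside of Spark. This is a minimal, hypothetical simplification (the names `Registry` and `register` are illustrative, not the actual Spark classes): a static, JVM-wide map keeps every registered entry strongly reachable, so repeated register/use cycles grow it without bound unless something explicitly removes entries.

```scala
import scala.collection.mutable

// Hypothetical stand-in for the static registry Accumulators.originals.
// Being a Scala object, it lives for the lifetime of the JVM.
object Registry {
  val originals = mutable.Map[Long, String]()
  private var nextId = 0L

  def register(value: String): Long = {
    val id = nextId
    nextId += 1
    originals(id) = value // entry is never removed, so the map only grows
    id
  }
}

// Each iteration registers a fresh "accumulator", mimicking the repeated
// registerTempTable/cacheTable cycle in the reproduction above.
for (i <- 1 to 1000) Registry.register(s"batchStats-$i")

println(Registry.originals.size) // all 1000 entries remain reachable
```

In the sketch, the fix would be a matching `Registry.originals.remove(id)` when the owner of the accumulator is discarded; the pull request linked above addresses the analogous cleanup in Spark.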
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org