Posted to issues@spark.apache.org by "tzxxh (Jira)" <ji...@apache.org> on 2019/11/07 01:34:00 UTC

[jira] [Updated] (SPARK-29782) Spark broadcast cannot be destroyed in some versions

     [ https://issues.apache.org/jira/browse/SPARK-29782?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

tzxxh updated SPARK-29782:
--------------------------
    Attachment:     (was: 1.png)

> Spark broadcast cannot be destroyed in some versions
> ----------------------------------------------------
>
>                 Key: SPARK-29782
>                 URL: https://issues.apache.org/jira/browse/SPARK-29782
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager
>    Affects Versions: 2.3.3, 2.4.1, 2.4.2, 2.4.3, 2.4.4
>            Reporter: tzxxh
>            Priority: Major
>
> In Spark versions 2.3.3, 2.4.1, 2.4.2, 2.4.3, and 2.4.4, calling Broadcast.destroy() does not destroy the broadcast data: the driver and executor storage memory shown in the Spark UI grows continuously.
> {code:scala}
> // Reproduction: repeatedly broadcast, use, and destroy the same data.
> val batch = Seq(1 to 9999: _*)
> val strSeq = batch.map(i => s"xxh-$i")
> val rdd = sc.parallelize(strSeq)
> rdd.cache()
> batch.foreach { _ =>
>   val broc = sc.broadcast(strSeq)
>   rdd.map(id => broc.value.contains(id)).collect()
>   broc.destroy()  // storage memory is not released after this call
> }
> {code}
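> 
> A hedged way to observe this from the driver, without watching the UI, is to sample SparkContext.getExecutorMemoryStatus (which reports max and remaining storage memory per block manager) around a single broadcast/destroy cycle. This is only a minimal sketch, assuming the same sc, strSeq, and rdd as above; the sleep duration is illustrative, not part of the reported issue.
> 
> {code:scala}
> // Sketch: sum the remaining storage memory reported by every block manager.
> def remainingStorage(): Long =
>   sc.getExecutorMemoryStatus.values.map(_._2).sum
> 
> val before = remainingStorage()
> val broc = sc.broadcast(strSeq)
> rdd.map(id => broc.value.contains(id)).collect()
> broc.destroy()
> Thread.sleep(5000)  // illustrative delay to allow any asynchronous cleanup to finish
> val after = remainingStorage()
> // On an unaffected version, `after` should return close to `before`;
> // on the affected versions it stays lower, matching the growth seen in the UI.
> println(s"remaining storage before=$before after=$after")
> {code}
> 
> Whether calling broc.unpersist(blocking = true) before destroy() changes the outcome on the affected versions may also be worth checking; that is a suggestion for narrowing the issue down, not a confirmed workaround.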
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org