Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2014/10/26 19:45:33 UTC

[jira] [Updated] (SPARK-2527) incorrect persistence level shown in Spark UI after repersisting

     [ https://issues.apache.org/jira/browse/SPARK-2527?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen updated SPARK-2527:
------------------------------
    Fix Version/s: 1.2.0

> incorrect persistence level shown in Spark UI after repersisting
> ----------------------------------------------------------------
>
>                 Key: SPARK-2527
>                 URL: https://issues.apache.org/jira/browse/SPARK-2527
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 1.0.0
>            Reporter: Diana Carroll
>            Assignee: Josh Rosen
>             Fix For: 1.2.0
>
>         Attachments: persistbug1.png, persistbug2.png
>
>
> If I persist an RDD at one level, unpersist it, then re-persist it at another level, the UI continues to show the RDD at the first level, but correctly shows the individual partitions at the second level.
> {code}
> import org.apache.spark.api.java.StorageLevels
> import org.apache.spark.api.java.StorageLevels._
> val test1 = sc.parallelize(Array(1,2,3))
> test1.persist(StorageLevels.DISK_ONLY)
> test1.count()
> test1.unpersist()
> test1.persist(StorageLevels.MEMORY_ONLY)
> test1.count()
> {code}
> After the first call to persist and count, the Spark App web UI shows:
> RDD Storage Info for 14 Storage Level: Disk Serialized 1x Replicated 
> rdd_14_0 	Disk Serialized 1x Replicated
> After the second call, it shows:
> RDD Storage Info for 14 Storage Level: Disk Serialized 1x Replicated 
> rdd_14_0 	Memory Deserialized 1x Replicated 
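> A quick way to double-check (a sketch added for illustration, not part of the original report, assuming the standard RDD.getStorageLevel API) is to ask the RDD itself for its level after re-persisting. It reports the new level, which suggests the stale value is confined to the UI's stored RDD info rather than the RDD itself:
> {code}
> // Illustrative check in the spark-shell; StorageLevel here is the Scala
> // API equivalent of the StorageLevels used in the report above.
> import org.apache.spark.storage.StorageLevel
> 
> val test1 = sc.parallelize(Array(1, 2, 3))
> test1.persist(StorageLevel.DISK_ONLY)
> test1.count()
> test1.unpersist()
> test1.persist(StorageLevel.MEMORY_ONLY)
> test1.count()
> 
> // Expected to print the MEMORY_ONLY level (the second one assigned),
> // even though the UI's "RDD Storage Info" header still shows DISK_ONLY.
> println(test1.getStorageLevel)
> {code}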



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org