Posted to issues@spark.apache.org by "Zhang, Liye (JIRA)" <ji...@apache.org> on 2015/07/15 05:40:04 UTC

[jira] [Commented] (SPARK-9044) Updated RDD name does not reflect under "Storage" tab

    [ https://issues.apache.org/jira/browse/SPARK-9044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14627472#comment-14627472 ] 

Zhang, Liye commented on SPARK-9044:
------------------------------------

It's a bug. An RDD is identified by rdd.id, which is immutable, so changing only the name leaves the cached RDD unchanged. The BlockManager therefore sees no blocks to update, the metrics are not updated, and the RDDInfo is not updated either.
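
As a possible workaround (just a sketch using the public RDD API, not verified on every affected version): unpersist the cached RDD after renaming it and cache it again, so the BlockManager reports fresh block updates and the "Storage" tab rebuilds the RDDInfo with the current name:

scala> textFile.setName("test2")
scala> textFile.unpersist()   // drops the cached blocks (and the stale entry in "Storage")
scala> textFile.cache
scala> textFile.collect       // re-caches the blocks; "Storage" should now list the RDD as "test2"

Note this forces the blocks to be recomputed, so it is only a workaround, not a fix for the underlying behavior.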

> Updated RDD name does not reflect under "Storage" tab
> -----------------------------------------------------
>
>                 Key: SPARK-9044
>                 URL: https://issues.apache.org/jira/browse/SPARK-9044
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 1.3.1, 1.4.0
>         Environment: Mac  OSX
>            Reporter: Wenjie Zhang
>            Priority: Minor
>
> I was playing with the spark-shell on my MacBook; here is what I did:
> scala> val textFile = sc.textFile("/Users/jackzhang/Downloads/ProdPart.txt");
> scala> textFile.cache
> scala> textFile.setName("test1")
> scala> textFile.collect
> scala> textFile.name
> res10: String = test1
> After these four commands, I can see the "test1" RDD listed in the "Storage" tab.
> However, if I then run the following commands, nothing changes in the "Storage" tab:
> scala> textFile.setName("test2")
> scala> textFile.cache
> scala> textFile.collect
> scala> textFile.name
> res10: String = test2
> I am expecting the name of the RDD shown in the "Storage" tab to be "test2"; is this a bug?


