Posted to issues@spark.apache.org by "Genmao Yu (JIRA)" <ji...@apache.org> on 2017/02/24 10:16:44 UTC

[jira] [Comment Edited] (SPARK-19699) createOrReplaceTable does not always replace an existing table of the same name

    [ https://issues.apache.org/jira/browse/SPARK-19699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15882385#comment-15882385 ] 

Genmao Yu edited comment on SPARK-19699 at 2/24/17 10:16 AM:
-------------------------------------------------------------

Good catch! Maybe we can add {{rdd.id}} or something else. [~cloud_fan] What's your opinion?

!https://cloud.githubusercontent.com/assets/7402327/23299586/02f7b73a-fabd-11e6-8daf-321ca9ab5ed0.png!
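
A rough way to confirm that the duplicated "foo1" entries are backed by distinct RDDs is to list the persistent RDD ids from the shell (sketch only, assuming a Scala shell where {{sc}} is the SparkContext):

{code}
// Each cached in-memory table is a separate persistent RDD with its own id,
// so exposing rdd.id (or similar) in the name would disambiguate the Storage entries.
sc.getPersistentRDDs.foreach { case (id, rdd) =>
  println(s"$id -> ${rdd.name}")
}
{code}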


was (Author: unclegen):
Good catch! Maybe we can add {{rdd.id}} or something else. [~cloud_fan]

!https://cloud.githubusercontent.com/assets/7402327/23299586/02f7b73a-fabd-11e6-8daf-321ca9ab5ed0.png!

> createOrReplaceTable does not always replace an existing table of the same name
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-19699
>                 URL: https://issues.apache.org/jira/browse/SPARK-19699
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Barry Becker
>            Priority: Minor
>
> There are cases when dataframe.createOrReplaceTempView does not replace an existing table with the same name.
> Please also refer to my [related Stack Overflow post|http://stackoverflow.com/questions/42371690/in-spark-2-1-how-come-the-dataframe-createoreplacetemptable-does-not-replace-an].
> To reproduce, do
> {code}
> df.collect()
> df.createOrReplaceTempView("foo1")
> df.sqlContext.cacheTable("foo1")
> {code}
> with one dataframe, and then do exactly the same thing with a different dataframe. Then look at the Storage tab in the Spark UI: there are multiple entries for "foo1" in the "RDD Name" column.
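> A minimal end-to-end sketch of that reproduction (the dataframes and their contents here are illustrative only, assuming a Spark 2.1 Scala shell where {{spark}} is the session):
> {code}
> // Sketch: register and cache one dataframe under the view name "foo1"
> val df1 = spark.range(10).toDF("id")
> df1.collect()
> df1.createOrReplaceTempView("foo1")
> df1.sqlContext.cacheTable("foo1")
> // Repeat with a different dataframe under the same view name
> val df2 = spark.range(100).toDF("id")
> df2.collect()
> df2.createOrReplaceTempView("foo1")
> df2.sqlContext.cacheTable("foo1")
> // The Storage tab now lists two "foo1" entries instead of one
> {code}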
> Maybe I am misunderstanding, but this causes two apparent problems:
> 1) How do you know which table will be retrieved by sqlContext.table("foo1")?
> 2) The duplicate entries represent a memory leak.
> I have tried calling dropTempTable(existingName) first, but then have occasionally seen a FAILFAST error when trying to use the table. It is as if dropTempTable is not synchronous, but maybe I am doing something wrong.
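> A sketch of that attempted workaround, continuing the illustrative session above:
> {code}
> // Drop the existing temp view before registering the new dataframe under the same name
> df1.sqlContext.dropTempTable("foo1")
> df2.createOrReplaceTempView("foo1")
> df2.sqlContext.cacheTable("foo1")
> // Occasionally the next use of the table fails with a FAILFAST error,
> // as if the drop had not yet completed
> val t = df2.sqlContext.table("foo1")
> t.count()
> {code}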



