Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/01/02 13:09:00 UTC

[jira] [Commented] (SPARK-22613) Make UNCACHE TABLE behaviour consistent with CACHE TABLE

    [ https://issues.apache.org/jira/browse/SPARK-22613?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16308019#comment-16308019 ] 

Apache Spark commented on SPARK-22613:
--------------------------------------

User 'vinodkc' has created a pull request for this issue:
https://github.com/apache/spark/pull/20134

> Make UNCACHE TABLE behaviour consistent with CACHE TABLE
> --------------------------------------------------------
>
>                 Key: SPARK-22613
>                 URL: https://issues.apache.org/jira/browse/SPARK-22613
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, SQL
>    Affects Versions: 2.2.0
>            Reporter: Andreas Maier
>            Priority: Minor
>
> The Spark SQL command CACHE TABLE is eager by default, so it offers an optional LAZY keyword for cases where you do not want to materialize the complete table immediately (see https://docs.databricks.com/spark/latest/spark-sql/language-manual/cache-table.html). The corresponding command UNCACHE TABLE, however, is lazy by default and offers no EAGER option (see https://docs.databricks.com/spark/latest/spark-sql/language-manual/uncache-table.html and https://stackoverflow.com/questions/47226494/is-uncache-table-a-lazy-operation-in-spark-sql). As a result, one cannot both cache and uncache a table eagerly using Spark SQL alone.
> As a user, I want an EAGER option for UNCACHE TABLE. Alternatively, UNCACHE TABLE could be made eager by default (consistent with CACHE TABLE) and then offer a LAZY option of its own (see the sketch below).
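A minimal Scala sketch (e.g. in spark-shell, where "spark" is the SparkSession) of the behaviour described above, plus a possible workaround for an eager uncache via the DataFrame API. The table name "t" is illustrative, and the UNCACHE EAGER syntax in the last comment is the proposal from this issue, not something Spark currently parses:

    // Eager by default: the whole table is scanned and cached immediately
    spark.sql("CACHE TABLE t")

    // LAZY defers materialization until the table is first queried
    spark.sql("CACHE LAZY TABLE t")

    // Lazy only: the call returns before the cached blocks are freed
    spark.sql("UNCACHE TABLE t")

    // Workaround today: unpersist through the DataFrame API, blocking
    // until the cached data has actually been removed
    spark.table("t").unpersist(blocking = true)

    // Proposed (hypothetical) syntax from this issue:
    // UNCACHE EAGER TABLE t

Note that unpersist(blocking = true) only approximates an eager UNCACHE TABLE from the API side; the request here is for the same behaviour to be expressible in pure Spark SQL.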


