Posted to reviews@spark.apache.org by zzl1787 <gi...@git.apache.org> on 2017/11/28 11:40:06 UTC

[GitHub] spark pull request #19129: [SPARK-13656][SQL] Delete spark.sql.parquet.cache...

Github user zzl1787 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19129#discussion_r153467044
  
    --- Diff: docs/sql-programming-guide.md ---
    @@ -1587,6 +1580,10 @@ options.
           Note that this is different from the Hive behavior.
         - As a result, `DROP TABLE` statements on those tables will not remove the data.
     
    + - From Spark 2.0.1, `spark.sql.parquet.cacheMetadata` is no longer used. See
    +   [SPARK-16321](https://issues.apache.org/jira/browse/SPARK-16321) and
    +   [SPARK-15639](https://issues.apache.org/jira/browse/SPARK-15639) for details.
    --- End diff --
    
    Hi, I'm new to Spark. I wonder how to disable metadata caching now that this conf has been deleted. I created an external table, and the parquet files in the specified location are updated daily, so I want to disable metadata caching rather than having to execute 'refresh table xxx' each time.
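    [Editor's note: the thread does not answer this question. For illustration only, below is a minimal Scala sketch of the explicit refresh calls that remain available after the config was removed. The table name "xxx" is hypothetical, taken from the question above; whether a refresh can be avoided entirely is not addressed here.]

        import org.apache.spark.sql.SparkSession

        // Build (or reuse) a session; enableHiveSupport is only needed when the
        // external table lives in the Hive metastore.
        val spark = SparkSession.builder()
          .appName("refresh-external-parquet-table")
          .enableHiveSupport()
          .getOrCreate()

        // Option 1: SQL statement. Invalidates the cached file listing and
        // metadata for the table so the next query re-scans the location.
        spark.sql("REFRESH TABLE xxx")

        // Option 2: Catalog API, equivalent to the SQL statement above.
        spark.catalog.refreshTable("xxx")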


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org