Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2019/04/19 11:43:00 UTC

[jira] [Resolved] (SPARK-27504) File source V2: support refreshing metadata cache

     [ https://issues.apache.org/jira/browse/SPARK-27504?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-27504.
---------------------------------
       Resolution: Fixed
    Fix Version/s: 3.0.0

Issue resolved by pull request 24401
[https://github.com/apache/spark/pull/24401]

> File source V2: support refreshing metadata cache
> -------------------------------------------------
>
>                 Key: SPARK-27504
>                 URL: https://issues.apache.org/jira/browse/SPARK-27504
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Major
>             Fix For: 3.0.0
>
>
> In file source V1, if a file is deleted manually, reading the DataFrame/Table throws an exception with the suggestion message "It is possible the underlying files have been updated. You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in SQL or by recreating the Dataset/DataFrame involved.".
> After refreshing the table/DataFrame, the reads should return correct results.
> File source V2 should follow the same behavior.
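
For illustration, a minimal Scala sketch of the refresh flow described above. The path /tmp/t, the session setup, and the surrounding object are hypothetical scaffolding; spark.catalog.refreshByPath and the REFRESH TABLE SQL command are the public cache-invalidation entry points the error message refers to.

    import org.apache.spark.sql.SparkSession

    object RefreshSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("refresh-sketch")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        // Write a small Parquet dataset at a hypothetical path.
        Seq(1, 2, 3).toDF("id").write.mode("overwrite").parquet("/tmp/t")

        val df = spark.read.parquet("/tmp/t")
        df.count() // populates the cached file-listing metadata

        // If a file under /tmp/t is deleted out of band, a later read may
        // fail with the exception quoted above. Invalidating the cached
        // metadata makes subsequent reads list the files again:
        spark.catalog.refreshByPath("/tmp/t")
        // or, for a catalog table: spark.sql("REFRESH TABLE tableName")

        df.count() // after the refresh, reads reflect the current files
        spark.stop()
      }
    }

After the refresh call, re-executing the read reflects the files actually on disk; this ticket ports that recovery behavior to file source V2.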



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org