Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/05/21 23:26:24 UTC

[GitHub] [spark] JoshRosen commented on issue #24668: [SPARK-27676][SQL] InMemoryFileIndex should respect spark.sql.files.ignoreMissingFiles

JoshRosen commented on issue #24668: [SPARK-27676][SQL] InMemoryFileIndex should respect spark.sql.files.ignoreMissingFiles
URL: https://github.com/apache/spark/pull/24668#issuecomment-494595508
 
 
   It looks like this change is breaking the ability to drop a catalog table whose underlying files don't exist / have been deleted. In `DropTableCommand` we have
   
   ```scala
   catalog.refreshTable(tableName)
   catalog.dropTable(tableName, ifExists, purge)
   ```
   
   Here, `refreshTable()` both clears old caches (file listing, cached tables, views) _and_ repopulates some of them by re-listing the table's files, and that re-listing is what triggers the error.
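   
   A minimal reproduction sketch of the scenario described above (the table name and path are illustrative, not taken from the original report; this assumes a `spark` session, e.g. in `spark-shell`):
   
   ```scala
   import org.apache.hadoop.fs.{FileSystem, Path}
   
   // Hypothetical external table backed by files on disk.
   val dataPath = "/tmp/spark_27676_demo"
   spark.range(10).write.option("path", dataPath).saveAsTable("demo_table")
   
   // Simulate the table's files being deleted out from under the catalog.
   val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
   fs.delete(new Path(dataPath), true)
   
   // With the patch as currently written, DropTableCommand's refreshTable() call
   // re-lists the now-missing root path and surfaces a FileNotFoundException here.
   spark.sql("DROP TABLE demo_table")
   ```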
   
   To avoid this, I think I can narrow the scope of this patch's changes so that `FileNotFoundException` is propagated only when it occurs during non-root-path listings.
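   
   For illustration, a hedged sketch of what that narrower scoping could look like (this is not the actual `InMemoryFileIndex` code; the helper name and parameters are hypothetical): tolerate a missing root path, and propagate the exception for non-root paths unless `spark.sql.files.ignoreMissingFiles` is enabled.
   
   ```scala
   import java.io.FileNotFoundException
   import org.apache.hadoop.fs.{FileStatus, FileSystem, Path}
   
   // Hypothetical helper: list one directory, treating a missing *root* path
   // (e.g. a dropped table whose files were already deleted) as empty, while
   // still rethrowing FileNotFoundException for non-root paths unless
   // spark.sql.files.ignoreMissingFiles is set.
   def listLeafStatuses(
       fs: FileSystem,
       path: Path,
       isRootPath: Boolean,
       ignoreMissingFiles: Boolean): Seq[FileStatus] = {
     try {
       fs.listStatus(path).toSeq
     } catch {
       case _: FileNotFoundException if isRootPath || ignoreMissingFiles =>
         // Treat the missing directory as empty instead of failing the listing.
         Seq.empty[FileStatus]
       // Any other FileNotFoundException falls through and is rethrown.
     }
   }
   ```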
