Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/10/30 16:34:33 UTC

[GitHub] [hudi] xushiyan commented on issue #6001: [SUPPORT] Cannot create again after deleting the Hudi external table using Spark SQL

xushiyan commented on issue #6001:
URL: https://github.com/apache/hudi/issues/6001#issuecomment-1296297443

   @JoshuaZhuCN let me clarify: when it comes to deleting the whole table, we support 3 syntaxes:
   
   - TRUNCATE TABLE: deletes all records from the file system; the table is retained in the metastore
   - DROP TABLE: deletes no records; the table is removed from the metastore
   - DROP TABLE PURGE: deletes all records from the file system; the table is removed from the metastore
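
   For a quick reference, here is a sketch of the three statements using your table name, just to illustrate the differences above:
   
   ```sql
   -- Data files deleted from the file system; table definition kept in the metastore
   TRUNCATE TABLE `default`.`spark_hudi_test_ddl`;
   
   -- Data files kept on the file system; table definition removed from the metastore
   DROP TABLE IF EXISTS `default`.`spark_hudi_test_ddl`;
   
   -- Data files deleted from the file system; table definition removed from the metastore
   DROP TABLE IF EXISTS `default`.`spark_hudi_test_ddl` PURGE;
   ```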
   
   So in your case, where you expect the data to be deleted from storage, you should use:
   
   ```sql
   DROP TABLE IF EXISTS `default`.`spark_hudi_test_ddl` PURGE;
   ```
   
   If you use `DROP TABLE` (without `PURGE`) and you want to recreate the table, your CREATE TABLE statement should just be the following (the schema and table config are picked up from the existing Hudi metadata under that location, so you do not need to repeat the column definitions):
   
   ```sql
   CREATE TABLE IF NOT EXISTS `default`.`spark_hudi_test_ddl` USING HUDI
   LOCATION 'hdfs://localhost:9000/hudi/test/spark_hudi_test_ddl';
   ```
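
   After re-creating it this way, a quick sanity check (just a sketch; it assumes the original data files are still under that HDFS path) is to query the table and confirm the existing records show up:
   
   ```sql
   -- Should return the count of records already present under the LOCATION path
   SELECT COUNT(*) FROM `default`.`spark_hudi_test_ddl`;
   ```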

