Posted to user@hive.apache.org by "second_comet.yahoo.com via user" <us...@hive.apache.org> on 2022/09/25 03:00:41 UTC

external table using delta format

When I execute the command below in beeline or pyspark, the table metadata is stored successfully in the Hive metastore, but with the following warning:

CREATE EXTERNAL TABLE testtable USING DELTA LOCATION 's3a://path/to/delta/delta-folder/'

WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider delta. Persisting data source table `testdb`.`testtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
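
For context, on the pyspark side the statement is submitted through a SparkSession set up roughly like this (a sketch following the Delta Lake docs; the s3a path is a placeholder):

    from pyspark.sql import SparkSession

    # Sketch of the session setup: the Delta SQL extension plus the Delta
    # catalog, with Hive support enabled so table metadata goes to the Hive
    # metastore. Assumes the Delta Lake and hadoop-aws jars are already on
    # the classpath.
    spark = (
        SparkSession.builder
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .enableHiveSupport()
        .getOrCreate()
    )

    spark.sql(
        "CREATE EXTERNAL TABLE testtable USING DELTA "
        "LOCATION 's3a://path/to/delta/delta-folder/'"
    )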

Is there any way to save the external table in a format that the Hive metastore supports, so that the above warning is not thrown when using beeline/pyspark together?

I can't use STORED BY 'io.delta.hive.DeltaStorageHandler', because I create the external table using pyspark instead of pyhive. I would like it to be compatible with Spark.
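
For reference, this is roughly what the connector's Hive-side DDL looks like, but as far as I know Spark SQL does not support the STORED BY clause, so it cannot be issued from a pyspark session:

    -- Hive-side DDL per the delta-io Hive connector README; it runs in
    -- beeline against Hive, not through Spark. The column list here is
    -- hypothetical: the connector checks it against the schema recorded
    -- in the Delta transaction log at the given location.
    CREATE EXTERNAL TABLE testtable (col1 INT, col2 STRING)
    STORED BY 'io.delta.hive.DeltaStorageHandler'
    LOCATION 's3a://path/to/delta/delta-folder/';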