Posted to issues@spark.apache.org by "Hanna Liashchuk (Jira)" <ji...@apache.org> on 2021/12/14 21:39:00 UTC

[jira] [Created] (SPARK-37648) Spark catalog and Delta tables

Hanna Liashchuk created SPARK-37648:
---------------------------------------

             Summary: Spark catalog and Delta tables
                 Key: SPARK-37648
                 URL: https://issues.apache.org/jira/browse/SPARK-37648
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.1.2
         Environment: Spark version 3.1.2
Scala version 2.12.10
Hive version 2.3.7
Delta version 1.0.0
            Reporter: Hanna Liashchuk


I'm using Spark with Delta tables. The tables are created successfully, but the Spark catalog reports no columns for them.

Steps to reproduce:
1. Start spark-shell 
{code:java}
spark-shell --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog" --conf "spark.sql.legacy.parquet.int96RebaseModeInWrite=LEGACY"{code}
2. Create delta table
{code:java}
spark.range(10).write.format("delta").option("path", "tmp/delta").saveAsTable("delta"){code}
3. Make sure the table exists
{code:java}
spark.catalog.listTables.show{code}
4. Find out that no columns are listed
{code:java}
spark.catalog.listColumns("delta").show{code}
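
For context (illustrative output, not captured from the session above): the table created in step 2 has a single bigint column named {{id}}, so listColumns would be expected to show it, yet the result is empty.
{code:java}
scala> spark.catalog.listColumns("delta").show
// Expected (illustrative): the single bigint column created by spark.range(10)
// +----+-----------+--------+--------+-----------+--------+
// |name|description|dataType|nullable|isPartition|isBucket|
// +----+-----------+--------+--------+-----------+--------+
// |  id|       null|  bigint|    true|      false|   false|
// +----+-----------+--------+--------+-----------+--------+
// Actual: the call returns an empty result - zero rows.
{code}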

This is critical for Delta integration with BI tools such as Power BI or Tableau: they query the Spark catalog for metadata, and we get errors that no columns are found.
Discussion can be found in the Delta repository - https://github.com/delta-io/delta/issues/695
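
As a possible workaround (a sketch, not verified in this environment), reading the schema through the SQL/DataFrame path instead of the spark.catalog API resolves the table through the configured DeltaCatalog and may return the columns:
{code:java}
// Both calls resolve the table via the configured v2 catalog (DeltaCatalog)
// rather than via spark.catalog, so the schema should be available:
spark.sql("DESCRIBE TABLE delta").show
spark.table("delta").printSchema
{code}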


