Posted to issues@spark.apache.org by "Cheng Pan (Jira)" <ji...@apache.org> on 2021/12/20 02:43:00 UTC

[jira] [Comment Edited] (SPARK-37648) Spark catalog and Delta tables

    [ https://issues.apache.org/jira/browse/SPARK-37648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17462361#comment-17462361 ] 

Cheng Pan edited comment on SPARK-37648 at 12/20/21, 2:42 AM:
--------------------------------------------------------------

We have a workaround for this issue in Apache Kyuubi (Incubating): [https://github.com/apache/incubator-kyuubi/pull/1476]

Kyuubi can be considered a more powerful Spark Thrift Server; it's worth a try.
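As a stopgap outside of Kyuubi, it may also be possible to read the schema through SQL instead of spark.catalog.listColumns. This is a sketch under my understanding of the bug, using the table name from the repro below:

{code:java}
// Sketch of a SQL-level alternative (not the fix in the PR above):
// spark.catalog.listColumns reads the schema recorded in the Hive
// metastore, which Delta leaves as a placeholder, while DESCRIBE TABLE
// resolves the table through the configured DeltaCatalog and returns
// the schema stored in the Delta transaction log.
spark.sql("DESCRIBE TABLE delta").show()   // lists the `id` column
spark.catalog.listColumns("delta").show()  // empty: the reported bug{code}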


was (Author: pan3793):
This issue has been fixed in Apache Kyuubi (Incubating): [https://github.com/apache/incubator-kyuubi/pull/1476]

Kyuubi can be considered a more powerful Spark Thrift Server; it's worth a try.

> Spark catalog and Delta tables
> ------------------------------
>
>                 Key: SPARK-37648
>                 URL: https://issues.apache.org/jira/browse/SPARK-37648
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.2
>         Environment: Spark version 3.1.2
> Scala version 2.12.10
> Hive version 2.3.7
> Delta version 1.0.0
>            Reporter: Hanna Liashchuk
>            Priority: Major
>
> I'm using Spark with Delta tables. The tables are created successfully, but the catalog reports no columns for them.
> Steps to reproduce:
> 1. Start spark-shell 
> {code:java}
> spark-shell --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog" --conf "spark.sql.legacy.parquet.int96RebaseModeInWrite=LEGACY"{code}
> 2. Create a Delta table
> {code:java}
> spark.range(10).write.format("delta").option("path", "tmp/delta").saveAsTable("delta"){code}
> 3. Make sure the table exists
> {code:java}
> spark.catalog.listTables.show{code}
> 4. Find out that the columns are not listed
> {code:java}
> spark.catalog.listColumns("delta").show{code}
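> For contrast, the table itself clearly has a schema when it is resolved through the DeltaCatalog (a sketch of the expected behavior under the repro above):
> {code:java}
> // Reading the table through the DataFrame API goes through the
> // configured DeltaCatalog, so the schema from the Delta transaction
> // log is visible here even though spark.catalog.listColumns is empty.
> spark.table("delta").printSchema() // shows the single `id` column{code}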
> This is critical for Delta integration with BI tools such as Power BI or Tableau: they query the Spark catalog for metadata, and we get errors that no columns are found.
> Discussion can be found in the Delta repository: https://github.com/delta-io/delta/issues/695
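> To illustrate the BI-tool path (a hypothetical sketch; the Thrift Server host, port, and database below are assumptions, and the Hive JDBC driver is assumed to be on the classpath), the failure surfaces through the standard JDBC metadata call:
> {code:java}
> import java.sql.DriverManager
>
> // What a BI tool effectively runs through the Hive JDBC driver;
> // with this bug the result set comes back empty for Delta tables.
> val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000/default")
> val cols = conn.getMetaData.getColumns(null, "default", "delta", null)
> while (cols.next()) println(cols.getString("COLUMN_NAME"))
> conn.close(){code}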



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org