Posted to issues@spark.apache.org by "Shixiong Zhu (Jira)" <ji...@apache.org> on 2021/01/06 21:15:00 UTC

[jira] [Updated] (SPARK-34034) "show create table" doesn't work for v2 table

     [ https://issues.apache.org/jira/browse/SPARK-34034?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shixiong Zhu updated SPARK-34034:
---------------------------------
    Labels: regression  (was: regresion)

> "show create table" doesn't work for v2 table
> ---------------------------------------------
>
>                 Key: SPARK-34034
>                 URL: https://issues.apache.org/jira/browse/SPARK-34034
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Shixiong Zhu
>            Priority: Blocker
>              Labels: regression
>
> While QAing Spark 3.1.0 RC1, I found a regression: "show create table" doesn't work for v2 tables.
> With Spark 3.0.1, "show create table" works for the same v2 table.
> Steps to test:
> {code:java}
> bin/spark-shell --packages io.delta:delta-core_2.12:0.7.0 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
> scala> spark.sql("create table foo(i INT) using delta")
>  res0: org.apache.spark.sql.DataFrame = []
> scala> spark.sql("show create table foo").show(false)
> +-----------------------------------------------+
> |createtab_stmt                                 |
> +-----------------------------------------------+
> |CREATE TABLE `default`.`foo` (
>   )
> USING delta
> |
> +-----------------------------------------------+
> {code}
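> For comparison, a sketch of the output one would expect here (and, per the above, what Spark 3.0.1 produces; exact identifier quoting and spacing may differ):
> {code:java}
> scala> spark.sql("show create table foo").show(false)
> +-----------------------------------------------+
> |createtab_stmt                                 |
> +-----------------------------------------------+
> |CREATE TABLE `default`.`foo` (
>   `i` INT)
> USING delta
> |
> +-----------------------------------------------+
> {code}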
> This looks like it was caused by [https://github.com/apache/spark/pull/30321].
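> A quick programmatic check for the regression (a minimal sketch; it assumes the generated DDL backtick-quotes the column name):
> {code:java}
> // Grab the generated DDL and verify the column list survived.
> scala> val stmt = spark.sql("show create table foo").head().getString(0)
> scala> assert(stmt.contains("`i`"), s"column list missing from generated DDL: $stmt")
> {code}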
>  


