Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2022/07/17 06:07:00 UTC

[jira] [Updated] (SPARK-39422) SHOW CREATE TABLE should suggest 'AS SERDE' for Hive tables with unsupported serde configurations

     [ https://issues.apache.org/jira/browse/SPARK-39422?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-39422:
----------------------------------
    Issue Type: Bug  (was: Improvement)

> SHOW CREATE TABLE should suggest 'AS SERDE' for Hive tables with unsupported serde configurations
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-39422
>                 URL: https://issues.apache.org/jira/browse/SPARK-39422
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>            Priority: Minor
>             Fix For: 3.2.2, 3.3.1
>
>
> If you run `SHOW CREATE TABLE` against a Hive table that uses an unsupported serde configuration, Spark returns an error message like
> {code:java}
> org.apache.spark.sql.AnalysisException: Failed to execute SHOW CREATE TABLE against table rcFileTable, which is created by Hive and uses the following unsupported serde configuration
>  SERDE: org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe
>  INPUTFORMAT: org.apache.hadoop.hive.ql.io.RCFileInputFormat
>  OUTPUTFORMAT: org.apache.hadoop.hive.ql.io.RCFileOutputFormat {code}
> This message is confusing to end users.
> In this situation, I think the error should suggest `SHOW CREATE TABLE ... AS SERDE` to users (similar to other error messages in this code path).
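> A minimal sketch of the suggested workaround, using the `rcFileTable` table from the error message above:
> {code:sql}
> -- Fails with the AnalysisException above: the table's RCFile serde
> -- cannot be expressed as Spark-native DDL:
> SHOW CREATE TABLE rcFileTable;
>
> -- Suggested alternative: generate Hive-compatible DDL instead:
> SHOW CREATE TABLE rcFileTable AS SERDE;
> {code}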



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org