Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/03/29 02:37:00 UTC

[jira] [Updated] (SPARK-21823) ALTER TABLE table statements such as RENAME and CHANGE columns should raise error if there are any dependent constraints.

     [ https://issues.apache.org/jira/browse/SPARK-21823?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-21823:
----------------------------------
    Affects Version/s:     (was: 3.0.0)
                       3.1.0

> ALTER TABLE table statements such as RENAME and CHANGE columns should raise error if there are any dependent constraints.
> -----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-21823
>                 URL: https://issues.apache.org/jira/browse/SPARK-21823
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Suresh Thalamati
>            Priority: Major
>
> The following ALTER TABLE DDL statements will impact the informational constraints defined on a table:
> {code:sql}
> ALTER TABLE name RENAME TO new_name
> ALTER TABLE name CHANGE column_name new_name new_type
> {code}
> Spark SQL should raise an error if there are informational constraints defined on the columns affected by the ALTER, and let the user drop those constraints before proceeding with the DDL. In the future, the ALTER statements could be enhanced to automatically fix up the constraint definitions in the catalog where possible instead of raising an error.
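> As a sketch of the intended behavior (the ADD CONSTRAINT / DROP CONSTRAINT DDL below is illustrative and assumes an informational primary-key constraint; the exact syntax is defined by the parent task and may differ):
> {code:sql}
> -- Hypothetical informational primary-key constraint on column id
> CREATE TABLE customers (id BIGINT, name STRING);
> ALTER TABLE customers ADD CONSTRAINT pk_id PRIMARY KEY (id);
> -- Renaming or changing the constrained column should fail while pk_id exists
> ALTER TABLE customers CHANGE id customer_id BIGINT;  -- expected: error, pk_id depends on id
> -- After the user drops the constraint, the ALTER should succeed
> ALTER TABLE customers DROP CONSTRAINT pk_id;
> ALTER TABLE customers CHANGE id customer_id BIGINT;  -- expected: succeeds
> {code}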
> When Spark adds support for DROP/REPLACE of columns, those statements will also impact informational constraints:
> {code:sql}
> ALTER TABLE name DROP [COLUMN] column_name
> ALTER TABLE name REPLACE COLUMNS (col_spec[, col_spec ...])
> {code}
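> The same check would apply there; a minimal sketch, again assuming the hypothetical customers table with constraint pk_id on column id from the example above:
> {code:sql}
> -- Dropping or replacing a constrained column should also fail until pk_id is dropped
> ALTER TABLE customers DROP COLUMN id;                 -- expected: error, pk_id depends on id
> ALTER TABLE customers REPLACE COLUMNS (name STRING);  -- expected: error, would remove constrained column id
> {code}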


