Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/05/06 23:53:00 UTC

[jira] [Commented] (SPARK-38334) Implement support for DEFAULT values for columns in tables

    [ https://issues.apache.org/jira/browse/SPARK-38334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17533158#comment-17533158 ] 

Apache Spark commented on SPARK-38334:
--------------------------------------

User 'dtenedor' has created a pull request for this issue:
https://github.com/apache/spark/pull/36475

> Implement support for DEFAULT values for columns in tables 
> -----------------------------------------------------------
>
>                 Key: SPARK-38334
>                 URL: https://issues.apache.org/jira/browse/SPARK-38334
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Daniel
>            Priority: Major
>
> This story tracks the implementation of DEFAULT values for columns in tables.
> CREATE TABLE and ALTER TABLE invocations will support setting column default values for future operations. Subsequent INSERT, UPDATE, and MERGE statements may then reference the value using the DEFAULT keyword as needed; a sketch of that usage follows the examples below.
> Examples:
> {code:sql}
> CREATE TABLE T(a INT, b INT NOT NULL);
> -- The default default is NULL
> INSERT INTO T VALUES (DEFAULT, 0);
> INSERT INTO T(b) VALUES (1);
> SELECT * FROM T;
> (NULL, 0)
> (NULL, 1)
> -- Adding a default to a table that already has rows sets the value for
> -- the existing rows (exist default) and for new rows (current default).
> ALTER TABLE T ADD COLUMN c INT DEFAULT 5;
> INSERT INTO T VALUES (1, 2, DEFAULT);
> SELECT * FROM T;
> (NULL, 0, 5)
> (NULL, 1, 5)
> (1, 2, 5) {code}
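> As a further illustration (a sketch only, assuming the semantics described above: the table T and the default of 5 come from the example, while the UPDATE/MERGE statements and the source table S are hypothetical; exact syntax is subject to the linked pull request):
> {code:sql}
> -- UPDATE may assign a column back to its declared default:
> UPDATE T SET c = DEFAULT WHERE a = 1;
> -- MERGE may likewise use DEFAULT in its update/insert actions
> -- (S is a hypothetical source table with a matching column a):
> MERGE INTO T USING S ON T.a = S.a
>   WHEN MATCHED THEN UPDATE SET c = DEFAULT
>   WHEN NOT MATCHED THEN INSERT (a, b, c) VALUES (S.a, 0, DEFAULT);
> {code}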



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org