Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2018/06/10 16:37:00 UTC

[jira] [Commented] (SPARK-23210) Introduce the concept of default value to schema

    [ https://issues.apache.org/jira/browse/SPARK-23210?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16507439#comment-16507439 ] 

Xiao Li commented on SPARK-23210:
---------------------------------

This is being considered. Thanks for submitting this request.

> Introduce the concept of default value to schema
> ------------------------------------------------
>
>                 Key: SPARK-23210
>                 URL: https://issues.apache.org/jira/browse/SPARK-23210
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.1
>            Reporter: LvDongrong
>            Priority: Major
>
> There is no concept of a DEFAULT VALUE for a schema in Spark today.
> For our use case, our team wants to support inserting into a subset of a table's columns, e.g. "insert into (a, c) values ('value1', 'value2')", but the default value of the omitted columns is not defined. In Hive, the default value of a column is NULL if we don't specify one.
> So I think it may be necessary to introduce the concept of a default value to the schema in Spark.
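
For illustration only (not part of the original report): a minimal SQL sketch of the behavior being requested, assuming a hypothetical table t(a, b, c) and the column-default semantics the reporter describes, where omitted columns fall back either to NULL (Hive's behavior) or to an explicitly declared default.

    -- Hypothetical table; the DEFAULT clause is the feature being requested.
    CREATE TABLE t (
      a STRING,
      b STRING DEFAULT 'n/a',
      c STRING
    );

    -- Insert into only a subset of the columns.
    INSERT INTO t (a, c) VALUES ('value1', 'value2');

    -- Expected row under the requested semantics:
    --   a = 'value1', b = 'n/a' (declared default), c = 'value2'
    -- Without a declared default (Hive's behavior), b would be NULL.

In other words, the schema would carry a per-column default that the INSERT path consults whenever a column is not listed in the column list.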


