Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2022/04/27 15:37:00 UTC

[jira] [Resolved] (SPARK-38914) Allow user to insert specified columns into insertable view

     [ https://issues.apache.org/jira/browse/SPARK-38914?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gengliang Wang resolved SPARK-38914.
------------------------------------
    Fix Version/s: 3.4.0
       Resolution: Fixed

Issue resolved by pull request 36212
[https://github.com/apache/spark/pull/36212]

> Allow user to insert specified columns into insertable view
> -----------------------------------------------------------
>
>                 Key: SPARK-38914
>                 URL: https://issues.apache.org/jira/browse/SPARK-38914
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.2.1
>            Reporter: morvenhuang
>            Assignee: morvenhuang
>            Priority: Minor
>             Fix For: 3.4.0
>
>
> The option `spark.sql.defaultColumn.useNullsForMissingDefaultValues` allows us to insert a specified subset of columns into a table (SPARK-38795), but currently this option does not work for insertable views.
> The INSERT INTO below results in an AnalysisException even when the useNullsForMissingDefaultValues option is true:
> {code:java}
> spark.sql("CREATE TEMPORARY VIEW v1 (c1 int, c2 string) USING org.apache.spark.sql.json.DefaultSource OPTIONS ( path 'json_dir')");
> spark.sql("INSERT INTO v1(c1) VALUES(100)");
> org.apache.spark.sql.AnalysisException: unknown requires that the data to be inserted have the same number of columns as the target table: target table has 2 column(s) but the inserted data has 1 column(s), including 0 partition column(s) having constant value(s).
> {code}
>  
> I can provide a fix for this issue.
>  
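
As a plain-Python illustration (not Spark code), the behavior the option describes can be sketched as follows: columns omitted from the INSERT column list are filled with NULL when the row is expanded to the target schema. The function name and shapes here are hypothetical, for illustration only.

```python
# Sketch of the null-filling behavior described by
# spark.sql.defaultColumn.useNullsForMissingDefaultValues:
# columns not supplied in the INSERT column list become None (NULL).
def fill_missing_columns(target_schema, insert_columns, row):
    """Expand a row covering only `insert_columns` to the full
    `target_schema`, using None for any column not supplied."""
    supplied = dict(zip(insert_columns, row))
    return tuple(supplied.get(col) for col in target_schema)

# Target view has columns (c1, c2); the INSERT supplies only c1.
full_row = fill_missing_columns(["c1", "c2"], ["c1"], (100,))
# full_row == (100, None)
```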



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org