Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2023/03/24 01:55:00 UTC
[jira] [Resolved] (SPARK-42909) INSERT INTO with column list does not work
[ https://issues.apache.org/jira/browse/SPARK-42909?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-42909.
----------------------------------
Resolution: Duplicate
> INSERT INTO with column list does not work
> ------------------------------------------
>
> Key: SPARK-42909
> URL: https://issues.apache.org/jira/browse/SPARK-42909
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.3.2
> Environment: Databricks DBR12.2 on Azure, running Spark 3.3.2
> Documentation: [INSERT - Azure Databricks - Databricks SQL | Microsoft Learn|https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-into]
> Reporter: Tjomme Vergauwen
> Priority: Major
> Labels: databricks, documentation, spark-sql, sql
>
> Hi,
> When performing an INSERT INTO with an explicit but incomplete column list, the omitted columns should receive a NULL value. However, an error is thrown indicating that the column is missing.
> *Case simulation:*
> drop table if exists default.TVTest;
> create table default.TVTest
> ( col1 int NOT NULL
> , col2 int
> );
> insert into default.TVTest select 1,2;
> insert into default.TVTest select 2,NULL; --> col2 can contain NULL values
> insert into default.TVTest (col1) select 3; -- Error in SQL statement: DeltaAnalysisException: Column col2 is not specified in INSERT
> insert into default.TVTest (col1) VALUES (3); -- Error in SQL statement: DeltaAnalysisException: Column col2 is not specified in INSERT
> select * from default.TVTest;
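The behavior the report expects matches standard SQL: columns omitted from an explicit INSERT column list should default to NULL when they are nullable. A minimal sketch of that expected behavior, using Python's sqlite3 as a stand-in since the Spark/Delta environment from the report cannot be reproduced here (table and column names mirror the case simulation above):

```python
# Sketch of the expected standard-SQL behavior using sqlite3 as a stand-in
# (assumption: illustrative only; Delta Lake on Spark 3.3.2 raised
# DeltaAnalysisException for the partial-column-list insert instead).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE TVTest (col1 INTEGER NOT NULL, col2 INTEGER)")
cur.execute("INSERT INTO TVTest (col1, col2) VALUES (1, 2)")
cur.execute("INSERT INTO TVTest (col1, col2) VALUES (2, NULL)")
# Partial column list: col2 is omitted and receives NULL, no error is raised.
cur.execute("INSERT INTO TVTest (col1) VALUES (3)")
rows = cur.execute("SELECT col1, col2 FROM TVTest ORDER BY col1").fetchall()
print(rows)  # [(1, 2), (2, None), (3, None)]
```

Here the third insert succeeds and leaves col2 as NULL, which is the behavior the reporter expected from the documented INSERT syntax.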
--
This message was sent by Atlassian Jira
(v8.20.10#820010)