Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2019/10/24 18:37:00 UTC
[jira] [Updated] (SPARK-29595) Insertion with named_struct should match by name
[ https://issues.apache.org/jira/browse/SPARK-29595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Gengliang Wang updated SPARK-29595:
-----------------------------------
Description:
{code:java}
spark-sql> create table str using parquet as (select named_struct('a', 1, 'b', 2) as data);
spark-sql> insert into str values named_struct("b", 3, "a", 1);
spark-sql> select * from str;
{"a":3,"b":1}
{"a":1,"b":2}
{code}
The result should be
{code:java}
{"a":1,"b":3}
{"a":1,"b":2}
{code}
On insertion, Spark should match named_struct fields to the target schema by name rather than by position.
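The difference between the two matching strategies can be sketched outside of Spark (Python is used purely for illustration; {{align_struct_by_name}} and the dict-based struct representation are assumptions for this sketch, not Spark APIs):

{code:python}
def align_struct_by_name(value: dict, target_fields: list) -> list:
    """Return the struct's values reordered to match target_fields.

    Raises KeyError if a target field is absent from the value,
    analogous to an analysis error for a missing struct field.
    """
    return [value[name] for name in target_fields]

# named_struct("b", 3, "a", 1) inserted into struct<a: int, b: int>
incoming = {"b": 3, "a": 1}
target = ["a", "b"]

# Positional matching (the buggy behavior) takes values in written order:
positional = list(incoming.values())  # [3, 1] -> {"a": 3, "b": 1}

# By-name matching (the requested behavior) looks each field up by name:
by_name = align_struct_by_name(incoming, target)  # [1, 3] -> {"a": 1, "b": 3}
{code}

With by-name matching, the repro above would produce {{{"a":1,"b":3}}} regardless of the order in which the fields were written in the named_struct call.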
> Insertion with named_struct should match by name
> ------------------------------------------------
>
> Key: SPARK-29595
> URL: https://issues.apache.org/jira/browse/SPARK-29595
> Project: Spark
> Issue Type: Task
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Gengliang Wang
> Priority: Major
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)