Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2021/07/01 17:39:00 UTC

[jira] [Assigned] (SPARK-35756) unionByName should support nested struct also

     [ https://issues.apache.org/jira/browse/SPARK-35756?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan reassigned SPARK-35756:
-----------------------------------

    Assignee: Saurabh Chawla

> unionByName should support nested struct also
> ---------------------------------------------
>
>                 Key: SPARK-35756
>                 URL: https://issues.apache.org/jira/browse/SPARK-35756
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.1
>            Reporter: Wassim Almaaoui
>            Assignee: Saurabh Chawla
>            Priority: Major
>
> It would be cool if `unionByName` also supported nested structs. I don't know whether this is already the expected behaviour, so I am not sure if this is a bug or an improvement proposal.
> {code:java}
> case class Struct1(c1: Int, c2: Int)
> case class Struct2(c2: Int, c1: Int)
> val ds1 = Seq((1, Struct1(1,2))).toDS
> val ds2 = Seq((1, Struct2(1,2))).toDS
> ds1.unionByName(ds2.as[(Int,Struct1)]) {code}
> gives 
> {code:java}
> org.apache.spark.sql.AnalysisException: Union can only be performed on tables with the compatible column types. struct<c2:int,c1:int> <> struct<c1:int,c2:int> at the second column of the second table; 'Union false, false :- LocalRelation [_1#38, _2#39] +- LocalRelation _1#45, _2#46
> {code}
> The code documentation of the function `unionByName` says `Note that allowMissingColumns supports nested column in struct types`, but it doesn't say whether the function itself supports reordering of nested columns or not.
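
The behaviour being requested can be illustrated without Spark. The following is a minimal plain-Scala sketch (rows modelled as `Map[String, Int]`; the helper `alignByName` and the variable names are illustrative, not part of any Spark API): aligning "by name" means reordering each side's fields to one target ordering before concatenating, so that `struct<c2,c1>` and `struct<c1,c2>` become compatible.

```scala
// Illustrative sketch of name-based field alignment (no Spark involved).
// Each "row" is a Map from field name to value; alignByName projects every
// row onto a single target field order, so rows whose structs declare the
// same fields in different orders become union-compatible.
def alignByName(rows: Seq[Map[String, Int]], order: Seq[String]): Seq[Seq[Int]] =
  rows.map(row => order.map(row))

val left  = Seq(Map("c1" -> 1, "c2" -> 2))  // like Struct1(c1, c2)
val right = Seq(Map("c2" -> 2, "c1" -> 1))  // like Struct2(c2, c1): same data, different order
val order = Seq("c1", "c2")

// After alignment, both sides have the (c1, c2) layout and can be concatenated.
val union = alignByName(left, order) ++ alignByName(right, order)
```

Here `union` contains two identical `(c1, c2)` rows, which is the outcome the reporter expects from `unionByName` on the nested structs in the example above.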



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org