Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/02/11 02:31:00 UTC

[jira] [Resolved] (SPARK-22826) [SQL] findWiderTypeForTwo Fails over StructField of Array

     [ https://issues.apache.org/jira/browse/SPARK-22826?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-22826.
----------------------------------
    Resolution: Cannot Reproduce

This appears to have been fixed somewhere along the way. If anyone identifies the fix, please link the relevant JIRA.

> [SQL] findWiderTypeForTwo Fails over StructField of Array
> ---------------------------------------------------------
>
>                 Key: SPARK-22826
>                 URL: https://issues.apache.org/jira/browse/SPARK-22826
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Aleksander Eskilson
>            Priority: Major
>
> The {{findWiderTypeForTwo}} codepath in Catalyst {{TypeCoercion}} fails when applied to two instances of {{StructType}} having the following fields:
> {noformat}
>       StructType(StructField("a", ArrayType(StringType, containsNull=true)) :: Nil),
>       StructType(StructField("a", ArrayType(StringType, containsNull=false)) :: Nil)
> {noformat}
> In {{findTightestCommonType}}, the function attempts to recursively find the tightest common type of the two arrays. These two array types are not equal (one admits null elements and the other does not), but {{findTightestCommonType}} has no match case for {{ArrayType}} (or {{MapType}}), so the [get|https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala#L108] operation on the {{dataType}} of the {{StructField}} throws a {{NoSuchElementException}}.
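A minimal sketch of the reported failure, assuming access to the internal Catalyst API ({{findWiderTypeForTwo}}'s visibility has varied across Spark versions, so the call below may need to live under the {{org.apache.spark.sql.catalyst.analysis}} package):

{noformat}
import org.apache.spark.sql.types._
import org.apache.spark.sql.catalyst.analysis.TypeCoercion

// Two struct types whose "a" fields differ only in array-element nullability.
val left  = StructType(StructField("a", ArrayType(StringType, containsNull = true)) :: Nil)
val right = StructType(StructField("a", ArrayType(StringType, containsNull = false)) :: Nil)

// The expected result is the wider of the two types, i.e. the one with
// containsNull = true. On affected builds this call instead threw
// NoSuchElementException, per the description above.
val widened: Option[DataType] = TypeCoercion.findWiderTypeForTwo(left, right)
println(widened)
// Expected after the fix, roughly:
// Some(StructType(StructField(a,ArrayType(StringType,true),true)))
{noformat}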


