Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2019/05/10 10:48:00 UTC
[jira] [Assigned] (SPARK-27671) Analysis exception thrown when casting from a nested null in a struct
[ https://issues.apache.org/jira/browse/SPARK-27671?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-27671:
------------------------------------
Assignee: Apache Spark
> Analysis exception thrown when casting from a nested null in a struct
> ---------------------------------------------------------------------
>
> Key: SPARK-27671
> URL: https://issues.apache.org/jira/browse/SPARK-27671
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Liang-Chi Hsieh
> Assignee: Apache Spark
> Priority: Major
>
> When there is a null in a nested field of a struct, casting from the struct currently throws an error.
> {code}
> scala> sql("select cast(struct(1, null) as struct<a:int,b:int>)").show
> scala.MatchError: NullType (of class org.apache.spark.sql.types.NullType$)
> at org.apache.spark.sql.catalyst.expressions.Cast.castToInt(Cast.scala:447)
> at org.apache.spark.sql.catalyst.expressions.Cast.cast(Cast.scala:635)
> at org.apache.spark.sql.catalyst.expressions.Cast.$anonfun$castStruct$1(Cast.scala:603)
> {code}
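> A workaround sketch until Cast handles NullType: give the null an explicit type first, so the struct cast only ever sees int to int. The spark-shell snippet below is illustrative, not part of any proposed fix.
> {code}
> scala> // Hedged workaround: cast(null as int) yields a typed null,
> scala> // so Cast.castToInt is never asked to match on NullType.
> scala> sql("select cast(struct(1, cast(null as int)) as struct<a:int,b:int>)").show
> {code}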
> {code}
> scala> sql("select * FROM VALUES (('a', (10, null))), (('b', (10, 50))), (('c', null)) AS tab(x, y)").show
> org.apache.spark.sql.AnalysisException: failed to evaluate expression named_struct('col1', 10, 'col2', NULL): NullType (of class org.apache.spark.sql.types.NullType$); line 1 pos 14
> at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:47)
> at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables.$anonfun$convert$6(ResolveInlineTables.scala:106)
> {code}
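> The same workaround sketch should apply to the inline table: typing each null lets ResolveInlineTables evaluate the rows (this assumes the default struct field names col1/col2 for the nested tuple).
> {code}
> scala> // Hedged workaround: replace each bare NULL with a typed null,
> scala> // including the whole-struct null in the third row.
> scala> sql("select * FROM VALUES (('a', (10, cast(null as int)))), (('b', (10, 50))), (('c', cast(null as struct<col1:int,col2:int>))) AS tab(x, y)").show
> {code}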
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org