Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2018/04/04 00:28:00 UTC

[jira] [Updated] (SPARK-23802) PropagateEmptyRelation can leave query plan in unresolved state

     [ https://issues.apache.org/jira/browse/SPARK-23802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-23802:
----------------------------
    Fix Version/s: 2.4.0

> PropagateEmptyRelation can leave query plan in unresolved state
> ---------------------------------------------------------------
>
>                 Key: SPARK-23802
>                 URL: https://issues.apache.org/jira/browse/SPARK-23802
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Robert Kruszewski
>            Assignee: Robert Kruszewski
>            Priority: Minor
>             Fix For: 2.3.1, 2.4.0
>
>
> Since [https://github.com/apache/spark/pull/19825], PropagateEmptyRelation handles more cases, and this can leave the optimized query plan in an unresolved state.
> A simple repro is to run the following through the optimizer:
> {code:java}
> LocalRelation.fromExternalRows(Seq('a.int), data = Seq(Row(1)))
>   .join(LocalRelation('a.int, 'b.int), UsingJoin(FullOuter, "a" :: Nil), None){code}
> which produces:
> {code:java}
> Project [coalesce(a#0, null) AS a#7, null AS b#6]
> +- LocalRelation [a#0]{code}
> This then fails the type check on the coalesce expression, since `a` and the null literal have different types.
>  
> A simple, targeted fix is to change PropagateEmptyRelation to add casts around the nulls. A more comprehensive fix would be to run type coercion at the end of optimization so it can repair such cases. Alternatively, the type-checking code could treat NullType as compatible with any other type and not fail the check in the first place.
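
The "typed nulls" idea above can be sketched as follows. This is an illustrative sketch only, not the actual patch; `nullOutput` is a hypothetical helper name, and the real change inside PropagateEmptyRelation may differ:

```scala
import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, Literal}

// Hypothetical helper: when the empty side of a FullOuter join is dropped,
// emit null literals that carry each attribute's data type. An untyped
// Literal(null, NullType) makes Coalesce(a, null) fail its type check,
// whereas Literal(null, a.dataType) gives both arguments matching types.
private def nullOutput(output: Seq[Attribute]): Seq[Alias] =
  output.map { a =>
    // Keep the original name and exprId so references above the
    // rewritten node still resolve.
    Alias(Literal(null, a.dataType), a.name)(exprId = a.exprId)
  }
```

With typed nulls, the repro plan's `coalesce(a#0, null)` would instead be `coalesce(a#0, cast-free null of IntegerType)`, which resolves.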



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org