Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/11/26 02:33:00 UTC
[jira] [Commented] (SPARK-23356) Pushes Project to both sides of Union when expression is non-deterministic
[ https://issues.apache.org/jira/browse/SPARK-23356?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16698427#comment-16698427 ]
Apache Spark commented on SPARK-23356:
--------------------------------------
User 'heary-cao' has created a pull request for this issue:
https://github.com/apache/spark/pull/23138
> Pushes Project to both sides of Union when expression is non-deterministic
> --------------------------------------------------------------------------
>
> Key: SPARK-23356
> URL: https://issues.apache.org/jira/browse/SPARK-23356
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.4.0
> Reporter: caoxuewen
> Priority: Major
>
> Currently, the PushProjectionThroughUnion optimizer rule only pushes a Project operator down to both sides of a Union operator when every expression in the Project is deterministic. In fact, as with filter pushdown, we can also support this when an expression is non-deterministic: the non-deterministic expression stays in a Project above the Union, while a column-pruning Project is pushed into each Union child. This PR fixes that. Now the explain looks like:
> === Applying Rule org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion ===
> Input LogicalPlan:
> Project [a#0, rand(10) AS rnd#9]
> +- Union
> :- LocalRelation <empty>, [a#0, b#1, c#2]
> :- LocalRelation <empty>, [d#3, e#4, f#5]
> +- LocalRelation <empty>, [g#6, h#7, i#8]
> Output LogicalPlan:
> Project [a#0, rand(10) AS rnd#9]
> +- Union
> :- Project [a#0]
> : +- LocalRelation <empty>, [a#0, b#1, c#2]
> :- Project [d#3]
> : +- LocalRelation <empty>, [d#3, e#4, f#5]
> +- Project [g#6]
> +- LocalRelation <empty>, [g#6, h#7, i#8]
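The rewrite above can be sketched in plain Scala on a toy logical-plan model (this is not Spark's actual Catalyst implementation; all names here are hypothetical). The key point: only the attributes referenced by the Project are pushed into each Union child, matched by ordinal position, while the Project containing the non-deterministic rand expression stays on top, so rand is still evaluated once per output row of the Union.

```scala
// Minimal toy logical-plan model (hypothetical, for illustration only).
sealed trait Expr { def deterministic: Boolean }
case class Attr(name: String) extends Expr { val deterministic = true }
case class Rand(seed: Long, alias: String) extends Expr { val deterministic = false }

sealed trait Plan
case class Relation(cols: Seq[String]) extends Plan
case class Union(children: Seq[Plan]) extends Plan
case class Project(exprs: Seq[Expr], child: Plan) extends Plan

// Push a column-pruning Project into each Union child; keep the original
// Project (which may contain non-deterministic expressions) above the Union.
// Assumes leaf Relation children for brevity; Union children are matched
// by ordinal position, as in Spark SQL.
def pushThroughUnion(plan: Plan): Plan = plan match {
  case Project(exprs, Union(children)) =>
    children.head match {
      case Relation(firstCols) =>
        // Ordinal positions of the attributes the Project actually references.
        val positions = exprs.collect { case Attr(n) => firstCols.indexOf(n) }
        val pruned = children.map { case Relation(cols) =>
          Project(positions.map(i => Attr(cols(i))), Relation(cols))
        }
        Project(exprs, Union(pruned))
      case _ => plan
    }
  case other => other
}
```

Applied to the plan from the explain output above, `Project [a, rand(10) AS rnd]` over a three-way Union of `[a,b,c]`, `[d,e,f]`, `[g,h,i]`, the sketch prunes each child down to `[a]`, `[d]`, and `[g]` respectively while the rand-bearing Project remains at the top.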
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org