Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/09/11 10:30:00 UTC

[jira] [Assigned] (SPARK-32856) Prohibit binary comparisons chain

     [ https://issues.apache.org/jira/browse/SPARK-32856?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-32856:
------------------------------------

    Assignee: Apache Spark

> Prohibit binary comparisons chain
> ---------------------------------
>
>                 Key: SPARK-32856
>                 URL: https://issues.apache.org/jira/browse/SPARK-32856
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Yuming Wang
>            Assignee: Apache Spark
>            Priority: Major
>
> {code:scala}
> spark.range(10).selectExpr("id as a", "id as b", "id as c").createTempView("t1")
> spark.range(10).selectExpr("id as a", "id as b", "id as c").createTempView("t2")
> spark.sql("select * from t1 join t2 on t1.a = t2.a = t1.b = t2.b").explain()
> {code}
> {noformat}
> == Physical Plan ==
> BroadcastNestedLoopJoin BuildRight, Inner, (cast((cast((a#2704L = a#2712L) as bigint) = b#2705L) as bigint) = b#2713L)
> :- *(1) Project [id#2702L AS a#2704L, id#2702L AS b#2705L, id#2702L AS c#2706L]
> :  +- *(1) Range (0, 10, step=1, splits=2)
> +- BroadcastExchange IdentityBroadcastMode, [id=#207]
>    +- *(2) Project [id#2710L AS a#2712L, id#2710L AS b#2713L, id#2710L AS c#2714L]
>       +- *(2) Range (0, 10, step=1, splits=2)
> {noformat}
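> Spark accepts the chained condition by implicitly casting the intermediate boolean results (visible in the join condition above), so the query degenerates into a BroadcastNestedLoopJoin over a predicate the user almost certainly did not intend. For comparison, here is a sketch of what the chain presumably meant, written as an explicit conjunction over the same temp views; with real equi-join keys Spark would typically plan an ordinary equi-join (e.g. BroadcastHashJoin) instead:
> {code:scala}
> // Explicit conjunction of the two equality predicates,
> // using the same t1/t2 temp views registered above.
> spark.sql(
>   "select * from t1 join t2 on t1.a = t2.a and t1.b = t2.b"
> ).explain()
> {code}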
> {noformat}
> postgres=# create table t1(a int, b int, c int);
> CREATE TABLE
> postgres=# select * from t1 where a = b =c;
> ERROR:  syntax error at or near "="
> LINE 1: select * from t1 where a = b =c;
>                                      ^
> {noformat}
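> One possible direction, mirroring the PostgreSQL behaviour above, is to reject during analysis any comparison whose operand is itself a comparison. The sketch below is only illustrative: it assumes Catalyst's internal BinaryComparison and Cast expression classes (internal APIs that may change between releases) and a hypothetical checkNoComparisonChain helper; it is not the actual change proposed for this ticket.
> {code:scala}
> import org.apache.spark.sql.AnalysisException
> import org.apache.spark.sql.catalyst.expressions.{BinaryComparison, Cast, Expression}
>
> // Strip any implicit casts the analyzer may have wrapped around an operand.
> def stripCasts(e: Expression): Expression = e match {
>   case c: Cast => stripCasts(c.child)
>   case other   => other
> }
>
> // Hypothetical check: fail if either side of a comparison is itself a
> // comparison, i.e. the user wrote something like `a = b = c`.
> def checkNoComparisonChain(condition: Expression): Unit = {
>   condition.foreach {
>     case cmp: BinaryComparison
>         if stripCasts(cmp.left).isInstanceOf[BinaryComparison] ||
>            stripCasts(cmp.right).isInstanceOf[BinaryComparison] =>
>       throw new AnalysisException(
>         s"Chained binary comparisons are not allowed: ${cmp.sql}")
>     case _ => // other expressions are fine
>   }
> }
> {code}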



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org