Posted to issues@spark.apache.org by "Adrian Ionescu (JIRA)" <ji...@apache.org> on 2018/01/04 17:55:00 UTC

[jira] [Created] (SPARK-22961) Constant columns no longer picked as constraints in 2.3

Adrian Ionescu created SPARK-22961:
--------------------------------------

             Summary: Constant columns no longer picked as constraints in 2.3
                 Key: SPARK-22961
                 URL: https://issues.apache.org/jira/browse/SPARK-22961
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 2.3.0, 3.0.0
            Reporter: Adrian Ionescu


We're no longer picking up {{x = 2}} as a constraint from a constant column introduced by something like {{df.withColumn("x", lit(2))}}.
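
For illustration, here is a minimal end-to-end sketch of the kind of query affected (assuming a local {{SparkSession}} named {{spark}}; names are illustrative):
{code}
import org.apache.spark.sql.functions.lit

val left = spark.range(10).toDF("a")
val right = spark.range(10).toDF("b").withColumn("two", lit(2))

// With constraint inference, the optimizer knows (two <=> 2) on the right
// side and can derive the filter (a <=> 2) on the left side of the join;
// without it, the left side only gets isnotnull(a).
left.join(right, left("a") === right("two")).explain(true)
{code}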

The unit test below succeeds in {{branch-2.2}}:
{code}
test("constraints should be inferred from aliased literals") {
    val originalLeft = testRelation.subquery('left).as("left")
    val optimizedLeft = testRelation.subquery('left).where(IsNotNull('a) && 'a <=> 2).as("left")

    val right = Project(Seq(Literal(2).as("two")), testRelation.subquery('right)).as("right")
    val condition = Some("left.a".attr === "right.two".attr)

    val original = originalLeft.join(right, Inner, condition)
    val correct = optimizedLeft.join(right, Inner, condition)

    comparePlans(Optimize.execute(original.analyze), correct.analyze)
  }
{code}
but fails in {{branch-2.3}} with:
{code}
== FAIL: Plans do not match ===
 'Join Inner, (two#0 = a#0)                     'Join Inner, (two#0 = a#0)
!:- Filter isnotnull(a#0)                       :- Filter ((2 <=> a#0) && isnotnull(a#0))
 :  +- LocalRelation <empty>, [a#0, b#0, c#0]   :  +- LocalRelation <empty>, [a#0, b#0, c#0]
 +- Project [2 AS two#0]                        +- Project [2 AS two#0]
    +- LocalRelation <empty>, [a#0, b#0, c#0]      +- LocalRelation <empty>, [a#0, b#0, c#0] 
{code}
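
For context, the missing constraint is expected to originate from the {{Project}} node's aliased expressions. A paraphrased sketch of that expectation (not the exact catalyst source; the helper name is hypothetical):
{code}
import org.apache.spark.sql.catalyst.expressions.{Alias, EqualNullSafe, Literal, NamedExpression}

// Hypothetical helper: for each aliased literal in a project list, emit the
// null-safe equality constraint between the output attribute and the literal.
def aliasedLiteralConstraints(projectList: Seq[NamedExpression]) =
  projectList.collect {
    case a @ Alias(lit: Literal, _) => EqualNullSafe(a.toAttribute, lit)
  }

// For Project [2 AS two#0], this yields (two#0 <=> 2); combined with the join
// condition (two#0 = a#0), the optimizer should infer (2 <=> a#0) on the left.
val constraints = aliasedLiteralConstraints(Seq(Alias(Literal(2), "two")()))
{code}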


