Posted to issues@spark.apache.org by "Takeshi Yamamuro (JIRA)" <ji...@apache.org> on 2018/04/25 04:29:00 UTC
[jira] [Created] (SPARK-24080) Update the nullability of Filter output based on inferred predicates
Takeshi Yamamuro created SPARK-24080:
----------------------------------------
Summary: Update the nullability of Filter output based on inferred predicates
Key: SPARK-24080
URL: https://issues.apache.org/jira/browse/SPARK-24080
Project: Spark
Issue Type: Improvement
Components: SQL
Affects Versions: 2.3.0
Reporter: Takeshi Yamamuro
In master, a logical `Filter` node does not update the nullability of its output to reflect predicates
that the optimizer rule `InferFiltersFromConstraints` infers, e.g., `IsNotNull`:
{code}
scala> val df = Seq((Some(1), Some(2))).toDF("a", "b")
scala> val filteredDf = df.where("a = 3")
scala> filteredDf.explain
scala> filteredDf.queryExecution.optimizedPlan.children(0)
res4: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan =
Filter (isnotnull(a#2) && (a#2 = 3))
+- LocalRelation [a#2, b#3]
scala> filteredDf.queryExecution.optimizedPlan.children(0).output.map(_.nullable)
res5: Seq[Boolean] = List(true, true)
{code}
However, these `nullable` values should be:
{code}
scala> filteredDf.queryExecution.optimizedPlan.children(0).output.map(_.nullable)
res5: Seq[Boolean] = List(false, true)
{code}
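The intended behavior can be illustrated outside Catalyst with a minimal sketch. The `Attr` case class and `updateNullability` helper below are hypothetical stand-ins for Catalyst's `Attribute` and the proposed `Filter.output` logic; they only show the idea of marking attributes non-nullable once an `IsNotNull` predicate is known for them:

```scala
// Hypothetical, simplified stand-in for Catalyst's Attribute.
case class Attr(name: String, nullable: Boolean)

// Given the attribute names covered by inferred IsNotNull predicates,
// flip those attributes' nullability to false and leave the rest unchanged.
def updateNullability(output: Seq[Attr], nonNull: Set[String]): Seq[Attr] =
  output.map { a =>
    if (nonNull.contains(a.name)) a.copy(nullable = false) else a
  }

val output = Seq(Attr("a", nullable = true), Attr("b", nullable = true))
// `IsNotNull(a)` was inferred from the predicate `a = 3`.
val updated = updateNullability(output, Set("a"))
// updated.map(_.nullable) == Seq(false, true)
```

This mirrors the expected `List(false, true)` above: only the attribute constrained by the filter becomes non-nullable.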
This ticket comes from the previous discussion: https://github.com/apache/spark/pull/18576#pullrequestreview-107585997
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org