Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/07/16 03:54:55 UTC

[GitHub] [spark] maropu commented on a change in pull request #24765: [SPARK-27915][SQL][WIP] Update logical Filter's output nullability based on IsNotNull conditions

maropu commented on a change in pull request #24765: [SPARK-27915][SQL][WIP] Update logical Filter's output nullability based on IsNotNull conditions
URL: https://github.com/apache/spark/pull/24765#discussion_r303719053
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala
 ##########
 @@ -51,7 +51,14 @@ case class Subquery(child: LogicalPlan) extends OrderPreservingUnaryNode {
 
 case class Project(projectList: Seq[NamedExpression], child: LogicalPlan)
     extends OrderPreservingUnaryNode {
-  override def output: Seq[Attribute] = projectList.map(_.toAttribute)
+  override def output: Seq[Attribute] = {
+    // The child operator may have inferred more precise nullability information
+    // for the project expressions, so leverage that information if it's available:
+    val childOutputNullability = child.output.map(a => a.exprId -> a.nullable).toMap
+    projectList
+      .map(_.toAttribute)
+      .map{ a => childOutputNullability.get(a.exprId).map(a.withNullability).getOrElse(a) }
 
 Review comment:
   Do we need to fix this part? It seems `UpdateAttributeNullability` could already handle this case if `Filter.output` reports the tightened nullability correctly?
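
   To illustrate the direction this comment suggests, here is a minimal sketch (not the PR's actual diff): if `Filter.output` itself narrows nullability for attributes guarded by `IsNotNull` conjuncts, the analyzer rule `UpdateAttributeNullability`, which refreshes attribute nullability in parent operators from their children's output, could in principle propagate the tighter nullability into `Project` without changing `Project.output`. The `output` body below is an illustrative assumption, not the implementation in the PR.

   ```scala
   // A minimal sketch, assuming Catalyst's existing helpers; the output override
   // below is illustrative only, not the PR's actual implementation.
   import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression, IsNotNull, PredicateHelper}
   import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, OrderPreservingUnaryNode}

   case class Filter(condition: Expression, child: LogicalPlan)
       extends OrderPreservingUnaryNode with PredicateHelper {

     // Narrow nullability for attributes that the predicate guarantees to be non-null.
     override def output: Seq[Attribute] = {
       // Expression IDs appearing in top-level IsNotNull conjuncts of the condition.
       val nonNullableIds = splitConjunctivePredicates(condition).collect {
         case IsNotNull(a: Attribute) => a.exprId
       }.toSet
       child.output.map { a =>
         if (a.nullable && nonNullableIds.contains(a.exprId)) a.withNullability(false) else a
       }
     }
   }
   ```

   With a `Filter.output` along these lines, the quoted change to `Project.output` may become unnecessary, since `UpdateAttributeNullability` would pick up the non-nullable attributes from the filter's output.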

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org