Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/06/06 14:09:51 UTC

[GitHub] [spark] cloud-fan commented on a diff in pull request #36663: [SPARK-38899][SQL] DS V2 supports push down datetime functions

cloud-fan commented on code in PR #36663:
URL: https://github.com/apache/spark/pull/36663#discussion_r890188771


##########
sql/core/src/main/scala/org/apache/spark/sql/catalyst/util/V2ExpressionBuilder.scala:
##########
@@ -259,6 +259,55 @@ class V2ExpressionBuilder(
       } else {
         None
       }
+    case date: DateAdd =>
+      val childrenExpressions = date.children.flatMap(generateExpression(_))
+      if (childrenExpressions.length == date.children.length) {
+        Some(new GeneralScalarExpression("DATE_ADD", childrenExpressions.toArray[V2Expression]))
+      } else {
+        None
+      }
+    case date: DateDiff =>
+      val childrenExpressions = date.children.flatMap(generateExpression(_))
+      if (childrenExpressions.length == date.children.length) {
+        Some(new GeneralScalarExpression("DATE_DIFF", childrenExpressions.toArray[V2Expression]))
+      } else {
+        None
+      }
+    case date: TruncDate =>
+      val childrenExpressions = date.children.flatMap(generateExpression(_))
+      if (childrenExpressions.length == date.children.length) {
+        Some(new GeneralScalarExpression("TRUNC", childrenExpressions.toArray[V2Expression]))
+      } else {
+        None
+      }
+    case Second(child, _) => generateExpression(child)
+      .map(v => new GeneralScalarExpression("SECOND", Array[V2Expression](v)))
+    case Minute(child, _) => generateExpression(child)
+      .map(v => new GeneralScalarExpression("MINUTE", Array[V2Expression](v)))
+    case Hour(child, _) => generateExpression(child)
+      .map(v => new GeneralScalarExpression("HOUR", Array[V2Expression](v)))
+    case Month(child) => generateExpression(child)
+      .map(v => new GeneralScalarExpression("MONTH", Array[V2Expression](v)))
+    case Quarter(child) => generateExpression(child)
+      .map(v => new GeneralScalarExpression("QUARTER", Array[V2Expression](v)))
+    case Year(child) => generateExpression(child)
+      .map(v => new GeneralScalarExpression("YEAR", Array[V2Expression](v)))
+    // The DAY_OF_WEEK function in Spark returns the day of the week for date/timestamp.
+    // Database dialects do not need to follow ISO semantics when handling DAY_OF_WEEK.

Review Comment:
   hmm, do you mean the Spark behavior is non-standard for the DAY_OF_WEEK function?
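
   For context, a minimal, self-contained sketch (not part of the PR diff), assuming the question is about day-numbering conventions: Spark's built-in dayofweek() numbers days 1 = Sunday through 7 = Saturday, while ISO 8601 counts Monday as day 1, so a database whose native DAY_OF_WEEK follows ISO semantics would return a different value for the same date unless the JDBC dialect compensates.

       // Standalone illustration, not part of the PR under review.
       // Spark's dayofweek(): 1 = Sunday .. 7 = Saturday; weekday(): 0 = Monday .. 6 = Sunday.
       import org.apache.spark.sql.SparkSession

       object DayOfWeekSemantics {
         def main(args: Array[String]): Unit = {
           val spark = SparkSession.builder().master("local[*]").appName("dow-demo").getOrCreate()
           // 2022-06-06 is a Monday: dayofweek -> 2, weekday -> 0, ISO day-of-week -> 1.
           spark.sql(
             "SELECT dayofweek(DATE'2022-06-06') AS dayofweek, weekday(DATE'2022-06-06') AS weekday"
           ).show()
           spark.stop()
         }
       }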



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

