Posted to reviews@spark.apache.org by gatorsmile <gi...@git.apache.org> on 2018/12/02 19:21:43 UTC

[GitHub] spark pull request #23197: [SPARK-26165][Optimizer] Filter Query Date and Ti...

Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23197#discussion_r238110847
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala ---
    @@ -119,14 +121,26 @@ object TypeCoercion {
        * other is a Timestamp by making the target type to be String.
        */
       private def findCommonTypeForBinaryComparison(
    -      dt1: DataType, dt2: DataType, conf: SQLConf): Option[DataType] = (dt1, dt2) match {
    -    // We should cast all relative timestamp/date/string comparison into string comparisons
    -    // This behaves as a user would expect because timestamp strings sort lexicographically.
    -    // i.e. TimeStamp(2013-01-01 00:00 ...) < "2014" = true
    +      left: Expression,
    +      right: Expression,
    +      conf: SQLConf): Option[DataType] = (left.dataType, right.dataType) match {
    +    // We should cast all relative timestamp/date/string comparisons into string comparisons
    +    // only if the particular literal value cannot be converted into a valid Timestamp/Date.
    +    // If the value can be converted into a valid Timestamp/Date, then we cast the right-side
    +    // literal value to 'Timestamp'/'Date'; for more details, refer to the description
    +    // provided in the stringToTimestamp() method.
    --- End diff --
    
    We will not change the existing type coercion rules at this stage. We plan to revisit the whole set of type coercion rules later, following other systems, e.g., PostgreSQL.
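    For context, the rationale quoted in the old comment — that timestamp strings sort lexicographically in the same order as their chronological values, so casting both sides to String preserves comparison semantics — can be sketched in plain Scala. This is an illustrative snippet, not Spark code:

    ```scala
    object TimestampStringOrdering {
      def main(args: Array[String]): Unit = {
        // Zero-padded "yyyy-MM-dd HH:mm:ss" strings compare lexicographically
        // in the same order as the instants they denote.
        val ts = "2013-01-01 00:00:00"

        // A bare year literal compares after any timestamp string from an
        // earlier year, matching the example in the old comment:
        // TimeStamp(2013-01-01 00:00 ...) < "2014" = true
        println(ts < "2014")  // true: '3' < '4' at the fourth character

        // Lexicographic sort equals chronological sort for this format:
        val sorted = Seq("2014-06-01 12:00:00", "2013-01-01 00:00:00").sorted
        println(sorted.head)  // 2013-01-01 00:00:00
      }
    }
    ```

    This property only holds for fully zero-padded timestamp strings in a fixed format; a literal such as "2014" that parses as a valid date/timestamp is what the proposed change would instead cast to Timestamp/Date.
    
    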


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org