Posted to issues@spark.apache.org by "Sujith (JIRA)" <ji...@apache.org> on 2018/11/25 18:37:00 UTC

[jira] [Updated] (SPARK-26165) Date and Timestamp column expression is getting converted to string in less than/greater than filter query even though valid date/timestamp string literal is used in the right side filter expression

     [ https://issues.apache.org/jira/browse/SPARK-26165?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sujith updated SPARK-26165:
---------------------------
    Summary: Date and Timestamp column expression is getting converted to string in less than/greater than filter query even though valid date/timestamp string literal is used in the right side filter expression  (was: Date and Timestamp column is getting converted to string in less than/greater than filter query even though valid date/timestamp string literal is used in the right side filter expression)

> Date and Timestamp column expression is getting converted to string in less than/greater than filter query even though valid date/timestamp string literal is used in the right side filter expression
> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26165
>                 URL: https://issues.apache.org/jira/browse/SPARK-26165
>             Project: Spark
>          Issue Type: Improvement
>          Components: Optimizer
>    Affects Versions: 2.3.2, 2.4.0
>            Reporter: Sujith
>            Priority: Major
>         Attachments: timestamp_filter_perf.PNG
>
>
> Date and Timestamp columns are getting converted to string in less than/greater than filter queries even though a valid date/timestamp string literal is used in the right side filter expression. It is not possible to cast date strings that contain a time, like '2018-03-18 12:39:40', to date; besides, it's not possible to cast a string like '2018-03-18 12:39:40' to a timestamp here, so the column itself is cast to string and the filter is evaluated as a string comparison (see cast(order_creation_date#60 as string) in the plans below).
>  
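> A minimal setup to reproduce (a sketch; the exact table definition is not in this report, the column names and types are assumed from the plans that follow, and the Hive text-table storage matches the LazySimpleSerDe shown there):
> scala> spark.sql("CREATE TABLE orders(username STRING, order_creation_date TIMESTAMP, amount DOUBLE) STORED AS TEXTFILE")
> scala> spark.sql("INSERT INTO orders VALUES ('user1', cast('2017-02-25 10:00:00' as timestamp), 100.0)")
>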
> scala> spark.sql("""explain extended SELECT username FROM orders WHERE order_creation_date > '2017-02-26 13:45:12'""").show(false);
> +-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
> |== Parsed Logical Plan ==
> 'Project ['username]
> +- 'Filter ('order_creation_date > 2017-02-26 13:45:12)
>  +- 'UnresolvedRelation `orders`
> == Analyzed Logical Plan ==
> username: string
> Project [username#59]
> +- Filter (cast(order_creation_date#60 as string) > 2017-02-26 13:45:12)
>  +- SubqueryAlias orders
>  +- HiveTableRelation `default`.`orders`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [username#59, order_creation_date#60, amount#61]
> == Optimized Logical Plan ==
> Project [username#59]
> +- Filter (isnotnull(order_creation_date#60) && (cast(order_creation_date#60 as string) > 2017-02-26 13:45:12))
>  +- HiveTableRelation `default`.`orders`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [username#59, order_creation_date#60, amount#61]
> == Physical Plan ==
> *(1) Project [username#59]
> +- *(1) Filter (isnotnull(order_creation_date#60) && (cast(order_creation_date#60 as string) > 2017-02-26 13:45:12))
>  +- HiveTableScan [order_creation_date#60, username#59], HiveTableRelation `default`.`orders`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [username#59, order_creation
> +--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
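>
> A possible workaround until the coercion behavior is improved (a sketch, not part of the report itself): cast the string literal explicitly so that the comparison stays on the timestamp column instead of casting the column to string.
> scala> spark.sql("""explain extended SELECT username FROM orders WHERE order_creation_date > cast('2017-02-26 13:45:12' as timestamp)""").show(false);
> With the explicit cast the Filter should keep order_creation_date#60 as a timestamp, so the predicate is evaluated in the timestamp domain rather than as a string comparison.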



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org