Posted to issues@spark.apache.org by "Sujith (JIRA)" <ji...@apache.org> on 2018/11/25 18:31:00 UTC
[jira] [Comment Edited] (SPARK-26165) Date and Timestamp column is getting converted to string in less than/greater than filter query even though valid date/timestamp string literal is used in the right side filter expression
[ https://issues.apache.org/jira/browse/SPARK-26165?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16698227#comment-16698227 ]
Sujith edited comment on SPARK-26165 at 11/25/18 6:30 PM:
----------------------------------------------------------
I think we should avoid the cast when the date/timestamp string literal can be converted to a valid date or timestamp; we should convert the filter column to string type only if the string literal in the filter expression cannot be converted to a date/timestamp.
I will raise a PR to handle this issue.
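To illustrate why the cast direction matters, here is a minimal standalone Scala sketch (not Spark code; the values are made up for illustration): once the column is cast to string, the comparison becomes lexicographic, so two renderings of the same instant can compare as unequal.

```scala
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

// Two string renderings of the same instant, differing only in zero-padding.
val asStrings = Seq("2017-02-26 13:45:12", "2017-2-26 13:45:12")

// Lexicographic comparison: "2017-2-..." > "2017-02-..." because '2' > '0',
// even though both strings denote the same timestamp.
assert(asStrings(1) > asStrings(0))

// Parsed as timestamps, the two values are equal.
val f = DateTimeFormatter.ofPattern("yyyy-M-d HH:mm:ss")
val asTimestamps = asStrings.map(LocalDateTime.parse(_, f))
assert(asTimestamps(0) == asTimestamps(1))
```

Besides the correctness risk, the per-row cast of the column prevents the comparison from staying in the native date/timestamp domain.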
was (Author: s71955):
I think we should avoid the cast when the date/timestamp string literal can be converted to a valid date or timestamp; if the string literal value in the filter expression is not a valid date/timestamp, then we can convert the filter column to string type as per the current logic.
I will raise a PR to handle this issue.
> Date and Timestamp column is getting converted to string in less than/greater than filter query even though valid date/timestamp string literal is used in the right side filter expression
> -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-26165
> URL: https://issues.apache.org/jira/browse/SPARK-26165
> Project: Spark
> Issue Type: Improvement
> Components: Optimizer
> Affects Versions: 2.3.2, 2.4.0
> Reporter: Sujith
> Priority: Major
> Attachments: timestamp_filter_perf.PNG
>
>
> Date and Timestamp columns are getting converted to string in less than/greater than filter queries even though a valid date/timestamp string literal, like '2018-03-18 12:39:40', is used in the right side filter expression. Casting the column to string forces a per-row string comparison instead of a date/timestamp comparison, as the plan below shows.
>
> scala> spark.sql("""explain extended SELECT username FROM orders WHERE order_creation_date > '2017-02-26 13:45:12'""").show(false);
> +-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
> |== Parsed Logical Plan ==
> 'Project ['username]
> +- 'Filter ('order_creation_date > 2017-02-26 13:45:12)
> +- 'UnresolvedRelation `orders`
> == Analyzed Logical Plan ==
> username: string
> Project [username#59]
> +- Filter (cast(order_creation_date#60 as string) > 2017-02-26 13:45:12)
> +- SubqueryAlias orders
> +- HiveTableRelation `default`.`orders`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [username#59, order_creation_date#60, amount#61]
> == Optimized Logical Plan ==
> Project [username#59]
> +- Filter (isnotnull(order_creation_date#60) && (cast(order_creation_date#60 as string) > 2017-02-26 13:45:12))
> +- HiveTableRelation `default`.`orders`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [username#59, order_creation_date#60, amount#61]
> == Physical Plan ==
> *(1) Project [username#59]
> +- *(1) Filter (isnotnull(order_creation_date#60) && (cast(order_creation_date#60 as string) > 2017-02-26 13:45:12))
> +- HiveTableScan [order_creation_date#60, username#59], HiveTableRelation `default`.`orders`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [username#59, order_creation
> +--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
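The plan above shows `cast(order_creation_date#60 as string)` on the column side. The fix proposed in the comment can be sketched as follows (a hedged standalone illustration, not the actual Spark analyzer rule): try to interpret the string literal as a timestamp first, and fall back to the existing column-to-string cast only when the literal is invalid.

```scala
import java.sql.Timestamp
import scala.util.Try

// If the literal parses as a timestamp, the comparison can stay in the
// timestamp domain (cast the literal, not the column); otherwise keep the
// current behavior of casting the column to string.
def literalAsTimestamp(s: String): Option[Timestamp] =
  Try(Timestamp.valueOf(s)).toOption

// A valid literal stays in the timestamp domain ...
assert(literalAsTimestamp("2017-02-26 13:45:12").isDefined)
// ... an invalid one triggers the string-cast fallback.
assert(literalAsTimestamp("not-a-timestamp").isEmpty)
```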
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org