Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/09/01 07:31:00 UTC

[jira] [Commented] (SPARK-36604) timestamp type column analyze result is wrong

    [ https://issues.apache.org/jira/browse/SPARK-36604?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17407918#comment-17407918 ] 

Apache Spark commented on SPARK-36604:
--------------------------------------

User 'fhygh' has created a pull request for this issue:
https://github.com/apache/spark/pull/33886

> timestamp type column analyze result is wrong
> ---------------------------------------------
>
>                 Key: SPARK-36604
>                 URL: https://issues.apache.org/jira/browse/SPARK-36604
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.1, 3.1.2
>         Environment: Spark 3.1.1
>            Reporter: YuanGuanhu
>            Priority: Major
>
> When we create a table with a timestamp column, the min and max values in the ANALYZE result for that column are wrong.
> For example:
> {code}
> spark-sql> select * from a;
> 2021-08-15 15:30:01
> Time taken: 2.789 seconds, Fetched 1 row(s)
> spark-sql> desc formatted a a;
> col_name        a
> data_type       timestamp
> comment         NULL
> min             2021-08-15 07:30:01.000000
> max             2021-08-15 07:30:01.000000
> num_nulls       0
> distinct_count  1
> avg_col_len     8
> max_col_len     8
> histogram       NULL
> Time taken: 0.278 seconds, Fetched 10 row(s)
> spark-sql> desc a;
> a timestamp NULL
> Time taken: 1.432 seconds, Fetched 1 row(s)
> {code}
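>
> The reported min and max differ from the SELECT result by exactly eight hours, which is consistent with the statistics being rendered in UTC while SELECT uses the session time zone. A minimal check, assuming the session time zone is the cause (spark.sql.session.timeZone is the relevant config key; forcing it to UTC should make SELECT match the stats output if this assumption holds):
> {code}
> -- inspect the current session time zone
> SET spark.sql.session.timeZone;
> -- force UTC and re-run the comparison; if SELECT now also prints
> -- 07:30:01, the statistics are being rendered in UTC
> SET spark.sql.session.timeZone=UTC;
> select * from a;
> desc formatted a a;
> {code}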
>
> Reproduction steps:
> {code}
> -- create a single-row table with one timestamp column
> create table a(a timestamp);
> insert into a select '2021-08-15 15:30:01';
> analyze table a compute statistics for columns a;
> -- min/max here show 07:30:01 instead of the inserted 15:30:01
> desc formatted a a;
> select * from a;
> {code}
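>
> For reference, the arithmetic behind the observed gap, assuming an Asia/Shanghai (UTC+8) session time zone (an assumption about the reporter's environment, not stated in the issue):
> {code}
> -- 2021-08-15 15:30:01 at UTC+8 is 2021-08-15 07:30:01 in UTC,
> -- which matches the min/max printed by desc formatted
> select to_utc_timestamp(timestamp'2021-08-15 15:30:01', 'Asia/Shanghai');
> {code}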



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org