Posted to issues@spark.apache.org by "Jurriaan Pruis (JIRA)" <ji...@apache.org> on 2019/03/26 14:36:00 UTC

[jira] [Commented] (SPARK-17914) Spark SQL casting to TimestampType with nanosecond results in incorrect timestamp

    [ https://issues.apache.org/jira/browse/SPARK-17914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16801795#comment-16801795 ] 

Jurriaan Pruis commented on SPARK-17914:
----------------------------------------

I'm also seeing this issue in Spark 2.4.0, where the fractional-second part 'overflows' into the rest of the timestamp, as described in the comment above. To me it seems this issue isn't fully resolved yet; a reproduction sketch follows. cc [~ueshin]
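
A hypothetical reproduction in spark-shell (assuming a UTC session time zone; the commented output is the behaviour reported in this ticket, not re-verified here):

    // Hypothetical repro on Spark 2.4.0, assuming spark.sql.session.timeZone=UTC
    spark.sql("SELECT CAST('2016-05-14T15:12:14.0034567Z' AS TIMESTAMP) AS ts")
      .show(false)
    // reported: 2016-05-14 15:12:14.034567 (expected 15:12:14.003456)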

> Spark SQL casting to TimestampType with nanosecond results in incorrect timestamp
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-17914
>                 URL: https://issues.apache.org/jira/browse/SPARK-17914
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>            Reporter: Oksana Romankova
>            Assignee: Anton Okolnychyi
>            Priority: Major
>             Fix For: 2.2.0, 2.3.0
>
>
> In some cases, timestamps containing nanoseconds will be parsed incorrectly.
> Examples: 
> "2016-05-14T15:12:14.0034567Z" -> "2016-05-14 15:12:14.034567"
> "2016-05-14T15:12:14.000345678Z" -> "2016-05-14 15:12:14.345678"
> The issue seems to be happening in DateTimeUtils.stringToTimestamp(), which assumes that only a 6-digit fraction of a second will be passed (see the sketch below).
> With this being the case, I would suggest either discarding the extra nanosecond digits automatically or throwing an exception prompting users to pre-format timestamps to microsecond precision before casting to TimestampType.
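
A minimal sketch (my own illustration, not Spark's actual source) of how accumulating every fractional digit as a plain integer and treating the total as microseconds reproduces the values above; the helper names are hypothetical:

    // Buggy behaviour: every fraction digit is accumulated, and the total
    // is used as microseconds regardless of how many digits were given.
    def buggyFraction(frac: String): Int =
      frac.foldLeft(0)((acc, c) => acc * 10 + (c - '0'))

    buggyFraction("0034567")   // 34567  -> rendered as .034567
    buggyFraction("000345678") // 345678 -> rendered as .345678

    // Truncating to 6 digits (microsecond precision) before accumulating
    // discards sub-microsecond digits instead of shifting the value.
    def fixedFraction(frac: String): Int =
      buggyFraction(frac.take(6))

    fixedFraction("0034567")   // 3456 -> .003456
    fixedFraction("000345678") // 345  -> .000345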


