Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/10/16 09:22:58 UTC

[GitHub] [spark] MaxGekk commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds

URL: https://github.com/apache/spark/pull/26134#issuecomment-542611504
 
 
   > Why are the changes needed?
   > One logical day interval may have a different number of microseconds (daylight saving).
   > For example, in the PST timezone, there will be 25 hours from 2019-11-02 12:00:00 to
   > 2019-11-03 12:00:00.
   
   Yes, the actual duration of a daylight saving day can be 25 hours, but logically every day has 24 hours. Spark's `INTERVAL` type does not define a duration; it is an adjustment of a local date-time. After your changes, it adjusts logical (local) months and days but, for some reason, also the physical time duration of a day. That looks inconsistent, doesn't it?
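   
   To make the distinction concrete, here is a minimal sketch using `java.time` in Scala (an illustration of the two semantics, not Spark's implementation; `America/Los_Angeles` stands in for PST): adding a logical day preserves the local wall-clock time and spans 25 physical hours across the fall-back, while adding a fixed 24-hour duration lands on a different local time.
   ```scala
   import java.time.{Duration, Period, ZonedDateTime, ZoneId}

   val pst = ZoneId.of("America/Los_Angeles")
   val start = ZonedDateTime.of(2019, 11, 2, 12, 0, 0, 0, pst)

   // Logical (calendar) day: local wall-clock time is preserved across the DST change.
   start.plus(Period.ofDays(1))     // 2019-11-03T12:00-08:00, 25 physical hours later

   // Physical duration: exactly 24 hours of real time.
   start.plus(Duration.ofHours(24)) // 2019-11-03T11:00-08:00
   ```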
   
   As you point out, the daylight saving transition in PST happens at `2019-11-03 02:00:00`, when clocks fall back by one hour. So, at a granularity of 1 microsecond, the next timestamp is `2019-11-03 01:00:00.000001`. Moreover, the local timestamp `2019-11-03 02:00:00` will be shown on the local clock twice.
   At the first time:
   ```
   timestamp'2019-11-03 02:00:00' + interval 1 microsecond = timestamp'2019-11-03 01:00:00.000001'
   ```
   At the second time:
   ```
   timestamp'2019-11-03 02:00:00' + interval 1 microsecond = timestamp'2019-11-03 02:00:00.000001'
   ```
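   This overlap is the same ambiguity that `java.time` models explicitly. As a side note, here is a minimal Scala sketch (again an illustration of the semantics, not Spark's resolution) for a local time inside the repeated hour, where each ambiguous local time maps to two instants and a resolver has to pick one:
   ```scala
   import java.time.{Duration, LocalDateTime, ZoneId}

   val pst = ZoneId.of("America/Los_Angeles")

   // 01:30 local time occurs twice on 2019-11-03 in this zone.
   val ambiguous = LocalDateTime.of(2019, 11, 3, 1, 30)

   val first  = ambiguous.atZone(pst)            // earlier offset chosen by default: -07:00
   val second = first.withLaterOffsetAtOverlap() // the later offset: -08:00

   // The two interpretations are one physical hour apart.
   Duration.between(first.toInstant, second.toInstant) // PT1H
   ```
   Any definition of `timestamp + interval` has to pick one of these two instants, which is exactly the ambiguity in question.
   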
   Could you elaborate on how your changes resolve this situation?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org