Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/07/24 07:42:02 UTC

[GitHub] [spark] MaxGekk edited a comment on issue #25230: [SPARK-28471][SQL] Replace `yyyy` by `uuuu` in date-timestamp patterns without era

URL: https://github.com/apache/spark/pull/25230#issuecomment-514516270
 
 
   > is this a breaking change though?
   
   @felixcheung No, negative years are out of the valid range for the `DATE` type: https://github.com/apache/spark/blob/4cb1cd6ab7b7bffd045a786b0ddb7c7783afdf46/sql/catalyst/src/main/scala/org/apache/spark/sql/types/DateType.scala#L27
   
   It could be considered a fix for a correctness issue.
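   
   Some background on why `yyyy` misbehaves for negative years (a minimal sketch with plain `java.time`, not the Spark code path itself; the exact patterns are only illustrative): in `DateTimeFormatter` patterns, `y` is the year-of-era, which is always positive and only unambiguous together with an era field (`G`), while `u` is the signed proleptic year. Without an era in the pattern, formatting a negative year with `yyyy` silently loses the sign:
   ```Scala
   import java.time.LocalDate
   import java.time.format.DateTimeFormatter
   
   // 'y' = year-of-era (needs an era field to be unambiguous), 'u' = proleptic year
   val byEra  = DateTimeFormatter.ofPattern("yyyy-MM-dd")
   val byYear = DateTimeFormatter.ofPattern("uuuu-MM-dd")
   
   val d = LocalDate.of(-99, 1, 1)
   byEra.format(d)  // "0100-01-01": year-of-era of proleptic year -99 is 100 (BCE), the sign is gone
   byYear.format(d) // "-0099-01-01": the proleptic year survives a round trip
   ```
   Parsing "0100-01-01" back with a `yyyy` pattern and no era resolves to year 100 CE under the default resolver, which matches the "Before" output below.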
   
   Before: a date with year -99 was written and then loaded back as year 100. That's incorrect.
   ```Scala
   scala> Seq(java.time.LocalDate.of(-99, 1, 1)).toDF("d").write.mode("overwrite").json("neg_year2")
   scala> spark.read.schema("d date").json("/Users/maxim/tmp/neg_year2").show
   +----------+
   |         d|
   +----------+
   |0100-01-01|
   +----------+
   ```
   After:
   ```Scala
   scala> Seq(java.time.LocalDate.of(-99, 1, 1)).toDF("d").write.mode("overwrite").json("neg_year")
   scala> spark.read.schema("d date").json("neg_year").show
   +----+
   |   d|
   +----+
   |null|
   +----+
   ```
