Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/05 08:57:06 UTC

[GitHub] nandorKollar commented on a change in pull request #23721: [SPARK-26797][SQL][WIP] Start using the new logical types API of Parquet 1.11.0 instead of the deprecated one
URL: https://github.com/apache/spark/pull/23721#discussion_r253776790
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFilters.scala
 ##########
 @@ -100,6 +104,28 @@ private[parquet] class ParquetFilters(
     Binary.fromConstantByteArray(fixedLengthBytes, 0, numBytes)
   }
 
+  private def timestampValue(timestampType: TimestampLogicalTypeAnnotation, v: Any): JLong =
+    if (timestampType.getUnit == TimeUnit.MICROS) {
+      Option(v).map(
 
 Review comment:
   Thanks, I also replaced the if with a match expression. With an if, the code would silently mis-convert NANOS-precision timestamps, which is a bug; I think throwing an exception in that case is the better option.
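   The if-to-match change described above can be sketched as follows. This is a minimal illustration, not the actual PR code: `TimestampUnit` and the conversion factors here are hypothetical stand-ins for Parquet's `LogicalTypeAnnotation.TimeUnit` enum and Spark's microsecond-based internal timestamp representation.

```scala
// Hypothetical stand-in for Parquet's LogicalTypeAnnotation.TimeUnit enum.
sealed trait TimestampUnit
case object Millis extends TimestampUnit
case object Micros extends TimestampUnit
case object Nanos extends TimestampUnit

// Assumes the incoming value `v` is in microseconds (Spark's internal
// timestamp unit) and must be expressed in the column's annotated unit.
def timestampValue(unit: TimestampUnit, v: Long): Long = unit match {
  case Micros => v          // already in microseconds, no conversion needed
  case Millis => v / 1000L  // microseconds -> milliseconds
  case Nanos  =>
    // A plain `if (unit == Micros) ... else ...` would silently route
    // NANOS through the wrong branch and corrupt the value; matching
    // exhaustively lets the code fail loudly instead.
    throw new IllegalArgumentException(s"Unsupported timestamp unit: $unit")
}
```

   The design point is that a two-way `if` cannot distinguish "the other supported unit" from "an unsupported unit", whereas an exhaustive match forces each enum case to be handled explicitly.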

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org