Posted to reviews@spark.apache.org by "LuciferYang (via GitHub)" <gi...@apache.org> on 2023/03/14 05:13:10 UTC

[GitHub] [spark] LuciferYang commented on a diff in pull request #40395: [SPARK-42770][CONNECT] Add `truncatedTo(ChronoUnit.MICROS)` to make `SQLImplicitsTestSuite` in Java 17 daily test GA task pass

LuciferYang commented on code in PR #40395:
URL: https://github.com/apache/spark/pull/40395#discussion_r1134966350


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/SQLImplicitsTestSuite.scala:
##########
@@ -130,9 +132,21 @@ class SQLImplicitsTestSuite extends ConnectFunSuite with BeforeAndAfterAll {
     testImplicit(BigDecimal(decimal))
     testImplicit(Date.valueOf(LocalDate.now()))
     testImplicit(LocalDate.now())
-    testImplicit(LocalDateTime.now())
-    testImplicit(Instant.now())
-    testImplicit(Timestamp.from(Instant.now()))
+    // SPARK-42770: `LocalDateTime.now()` and `Instant.now()` always return microsecond
+    // precision on both Linux and MacOS with Java 8 & 11, but with Java 17 they return
+    // nanosecond precision on Linux while still returning microseconds on MacOS. Spark
+    // always truncates these values to microseconds, which makes this test fail on Linux
+    // with Java 17, so add `truncatedTo(ChronoUnit.MICROS)` when testing on Linux with
+    // Java 17 to ensure the input data has microsecond precision.
+    if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_17) && SystemUtils.IS_OS_LINUX) {

Review Comment:
   @dongjoon-hyun like this? Let me double check the new change on Linux & Java 17
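
For context, a minimal standalone Java sketch of the truncation the diff applies (the epoch second and nanosecond values below are made-up illustrations, not values from the PR's test suite):

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class TruncateDemo {
    public static void main(String[] args) {
        // A hypothetical nanosecond-precision instant, like what
        // Instant.now() can produce on Linux with Java 17.
        Instant nanos = Instant.ofEpochSecond(1_678_000_000L, 123_456_789L);

        // truncatedTo(ChronoUnit.MICROS) zeroes the sub-microsecond digits,
        // matching Spark's microsecond timestamp resolution.
        Instant micros = nanos.truncatedTo(ChronoUnit.MICROS);

        System.out.println(nanos.getNano());  // 123456789
        System.out.println(micros.getNano()); // 123456000
    }
}
```

On Java 8/11 (or on MacOS) the clock already yields microsecond precision, so the truncation is a no-op there; it only changes the value on platforms where the clock resolves nanoseconds.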



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org