Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/03/08 06:19:05 UTC

[GitHub] [spark] MaxGekk commented on pull request #35756: [WIP][SPARK-38437][SQL] Dynamic serialization of Java datetime objects to micros/days

MaxGekk commented on pull request #35756:
URL: https://github.com/apache/spark/pull/35756#issuecomment-1061448283


   > Previously we have compile/analysis-time checks which generate code specifically tailored to either SQL or Java-native types.
   
   @xkrogen This PR doesn't weaken any compile/analysis-time checks. The goal is to improve the user experience with Spark SQL and make it more flexible about user input. Currently, Spark supports two external Java types for Catalyst's timestamp type: `java.sql.Timestamp` and `java.time.Instant`. Users and datasource connectors can use both, but Spark accepts only one of them, based on its config `spark.sql.datetime.java8API.enabled`. Imagine a datasource connector that is re-used with a new Spark version in which the config has already been added and enabled. The datasource is not aware of the new config and still pushes the old `java.sql.Timestamp` objects to Spark, but Spark rejects them even though it can handle them properly.
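   
   To illustrate, here is a minimal, self-contained sketch of the current rejection (my own example, not code from this PR; the object name is made up, and the exact error depends on the conversion path):
   
   ```scala
   import java.sql.Timestamp
   import org.apache.spark.sql.{Row, SparkSession}
   import org.apache.spark.sql.types.{StructField, StructType, TimestampType}
   
   // Hypothetical reproducer: a "connector" that still produces the legacy
   // java.sql.Timestamp while the session expects the Java 8 time API.
   object Java8ApiMismatch {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder()
         .master("local[*]")
         // Tell Spark to use java.time.Instant as the external timestamp type.
         .config("spark.sql.datetime.java8API.enabled", "true")
         .getOrCreate()
   
       val schema = StructType(Seq(StructField("ts", TimestampType)))
       // The legacy connector is unaware of the config and still hands over
       // java.sql.Timestamp values.
       val rows = java.util.Arrays.asList(Row(Timestamp.valueOf("2022-03-08 06:19:05")))
   
       // Before this PR: fails at runtime (e.g. "java.sql.Timestamp is not a
       // valid external type ..."), because the converter chosen up front
       // expects java.time.Instant. After this PR: both types are accepted.
       spark.createDataFrame(rows, schema).show()
   
       spark.stop()
     }
   }
   ```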
   
   > After this PR, we would **relax** that compile-time check and instead perform per-row runtime checks on the object type.
   
   No, it doesn't relax any compile-time checks. We declare that Spark supports both Java types for timestamps, depending on the Spark SQL config. After the PR, Spark will accept both, independently of the config.
   
   cc @cloud-fan 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org