Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2018/12/30 03:10:05 UTC

[GitHub] twdsilva commented on a change in pull request #23400: [SPARK-26499] [SQL] JdbcUtils.getCatalystType maps TINYINT to IntegerType instead of Byte…

URL: https://github.com/apache/spark/pull/23400#discussion_r244523235
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
 ##########
 @@ -239,7 +239,7 @@ object JdbcUtils extends Logging {
       case java.sql.Types.TIMESTAMP     => TimestampType
       case java.sql.Types.TIMESTAMP_WITH_TIMEZONE
                                         => null
-      case java.sql.Types.TINYINT       => IntegerType
+      case java.sql.Types.TINYINT       => ByteType
 
 Review comment:
   @HyukjinKwon  Thank you for reviewing.
   
   The range of ```TINYINT``` varies by database. It is -128 to 127 for MySQL (https://dev.mysql.com/doc/refman/8.0/en/integer-types.html), H2 (http://www.h2database.com/html/datatypes.html#tinyint_type), and Phoenix (https://phoenix.apache.org/language/datatypes.html#tinyint_type), but 0 to 255 for SQL Server (https://docs.microsoft.com/en-us/sql/t-sql/data-types/int-bigint-smallint-and-tinyint-transact-sql?view=sql-server-2017).
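   
   As a quick illustration of why the SQL Server range is awkward for ```ByteType```: a value such as 200 is a valid SQL Server ```TINYINT``` but wraps around when narrowed to a signed byte (the example below is only an illustration of the narrowing, not code from this PR).
   
   ```scala
   // Illustration only: 200 is within SQL Server's TINYINT range (0..255)
   // but outside a signed byte's range (-128..127), so it wraps on narrowing.
   val v = 200
   val asByte: Byte = v.toByte  // -56
   ```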
   
   I think users can implement a custom ```JdbcDialect``` to override the default JDBC-to-Catalyst type mapping when using the JDBC data source (a rough sketch is below). Let me know what you think.
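   
   A minimal sketch of that approach, assuming SQL Server as the target; the object name and the choice of ```ShortType``` are only illustrative:
   
   ```scala
   import java.sql.Types
   
   import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
   import org.apache.spark.sql.types.{DataType, MetadataBuilder, ShortType}
   
   // Hypothetical dialect that widens SQL Server's unsigned TINYINT (0..255)
   // to ShortType instead of relying on the default mapping.
   object SqlServerTinyIntDialect extends JdbcDialect {
     override def canHandle(url: String): Boolean =
       url.startsWith("jdbc:sqlserver")
   
     override def getCatalystType(
         sqlType: Int,
         typeName: String,
         size: Int,
         md: MetadataBuilder): Option[DataType] =
       if (sqlType == Types.TINYINT) Some(ShortType) else None
   }
   
   // Register it before reading via spark.read.jdbc(...).
   JdbcDialects.registerDialect(SqlServerTinyIntDialect)
   ```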
   
   
   

