Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/02/09 00:27:00 UTC

[jira] [Updated] (SPARK-28152) Mapped ShortType to SMALLINT and FloatType to REAL for MsSqlServerDialect

     [ https://issues.apache.org/jira/browse/SPARK-28152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-28152:
----------------------------------
    Labels: release-notes  (was: )

> Mapped ShortType to SMALLINT and FloatType to REAL for MsSqlServerDialect
> -------------------------------------------------------------------------
>
>                 Key: SPARK-28152
>                 URL: https://issues.apache.org/jira/browse/SPARK-28152
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.3, 3.0.0
>            Reporter: Shiv Prashant Sood
>            Assignee: Shiv Prashant Sood
>            Priority: Minor
>              Labels: release-notes
>             Fix For: 3.0.0
>
>
>  ShortType and FloatType are not correctly mapped to the right JDBC types when using the JDBC connector. This results in tables and Spark DataFrames being created with unintended types. The issue was observed when validating against SQL Server.
> Some example issues (see the sketch after this list):
>  * Writing from a DataFrame with a ShortType column results in a SQL table column of type INTEGER as opposed to SMALLINT, and thus a larger table than expected.
>  * Reading that table back results in a DataFrame column of type INTEGER (IntegerType) as opposed to ShortType.
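> A minimal repro sketch of that round trip, assuming a reachable SQL Server instance; the JDBC URL, table name, and credentials below are placeholders:
>
> import org.apache.spark.sql.SparkSession
>
> val spark = SparkSession.builder().appName("SPARK-28152-repro").getOrCreate()
> import spark.implicits._
>
> // Columns are inferred as ShortType and FloatType respectively.
> val df = Seq((1.toShort, 1.5f)).toDF("small_col", "real_col")
>
> val url = "jdbc:sqlserver://host:1433;databaseName=testdb"  // placeholder
>
> // Write path: before the fix the created SQL Server column for
> // "small_col" is INTEGER rather than SMALLINT.
> df.write.format("jdbc")
>   .option("url", url)
>   .option("dbtable", "dbo.spark_28152_repro")
>   .option("user", "user").option("password", "password")
>   .save()
>
> // Read path: before the fix "small_col" comes back as IntegerType
> // and "real_col" (stored as REAL) as DoubleType.
> spark.read.format("jdbc")
>   .option("url", url)
>   .option("dbtable", "dbo.spark_28152_repro")
>   .option("user", "user").option("password", "password")
>   .load()
>   .printSchema()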
> FloatType has an issue on the read path. On the write path the Spark data type 'FloatType' is correctly mapped to the JDBC-equivalent data type 'Real'. But on the read path, when JDBC data types are converted to Catalyst data types (getCatalystType), 'Real' incorrectly gets mapped to 'DoubleType' rather than 'FloatType'.
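> A minimal sketch of what the intended mappings look like when expressed through a custom JdbcDialect; the object name PatchedMsSqlServerDialect and its registration are illustrative only, not the actual patch:
>
> import java.sql.Types
> import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
> import org.apache.spark.sql.types._
>
> object PatchedMsSqlServerDialect extends JdbcDialect {
>   override def canHandle(url: String): Boolean = url.startsWith("jdbc:sqlserver")
>
>   // Write path: Catalyst type -> SQL Server column type.
>   override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
>     case ShortType => Some(JdbcType("SMALLINT", Types.SMALLINT))
>     case FloatType => Some(JdbcType("REAL", Types.REAL))
>     case _         => None  // fall back to the default mappings
>   }
>
>   // Read path: JDBC metadata -> Catalyst type (getCatalystType).
>   override def getCatalystType(
>       sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType] =
>     sqlType match {
>       case Types.SMALLINT => Some(ShortType)
>       case Types.REAL     => Some(FloatType)
>       case _              => None
>     }
> }
>
> JdbcDialects.registerDialect(PatchedMsSqlServerDialect)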
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org