Posted to issues@flink.apache.org by "Lynch Lee (JIRA)" <ji...@apache.org> on 2018/02/03 07:25:00 UTC

[jira] [Created] (FLINK-8551) Should BIGINT in Flink SQL be enhanced to match MySQL's BIGINT

Lynch Lee created FLINK-8551:
--------------------------------

             Summary: Should BIGINT in Flink SQL be enhanced to match MySQL's BIGINT
                 Key: FLINK-8551
                 URL: https://issues.apache.org/jira/browse/FLINK-8551
             Project: Flink
          Issue Type: Improvement
          Components: Table API & SQL
    Affects Versions: 1.4.0
            Reporter: Lynch Lee
             Fix For: 1.4.0


As documented in the MySQL Connector/J type-conversion reference ([https://dev.mysql.com/doc/connector-j/5.1/en/connector-j-reference-type-conversions.html]):

The SQL data type BIGINT in MySQL can be declared UNSIGNED. When we write data into the database with the Java/Scala driver, a BIGINT column accepts values of type Long, or of type BigInteger when the column is UNSIGNED:
||MySQL Type Name||Return value of GetColumnClassName||Returned as Java Class||
|{{BIGINT[(M)] [UNSIGNED]}}|{{BIGINT [UNSIGNED]}}|{{java.lang.Long}}, if UNSIGNED {{java.math.BigInteger}}|
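
For illustration, a minimal JDBC sketch (the table {{t}} and the connection settings are assumed, not part of this issue) showing that Connector/J hands back a BIGINT UNSIGNED value as {{java.math.BigInteger}}, while a plain (signed) BIGINT comes back as {{java.lang.Long}}:

{code:java}
import java.math.BigInteger;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class UnsignedBigintJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Assumed table: CREATE TABLE t (id BIGINT UNSIGNED)
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/test", "user", "password")) {

            // 2^64 - 1 fits into BIGINT UNSIGNED but not into java.lang.Long.
            BigInteger big = new BigInteger("18446744073709551615");

            try (PreparedStatement ps =
                     conn.prepareStatement("INSERT INTO t (id) VALUES (?)")) {
                ps.setObject(1, big);
                ps.executeUpdate();
            }

            try (PreparedStatement ps = conn.prepareStatement("SELECT id FROM t");
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // For BIGINT UNSIGNED, getObject() returns java.math.BigInteger;
                    // for signed BIGINT it would return java.lang.Long.
                    Object id = rs.getObject("id");
                    System.out.println(id.getClass().getName() + " -> " + id);
                }
            }
        }
    }
}
{code}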

Currently, BIGINT in Flink SQL only represents a SIGNED BIGINT; that is to say, it can only accept values of type Long.

All types supported by the Table API are defined in the class org.apache.flink.table.api.Types.
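
For comparison, a minimal sketch of the type information Flink itself offers (constants taken from org.apache.flink.api.common.typeinfo.BasicTypeInfo): BIGINT in the Table API is backed by the signed 64-bit Long type, and although Flink's core type system already has a BigInteger type info, the Table API does not expose it as BIGINT:

{code:java}
import java.math.BigInteger;

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;

public class BigintTypeInfoSketch {
    public static void main(String[] args) {
        // BIGINT in Flink SQL / Table API is backed by a signed 64-bit long.
        TypeInformation<Long> bigint = BasicTypeInfo.LONG_TYPE_INFO;
        System.out.println(bigint.getTypeClass());            // class java.lang.Long

        // Flink's core type system knows a BigInteger type,
        // but the Table API does not map it to BIGINT today.
        TypeInformation<BigInteger> unsignedCandidate = BasicTypeInfo.BIG_INT_TYPE_INFO;
        System.out.println(unsignedCandidate.getTypeClass()); // class java.math.BigInteger
    }
}
{code}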

So we should make BIGINT in Flink SQL match the BIGINT of other SQL engines, like MySQL.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)