Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2019/08/27 13:32:00 UTC

[jira] [Updated] (SPARK-28885) Follow ANSI store assignment rules in table insertion by default

     [ https://issues.apache.org/jira/browse/SPARK-28885?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gengliang Wang updated SPARK-28885:
-----------------------------------
    Description: 
When inserting a value into a column with a different data type, Spark will perform type coercion. Currently, we support 3 policies for the type coercion rules: ANSI, legacy, and strict.
1. ANSI: Spark performs the type coercion as per ANSI SQL. It is consistent with popular DBMSs such as PostgreSQL, MySQL, and Oracle. It disallows unreasonable casts such as converting `string` to `int`.
2. Legacy: Spark allows casting any value to any data type. It is the only behavior in Spark 2.x, and it is compatible with Hive.
3. Strict: Spark doesn't allow any possible precision loss or data truncation in type coercion, e.g. casting `long` to `int`, `double` to `int`, or `decimal` to `double` is not allowed.

This task is to make the ANSI policy the default for table insertion.
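
For illustration, here is a minimal sketch of how the three policies behave for the same insertion. It assumes the `spark.sql.storeAssignmentPolicy` session configuration (with values LEGACY / ANSI / STRICT) introduced by the related sub-tasks; the exact config name and error behavior are assumptions for illustration, not part of this ticket:

    import org.apache.spark.sql.SparkSession

    // Sketch only: assumes spark.sql.storeAssignmentPolicy selects the
    // store assignment policy described above.
    val spark = SparkSession.builder()
      .appName("store-assignment-policy-sketch")
      .master("local[*]")
      .getOrCreate()

    spark.sql("CREATE TABLE t (i INT) USING parquet")

    // Legacy: any value can be cast to any type; a non-numeric string is
    // silently inserted as NULL (Spark 2.x / Hive behavior).
    spark.conf.set("spark.sql.storeAssignmentPolicy", "LEGACY")
    spark.sql("INSERT INTO t VALUES ('abc')")      // succeeds, writes NULL

    // ANSI (the proposed default): `string` to `int` is an unreasonable
    // cast, so the insertion is rejected at analysis time.
    spark.conf.set("spark.sql.storeAssignmentPolicy", "ANSI")
    // spark.sql("INSERT INTO t VALUES ('abc')")   // fails: incompatible types

    // Strict: even a potentially lossy numeric cast such as `long` to `int`
    // is rejected.
    spark.conf.set("spark.sql.storeAssignmentPolicy", "STRICT")
    // spark.sql("INSERT INTO t VALUES (1L)")      // fails: possible truncation

    spark.stop()

Note that this ticket only changes the default; users who rely on the Spark 2.x behavior should still be able to switch the policy back to LEGACY explicitly.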

  was:
As of now, Spark supports 3 policies for type coercion in table insertion:
1. ANSI
2. Legacy
3. Strict


> Follow ANSI store assignment rules in table insertion by default
> ----------------------------------------------------------------
>
>                 Key: SPARK-28885
>                 URL: https://issues.apache.org/jira/browse/SPARK-28885
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Gengliang Wang
>            Priority: Major
>
> When inserting a value into a column with a different data type, Spark will perform type coercion. Currently, we support 3 policies for the type coercion rules: ANSI, legacy, and strict.
> 1. ANSI: Spark performs the type coercion as per ANSI SQL. It is consistent with popular DBMSs such as PostgreSQL, MySQL, and Oracle. It disallows unreasonable casts such as converting `string` to `int`.
> 2. Legacy: Spark allows casting any value to any data type. It is the only behavior in Spark 2.x, and it is compatible with Hive.
> 3. Strict: Spark doesn't allow any possible precision loss or data truncation in type coercion, e.g. casting `long` to `int`, `double` to `int`, or `decimal` to `double` is not allowed.
> This task is to make the ANSI policy the default for table insertion.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org