Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2019/08/22 09:18:00 UTC

[jira] [Updated] (SPARK-28730) Configurable type coercion policy for table insertion

     [ https://issues.apache.org/jira/browse/SPARK-28730?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gengliang Wang updated SPARK-28730:
-----------------------------------
        Parent: SPARK-28589
    Issue Type: Sub-task  (was: Improvement)

> Configurable type coercion policy for table insertion
> -----------------------------------------------------
>
>                 Key: SPARK-28730
>                 URL: https://issues.apache.org/jira/browse/SPARK-28730
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Gengliang Wang
>            Priority: Major
>
> Following the discussion on the dev list (http://apache-spark-developers-list.1001551.n3.nabble.com/Discuss-Follow-ANSI-SQL-on-table-insertion-td27531.html#a27562), I propose making the store assignment rules in the analyzer configurable, with consistent behavior between V1 and V2 table insertion.
> When inserting a value into a column with a different data type, Spark performs type coercion. After this change, two policies are supported for the type coercion rules: legacy and strict.
> 1. With the legacy policy, Spark allows casting any value to any data type, and a null result is returned when the conversion is invalid. The legacy policy is the only behavior in Spark 2.x and it is compatible with Hive.
> 2. With the strict policy, Spark doesn't allow any type coercion with possible precision loss or data truncation, e.g. `long` -> `int` and `double` -> `float` are not allowed (see the sketch after this description).
> To ensure backward compatibility with existing queries, the default store assignment policy is "legacy".
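> A minimal sketch of the two policies from the Scala shell, assuming the policy is exposed as a SQL configuration named `spark.sql.storeAssignmentPolicy` (the config name is not stated in this ticket) and using a hypothetical table `t` with a single INT column:
>
>     // Legacy policy (the default): any value can be cast to the target column type,
>     // and an invalid conversion produces NULL instead of failing the query.
>     spark.conf.set("spark.sql.storeAssignmentPolicy", "legacy")
>     spark.sql("CREATE TABLE t (i INT) USING parquet")
>     spark.sql("INSERT INTO t VALUES ('not a number')")    // writes NULL into column i
>
>     // Strict policy: coercion with possible precision loss or data truncation is
>     // rejected at analysis time, so this insertion fails instead of writing data.
>     spark.conf.set("spark.sql.storeAssignmentPolicy", "strict")
>     spark.sql("INSERT INTO t SELECT CAST(10 AS BIGINT)")  // analysis error: BIGINT cannot be safely written to INT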



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org