Posted to issues@spark.apache.org by "Gengliang Wang (JIRA)" <ji...@apache.org> on 2019/08/14 12:33:00 UTC

[jira] [Created] (SPARK-28730) Configurable type coercion policy for table insertion

Gengliang Wang created SPARK-28730:
--------------------------------------

             Summary: Configurable type coercion policy for table insertion
                 Key: SPARK-28730
                 URL: https://issues.apache.org/jira/browse/SPARK-28730
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Gengliang Wang


Following the discussion on the dev list (http://apache-spark-developers-list.1001551.n3.nabble.com/Discuss-Follow-ANSI-SQL-on-table-insertion-td27531.html#a27562), I propose making the store assignment rules in the analyzer configurable, and making the behavior of V1 and V2 consistent.
When inserting a value into a column of a different data type, Spark performs type coercion. After this PR, two policies are supported for the type coercion rules: legacy and strict (a sketch of both behaviors follows the list).
1. With the legacy policy, Spark allows casting any value to any data type, and a null result is returned when the conversion is invalid. The legacy policy is the only behavior in Spark 2.x and it is compatible with Hive.
2. With the strict policy, Spark does not allow any possible precision loss or data truncation in type coercion, e.g. `long` to `int` or `double` to `float` is not allowed.
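
A minimal sketch of the two behaviors, assuming the policy is exposed as a runtime SQL configuration; the key name spark.sql.storeAssignmentPolicy, the table format, and the exact error reporting are illustrative of the proposal, not confirmed by this ticket:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("store-assignment-policy-sketch")
      .master("local[*]")
      .getOrCreate()

    // Target column is an int.
    spark.sql("CREATE TABLE t (i INT) USING parquet")

    // Legacy: any value may be cast to the column type; an invalid
    // conversion yields null instead of failing the query.
    spark.sql("SET spark.sql.storeAssignmentPolicy = legacy")
    spark.sql("INSERT INTO t VALUES ('not a number')")  // a null row is written

    // Strict: a coercion that could lose precision or truncate data
    // (here bigint -> int) is rejected at analysis time.
    spark.sql("SET spark.sql.storeAssignmentPolicy = strict")
    try {
      spark.sql("INSERT INTO t SELECT CAST(1 AS BIGINT)")
    } catch {
      case e: Exception => println(s"rejected under strict policy: ${e.getMessage}")
    }

    spark.stop()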

To ensure backward compatibility with existing queries, the default store assignment policy is "legacy".
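
For completeness, a hedged sketch of how an application could opt into the stricter checking for a whole session while the default stays legacy (again assuming the illustrative key spark.sql.storeAssignmentPolicy):

    import org.apache.spark.sql.SparkSession

    // Sessions that set nothing keep the backward-compatible legacy behavior;
    // this session opts into strict store assignment for all of its inserts.
    val spark = SparkSession.builder()
      .appName("strict-table-insertion")
      .master("local[*]")
      .config("spark.sql.storeAssignmentPolicy", "strict")
      .getOrCreate()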


