Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2021/08/11 14:45:00 UTC

[jira] [Assigned] (SPARK-35030) ANSI SQL compliance

     [ https://issues.apache.org/jira/browse/SPARK-35030?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gengliang Wang reassigned SPARK-35030:
--------------------------------------

    Assignee: Apache Spark

> ANSI SQL compliance
> -------------------
>
>                 Key: SPARK-35030
>                 URL: https://issues.apache.org/jira/browse/SPARK-35030
>             Project: Spark
>          Issue Type: Epic
>          Components: SQL
>    Affects Versions: 3.0.0, 3.1.1, 3.2.0
>            Reporter: Gengliang Wang
>            Assignee: Apache Spark
>            Priority: Major
>
> Build an ANSI-compliant dialect in Spark for better data quality and easier migration from traditional DBMSs to Spark. For example, Spark will throw an exception at runtime instead of returning null results when the inputs to a SQL operator/function are invalid.
> The new dialect is controlled by the SQL configuration `spark.sql.ansi.enabled`:
> {code:java}
> -- `spark.sql.ansi.enabled=true`
> SELECT 2147483647 + 1;
> java.lang.ArithmeticException: integer overflow
> -- `spark.sql.ansi.enabled=false`
> SELECT 2147483647 + 1;
> +----------------+
> |(2147483647 + 1)|
> +----------------+
> |     -2147483648|
> +----------------+
> {code}
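> The same principle applies to casting invalid inputs; a minimal sketch of the expected behavior (not part of the original example, and the exact exception message varies across Spark versions):
> {code:java}
> -- `spark.sql.ansi.enabled=true`
> SELECT CAST('a' AS INT);
> -- raises a runtime error for the invalid input instead of silently returning NULL
> -- `spark.sql.ansi.enabled=false`
> SELECT CAST('a' AS INT);
> -- returns NULL
> {code}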
> Full details of this dialect are documented at [https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html].
> Note that some ANSI dialect features may not come directly from the ANSI SQL standard, but their behaviors align with the ANSI SQL style.
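> As a usage sketch (assuming an interactive Spark SQL session), the dialect can be toggled at runtime with the standard SET command, since `spark.sql.ansi.enabled` is a runtime SQL configuration:
> {code:java}
> -- Enable ANSI mode for the current session
> SET spark.sql.ansi.enabled=true;
> -- The overflow example above now fails instead of wrapping around:
> SELECT 2147483647 + 1;
> java.lang.ArithmeticException: integer overflow
> {code}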



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
