Posted to commits@spark.apache.org by gu...@apache.org on 2020/02/17 00:56:37 UTC

[spark] branch branch-3.0 updated: [SPARK-30703][SQL][DOCS][FOLLOWUP] Declare the ANSI SQL compliance options as experimental

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new 39a9e41  [SPARK-30703][SQL][DOCS][FOLLOWUP] Declare the ANSI SQL compliance options as experimental
39a9e41 is described below

commit 39a9e41753c6db606de501c53824b74d4927488f
Author: Gengliang Wang <ge...@databricks.com>
AuthorDate: Mon Feb 17 09:54:00 2020 +0900

    [SPARK-30703][SQL][DOCS][FOLLOWUP] Declare the ANSI SQL compliance options as experimental
    
    ### What changes were proposed in this pull request?
    
    This is a follow-up of https://github.com/apache/spark/pull/27489.
    It declares the ANSI SQL compliance options as experimental in the documentation.
    
    ### Why are the changes needed?
    
    The options are experimental, and new features or behavior changes may be introduced in future releases.
    
    ### Does this PR introduce any user-facing change?
    
    No
    
    ### How was this patch tested?
    
    Generated the documentation.
    
    Closes #27590 from gengliangwang/ExperimentalAnsi.
    
    Authored-by: Gengliang Wang <ge...@databricks.com>
    Signed-off-by: HyukjinKwon <gu...@apache.org>
    (cherry picked from commit da2ca85cee3960de7a86a21483de1d77767ca060)
    Signed-off-by: HyukjinKwon <gu...@apache.org>
---
 docs/sql-ref-ansi-compliance.md | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/docs/sql-ref-ansi-compliance.md b/docs/sql-ref-ansi-compliance.md
index d023835..267184a 100644
--- a/docs/sql-ref-ansi-compliance.md
+++ b/docs/sql-ref-ansi-compliance.md
@@ -19,11 +19,13 @@ license: |
   limitations under the License.
 ---
 
-Spark SQL has two options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (See a table below for details).
+Since Spark 3.0, Spark SQL provides two experimental options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (see the table below for details).
+
 When `spark.sql.ansi.enabled` is set to `true`, Spark SQL follows the standard in basic behaviours (e.g., arithmetic operations, type conversion, and SQL parsing).
 Moreover, Spark SQL has an independent option to control implicit casting behaviours when inserting rows in a table.
 The casting behaviours are defined as store assignment rules in the standard.
-When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules.
+
+When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules. This is a separate configuration because its default value is `ANSI`, while `spark.sql.ansi.enabled` is disabled by default.
 
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
@@ -31,7 +33,7 @@ When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with
   <td><code>spark.sql.ansi.enabled</code></td>
   <td>false</td>
   <td>
-    When true, Spark tries to conform to the ANSI SQL specification:
+    (Experimental) When true, Spark tries to conform to the ANSI SQL specification:
     1. Spark will throw a runtime exception if an overflow occurs in any operation on an integral/decimal field.
     2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in the SQL parser.
   </td>
@@ -40,7 +42,7 @@ When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with
   <td><code>spark.sql.storeAssignmentPolicy</code></td>
   <td>ANSI</td>
   <td>
-    When inserting a value into a column with a different data type, Spark will perform type coercion.
+    (Experimental) When inserting a value into a column with a different data type, Spark will perform type coercion.
     Currently, we support 3 policies for the type coercion rules: ANSI, legacy and strict. With ANSI policy,
     Spark performs the type coercion as per ANSI SQL. In practice, the behavior is mostly the same as PostgreSQL.
     It disallows certain unreasonable type conversions such as converting string to int or double to boolean.

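To make the two options above concrete, here is a minimal Scala sketch of what `spark.sql.ansi.enabled` changes at runtime, assuming a local Spark 3.0+ session; the application name and the sample query are illustrative, not taken from the commit:

```scala
import org.apache.spark.sql.SparkSession

object AnsiOverflowDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("ansi-overflow-demo") // hypothetical name, not from the commit
      .getOrCreate()

    // Default behaviour (spark.sql.ansi.enabled=false): integral overflow
    // wraps around silently, following Java semantics.
    spark.conf.set("spark.sql.ansi.enabled", "false")
    spark.sql("SELECT 2147483647 + 1 AS wrapped").show() // -2147483648

    // Experimental ANSI mode: the same overflow raises a runtime exception,
    // typically an ArithmeticException surfaced when the query runs.
    spark.conf.set("spark.sql.ansi.enabled", "true")
    try {
      spark.sql("SELECT 2147483647 + 1 AS overflow").show()
    } catch {
      case e: Exception => println(s"Rejected under ANSI mode: ${e.getMessage}")
    }

    spark.stop()
  }
}
```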
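A companion sketch for `spark.sql.storeAssignmentPolicy`, written for the spark-shell where a SparkSession named `spark` is already in scope; the table name `t` is hypothetical:

```scala
// Default policy in Spark 3.0 is already ANSI; set it explicitly for clarity.
spark.conf.set("spark.sql.storeAssignmentPolicy", "ANSI")
spark.sql("CREATE TABLE t (i INT) USING parquet")

// Under the ANSI policy, string -> int is one of the "unreasonable"
// conversions the store assignment rules disallow, so this insert is
// rejected during analysis.
try {
  spark.sql("INSERT INTO t VALUES ('1')")
} catch {
  case e: Exception => println(s"Rejected under ANSI policy: ${e.getMessage}")
}

// Under the LEGACY policy the same statement is accepted and the value is
// coerced via cast, which can silently produce null for malformed input.
spark.conf.set("spark.sql.storeAssignmentPolicy", "LEGACY")
spark.sql("INSERT INTO t VALUES ('1')")
```

The contrast mirrors the table above: store assignment defaults to the ANSI policy even though `spark.sql.ansi.enabled` as a whole is off by default.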
