Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2022/04/20 10:04:00 UTC

[jira] [Created] (SPARK-38967) Turn spark.sql.ansi.strictIndexOperator into internal config

Gengliang Wang created SPARK-38967:
--------------------------------------

             Summary: Turn spark.sql.ansi.strictIndexOperator into internal config
                 Key: SPARK-38967
                 URL: https://issues.apache.org/jira/browse/SPARK-38967
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 3.3.0
            Reporter: Gengliang Wang
            Assignee: Gengliang Wang


Currently, all ANSI runtime error messages show the hint "If necessary set spark.sql.ansi.enabled to false to bypass this error."

The "map key not exist" and "array index out of bound" errors are the exception: their messages point at the config spark.sql.ansi.strictIndexOperator instead.

This single special case can confuse users. To simplify:
 * Show the configuration spark.sql.ansi.enabled in these error messages as well.
 * For the "map key not exist" error, additionally show a hint to use `try_element_at`; do not show that hint for the array error, because the `[]` operator uses a 0-based index while `try_element_at` uses a 1-based index (see the sketch after this list).
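As a rough illustration of why the `try_element_at` hint only makes sense for maps (the literal values below are invented for the example; behavior follows the Spark SQL function semantics):

```scala
// try_element_at returns NULL instead of raising an error when the key or index is missing,
// so for maps it is a safe replacement for the [] operator under ANSI mode:
spark.sql("SELECT try_element_at(map('a', 1), 'b')").show()      // NULL, no runtime error

// For arrays the two are not drop-in replacements, because the indexing bases differ:
spark.sql("SELECT array(10, 20, 30)[0]").show()                  // 10  ([] is 0-based)
spark.sql("SELECT try_element_at(array(10, 20, 30), 1)").show()  // 10  (try_element_at is 1-based)
```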


