Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2022/04/20 13:44:00 UTC

[jira] [Resolved] (SPARK-38967) Turn spark.sql.ansi.strictIndexOperator into internal config

     [ https://issues.apache.org/jira/browse/SPARK-38967?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-38967.
---------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved by pull request 36282
[https://github.com/apache/spark/pull/36282]

> Turn spark.sql.ansi.strictIndexOperator into internal config
> ------------------------------------------------------------
>
>                 Key: SPARK-38967
>                 URL: https://issues.apache.org/jira/browse/SPARK-38967
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Minor
>             Fix For: 3.3.0
>
>
> Currently, most ANSI error messages show the hint "If necessary set spark.sql.ansi.enabled to false to bypass this error."
> The "map key does not exist" and "array index out of bounds" errors are the exception: they show the config spark.sql.ansi.strictIndexOperator instead.
> This special case can confuse users. To simplify:
>  * Show the configuration spark.sql.ansi.enabled in those messages as well.
>  * For the "map key does not exist" error, also show a hint suggesting "try_element_at". For the array error, do not, because the `[]` operator uses a 0-based index while `try_element_at` uses a 1-based index (see the SQL sketch below).



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org