Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/04/20 10:14:00 UTC
[jira] [Commented] (SPARK-38967) Turn spark.sql.ansi.strictIndexOperator into internal config
[ https://issues.apache.org/jira/browse/SPARK-38967?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17524864#comment-17524864 ]
Apache Spark commented on SPARK-38967:
--------------------------------------
User 'gengliangwang' has created a pull request for this issue:
https://github.com/apache/spark/pull/36282
> Turn spark.sql.ansi.strictIndexOperator into internal config
> ------------------------------------------------------------
>
> Key: SPARK-38967
> URL: https://issues.apache.org/jira/browse/SPARK-38967
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.3.0
> Reporter: Gengliang Wang
> Assignee: Gengliang Wang
> Priority: Minor
>
> Currently, all ANSI error messages show the hint "If necessary set spark.sql.ansi.enabled to false to bypass this error."
> The "map key does not exist" and "array index out of bounds" errors are special: they show the config spark.sql.ansi.strictIndexOperator instead.
> This special case can confuse users. To keep things simple:
> * Show the configuration spark.sql.ansi.enabled instead.
> * For the "map key does not exist" error, also show a hint suggesting "try_element_at". Don't show that hint for arrays, because the `[]` operator uses 0-based indexing while `try_element_at` uses 1-based indexing.
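The indexing mismatch in the last bullet can be illustrated with a short Spark SQL sketch (a non-authoritative illustration, assuming ANSI mode and the built-in try_element_at function available since Spark 3.3; the literal values are made up):

```sql
-- The [] operator on arrays uses 0-based indexing; under ANSI mode an
-- out-of-range index raises a runtime error.
SELECT array(1, 2, 3)[3];                   -- index 3 is out of range (valid: 0..2)

-- try_element_at uses 1-based indexing and returns NULL instead of failing,
-- so it is not a drop-in replacement for [] on arrays.
SELECT try_element_at(array(1, 2, 3), 3);   -- 1-based: selects the third element

-- For maps the two lookups agree on keys, so the hint is safe to show there.
SELECT try_element_at(map('a', 1), 'b');    -- missing key: returns NULL
```

Because of the off-by-one difference, suggesting try_element_at only for the map-key case avoids steering array users toward a function with different index semantics.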
--
This message was sent by Atlassian Jira
(v8.20.7#820007)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org