Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2022/04/17 01:05:00 UTC

[jira] [Reopened] (SPARK-38750) Test the error class: SECOND_FUNCTION_ARGUMENT_NOT_INTEGER

     [ https://issues.apache.org/jira/browse/SPARK-38750?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reopened SPARK-38750:
-----------------------------------
      Assignee:     (was: panbingkun)

This has been reverted.

> Test the error class: SECOND_FUNCTION_ARGUMENT_NOT_INTEGER
> ----------------------------------------------------------
>
>                 Key: SPARK-38750
>                 URL: https://issues.apache.org/jira/browse/SPARK-38750
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Max Gekk
>            Priority: Minor
>              Labels: starter
>             Fix For: 3.4.0
>
>
> Add a test for the error class *SECOND_FUNCTION_ARGUMENT_NOT_INTEGER* to QueryCompilationErrorsSuite. The test should cover the exception thrown in QueryCompilationErrors:
> {code:scala}
>   def secondArgumentOfFunctionIsNotIntegerError(
>       function: String, e: NumberFormatException): Throwable = {
>     // The second argument of '{function}' function needs to be an integer
>     new AnalysisException(
>       errorClass = "SECOND_FUNCTION_ARGUMENT_NOT_INTEGER",
>       messageParameters = Array(function),
>       cause = Some(e))
>   }
> {code}
> For example, here is a test for the error class *UNSUPPORTED_FEATURE*: https://github.com/apache/spark/blob/34e3029a43d2a8241f70f2343be8285cb7f231b9/sql/core/src/test/scala/org/apache/spark/sql/errors/QueryCompilationErrorsSuite.scala#L151-L170
> +The test must check:+
> # the entire error message
> # sqlState if it is defined in the error-classes.json file
> # the error class
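
Below is a minimal sketch of such a test, modeled on the structure of the linked UNSUPPORTED_FEATURE example. It is shown as a standalone suite for completeness; in practice the test body would be added to QueryCompilationErrorsSuite. The triggering query (date_add with a non-integer string as the second argument), the sqlState value, and the exact message text are assumptions that should be verified against error-classes.json and the analyzer's actual behavior:

{code:scala}
package org.apache.spark.sql.errors

import org.apache.spark.sql.{AnalysisException, QueryTest}
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.test.SharedSparkSession

// Hypothetical sketch, not the committed test: the query, sqlState, and
// message text below are assumptions to double-check before use.
class SecondFunctionArgumentNotIntegerSuite extends QueryTest with SharedSparkSession {

  test("SECOND_FUNCTION_ARGUMENT_NOT_INTEGER: non-integer second argument of date_add") {
    // Disable ANSI mode as a precaution: ANSI string-to-int casting may fail
    // on a different error path before this error class is reached.
    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
      val e = intercept[AnalysisException] {
        sql("SELECT date_add('2011-11-11', '1.2')").collect()
      }
      // 1. the error class
      assert(e.getErrorClass === "SECOND_FUNCTION_ARGUMENT_NOT_INTEGER")
      // 2. sqlState, if it is defined for this class in error-classes.json
      assert(e.getSqlState === "22023") // assumed value
      // 3. the entire error message (verify the exact text against the build)
      assert(e.getMessage ===
        "The second argument of 'date_add' function needs to be an integer.")
    }
  }
}
{code}

If the test base classes provide a common error-checking helper, it can replace the manual asserts; either way, the three checks above map directly to the list in the description.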



