Posted to issues@spark.apache.org by "Max Gekk (Jira)" <ji...@apache.org> on 2022/05/04 18:53:00 UTC

[jira] [Created] (SPARK-39098) Test the error class: PIVOT_VALUE_DATA_TYPE_MISMATCH

Max Gekk created SPARK-39098:
--------------------------------

             Summary: Test the error class: PIVOT_VALUE_DATA_TYPE_MISMATCH
                 Key: SPARK-39098
                 URL: https://issues.apache.org/jira/browse/SPARK-39098
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 3.4.0
            Reporter: Max Gekk
            Assignee: panbingkun
             Fix For: 3.4.0


Add tests for the error classes *PIVOT_VALUE_DATA_TYPE_MISMATCH* and *NON_LITERAL_PIVOT_VALUES* to QueryCompilationErrorsSuite. The tests should cover the exceptions thrown in QueryCompilationErrors:

{code:scala}
  def nonLiteralPivotValError(pivotVal: Expression): Throwable = {
    new AnalysisException(
      errorClass = "NON_LITERAL_PIVOT_VALUES",
      messageParameters = Array(pivotVal.toString))
  }

  def pivotValDataTypeMismatchError(pivotVal: Expression, pivotCol: Expression): Throwable = {
    new AnalysisException(
      errorClass = "PIVOT_VALUE_DATA_TYPE_MISMATCH",
      messageParameters = Array(
        pivotVal.toString, pivotVal.dataType.simpleString, pivotCol.dataType.catalogString))
  }
{code}
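
For orientation, here is a hedged sketch (not taken from this ticket) of DataFrame calls that should reach each of the two error paths: a non-foldable pivot value triggers nonLiteralPivotValError, while a foldable value whose type cannot be cast to the pivot column's type triggers pivotValDataTypeMismatchError. The column names, data, and the in-scope {{spark}} session are illustrative assumptions, not part of the ticket.

{code:scala}
// Illustrative only -- assumes a SparkSession `spark` and `import spark.implicits._`.
// Each call below should fail analysis with an AnalysisException carrying the named error class.
import org.apache.spark.sql.functions.{lit, struct, sum}

val df = Seq(("dotNET", 2012, 10000), ("Java", 2012, 20000))
  .toDF("course", "year", "earnings")

// NON_LITERAL_PIVOT_VALUES: a column reference is not a literal pivot value.
df.groupBy($"course").pivot($"year", Seq($"earnings"))
  .agg(sum($"earnings")).collect()

// PIVOT_VALUE_DATA_TYPE_MISMATCH: a struct literal cannot be cast to the int pivot column.
df.groupBy($"course").pivot($"year", Seq(struct(lit("dotNET"), lit("Experts"))))
  .agg(sum($"earnings")).collect()
{code}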

For example, here is a test for the error class *UNSUPPORTED_FEATURE*: https://github.com/apache/spark/blob/34e3029a43d2a8241f70f2343be8285cb7f231b9/sql/core/src/test/scala/org/apache/spark/sql/errors/QueryCompilationErrorsSuite.scala#L151-L170

+The tests must check all of the following (see the sketch after this list):+
# the entire error message
# sqlState if it is defined in the error-classes.json file
# the error class
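
A minimal sketch of that check pattern, assuming the suite extends QueryTest with SharedSparkSession (as QueryCompilationErrorsSuite does) and that the exception exposes the getErrorClass/getSqlState accessors from SparkThrowable. The expected message and sqlState values below are placeholders to be copied from error-classes.json, not real values.

{code:scala}
// Sketch only -- placeholder assertions; the final test should use the suite's
// established helpers if such helpers exist for error classes.
import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.functions.sum

test("NON_LITERAL_PIVOT_VALUES: pivot values must be literal expressions") {
  import testImplicits._
  val df = Seq(("dotNET", 2012, 10000)).toDF("course", "year", "earnings")
  val e = intercept[AnalysisException] {
    df.groupBy($"course").pivot($"year", Seq($"earnings")).agg(sum($"earnings")).collect()
  }
  // (3) the error class
  assert(e.getErrorClass === "NON_LITERAL_PIVOT_VALUES")
  // (1) the entire error message -- copy the exact rendered text from error-classes.json
  // assert(e.getMessage === "...")
  // (2) sqlState, only if one is defined for this error class in error-classes.json
  // assert(e.getSqlState === "...")
}
{code}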


