Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2023/01/30 03:59:00 UTC

[jira] [Assigned] (SPARK-41735) Any SparkThrowable (with an error class) not in error-classes.json is masked in SQLExecution.withNewExecutionId and end-user will see "org.apache.spark.SparkException: [INTERNAL_ERROR]"

     [ https://issues.apache.org/jira/browse/SPARK-41735?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-41735:
------------------------------------

    Assignee:     (was: Apache Spark)

> Any SparkThrowable (with an error class) not in error-classes.json is masked in SQLExecution.withNewExecutionId and end-user will see "org.apache.spark.SparkException: [INTERNAL_ERROR]" 
> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-41735
>                 URL: https://issues.apache.org/jira/browse/SPARK-41735
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Allison Portis
>            Priority: Major
>
> This change [here|https://github.com/apache/spark/pull/38302/files#diff-fdd1e9e26aa1ba9d1cc923ee7c84a1935dcc285502330a471f1ade7f3ad08bf9] means that any error raised during execution is passed to SparkThrowableHelper.getMessage(...). As a result, any SparkThrowable whose error class is not defined in error-classes.json (for example, a connector that uses the Spark error format via ErrorClassesJsonReader) will be masked as 
> {code:java}
> org.apache.spark.SparkException: [INTERNAL_ERROR] Cannot find main error class 'SOME_ERROR_CLASS'{code}
> in SparkThrowableHelper.getMessage, because errorReader.getMessageTemplate(errorClass) fails for any error class it cannot find in error-classes.json.
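The masking described above can be sketched with a minimal, self-contained model. This is not Spark's real implementation; the class and method names deliberately mirror SparkThrowableHelper and errorReader.getMessageTemplate, and the error-class table, the connector error class CONNECTOR_SPECIFIC_ERROR, and the exception type are all hypothetical stand-ins chosen for illustration:

```java
import java.util.Map;

// Hypothetical sketch of the lookup path described in this issue.
// Names mirror Spark's, but the bodies are simplified stand-ins.
public class ErrorClassMasking {
    // Stand-in for error-classes.json: only built-in classes appear here,
    // so a connector's own error class is absent by construction.
    static final Map<String, String> ERROR_CLASSES =
        Map.of("DIVIDE_BY_ZERO", "Division by zero.");

    // Mirrors errorReader.getMessageTemplate(errorClass): throws when the
    // class is not present in the JSON-backed table.
    static String getMessageTemplate(String errorClass) {
        String template = ERROR_CLASSES.get(errorClass);
        if (template == null) {
            throw new IllegalStateException(
                "[INTERNAL_ERROR] Cannot find main error class '"
                    + errorClass + "'");
        }
        return template;
    }

    // Mirrors SparkThrowableHelper.getMessage(...): for an unknown error
    // class the template lookup fails, so the end user sees the
    // INTERNAL_ERROR text instead of the connector's original message.
    static String getMessage(String errorClass) {
        try {
            return "[" + errorClass + "] " + getMessageTemplate(errorClass);
        } catch (IllegalStateException e) {
            return e.getMessage(); // the original error message is lost here
        }
    }

    public static void main(String[] args) {
        // A built-in class resolves normally.
        System.out.println(getMessage("DIVIDE_BY_ZERO"));
        // A connector-defined class not in error-classes.json is masked.
        System.out.println(getMessage("CONNECTOR_SPECIFIC_ERROR"));
    }
}
```

The sketch shows why the failure is confusing for end users: the thrown error carried a perfectly valid error class and message, but the lookup path only consults the built-in error-classes.json, so everything outside it collapses into the same [INTERNAL_ERROR] wrapper.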



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org