Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2023/04/06 05:12:00 UTC

[jira] [Resolved] (SPARK-43041) Restore constructors of exceptions for compatibility in connector API

     [ https://issues.apache.org/jira/browse/SPARK-43041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gengliang Wang resolved SPARK-43041.
------------------------------------
    Fix Version/s: 3.4.0
       Resolution: Fixed

Issue resolved by pull request 40679
[https://github.com/apache/spark/pull/40679]

> Restore constructors of exceptions for compatibility in connector API
> ---------------------------------------------------------------------
>
>                 Key: SPARK-43041
>                 URL: https://issues.apache.org/jira/browse/SPARK-43041
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.4.0
>            Reporter: Xinrong Meng
>            Assignee: Anton Okolnychyi
>            Priority: Blocker
>             Fix For: 3.4.0
>
>
> Thanks [~aokolnychyi] for raising the issue as shown below:
> {quote}
> I have a question about the changes to exceptions used in the public connector API, such as NoSuchTableException and TableAlreadyExistsException.
> I consider those part of the public Catalog API (TableCatalog uses them in method definitions). However, it looks like PR #37887 changed them in an incompatible way: the old constructors accepting Identifier objects were removed, and the only way to construct such exceptions now is by passing either database and table strings or a Scala Seq. Shall we add back the old constructors to avoid breaking connectors?
> {quote}
> We should restore the constructors of those exceptions to preserve compatibility in the connector API, as sketched below.
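>
> A minimal sketch of one way to restore such an overload. This is a simplified, standalone illustration, not the actual patch: the real Spark exceptions extend AnalysisException and use error classes, and the class body and message format here are assumptions. Only the Identifier API (namespace() and name() from org.apache.spark.sql.connector.catalog) is taken as given. The idea is a secondary constructor that accepts an Identifier and delegates to the surviving Seq-based primary constructor:
> {code:scala}
> // Hedged sketch, not the actual Spark class: a simplified exception whose
> // Identifier-based constructor was removed can get it back as a secondary
> // constructor delegating to the surviving Seq[String]-based one.
> import org.apache.spark.sql.connector.catalog.Identifier
>
> class NoSuchTableException(nameParts: Seq[String])
>   extends Exception(s"Table ${nameParts.mkString(".")} not found") {
>
>   // Restored overload: rebuild the name parts from the Identifier so that
>   // existing connector code calling new NoSuchTableException(ident)
>   // compiles and runs unchanged.
>   def this(ident: Identifier) = this(ident.namespace.toSeq :+ ident.name)
> }
> {code}
> Because delegation reuses the existing constructor rather than duplicating its logic, restoring the original signature this way keeps connectors that construct these exceptions directly compiling against both the old and new API shapes.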



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org