Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/17 08:22:02 UTC

[GitHub] [spark] cloud-fan edited a comment on pull request #31541: Revert "[SPARK-34209][SQL] Delegate table name validation to the session catalog"

cloud-fan edited a comment on pull request #31541:
URL: https://github.com/apache/spark/pull/31541#issuecomment-780319947


   Technically, people can replace the session catalog with a custom v2 implementation by setting `spark.sql.catalog.spark_catalog`, but that is not the intended usage:
   ```
     val V2_SESSION_CATALOG_IMPLEMENTATION =
       buildConf(s"spark.sql.catalog.$SESSION_CATALOG_NAME")
         .doc("A catalog implementation that will be used as the v2 interface to Spark's built-in " +
           s"v1 catalog: $SESSION_CATALOG_NAME. This catalog shares its identifier namespace with " +
           s"the $SESSION_CATALOG_NAME and must be consistent with it; for example, if a table can " +
           s"be loaded by the $SESSION_CATALOG_NAME, this catalog must also return the table " +
           s"metadata. To delegate operations to the $SESSION_CATALOG_NAME, implementations can " +
           "extend 'CatalogExtension'.")
         .version("3.0.0")
         .stringConf
         .createOptional
   ```
   
   Users are only expected to extend the session catalog, not to replace it. By "extend", I mean "This catalog shares its identifier namespace with the $SESSION_CATALOG_NAME and must be consistent with it". I believe that's what [Delta](https://docs.delta.io/latest/quick-start.html) does.
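   To make the "extend, not replace" idea concrete, here is a minimal, self-contained sketch of the delegation pattern that `CatalogExtension` implementations follow. The trait and class names below (`SimpleCatalog`, `SessionCatalogStub`, `CatalogExtensionSketch`) are simplified stand-ins invented for illustration, not Spark's real API; the point is only that the extension forwards lookups to the session catalog, so the two stay consistent over the same identifier namespace:
   ```scala
   // Hypothetical stand-in for a catalog interface (not Spark's TableCatalog).
   trait SimpleCatalog {
     def loadTable(name: String): String
   }

   // Stand-in for the built-in v1 session catalog.
   class SessionCatalogStub extends SimpleCatalog {
     override def loadTable(name: String): String = s"v1 metadata for $name"
   }

   // An "extension" shares the session catalog's identifier namespace:
   // any table the session catalog can load, it must also return, which
   // it guarantees here by delegating the lookup instead of replacing it.
   class CatalogExtensionSketch(delegate: SimpleCatalog) extends SimpleCatalog {
     override def loadTable(name: String): String = delegate.loadTable(name)
   }

   object Demo extends App {
     val ext = new CatalogExtensionSketch(new SessionCatalogStub)
     println(ext.loadTable("t1")) // same result the session catalog would give
   }
   ```
   A replacement catalog, by contrast, would answer `loadTable` from its own state and could disagree with the session catalog about which tables exist, which is exactly what the config doc above warns against.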




