Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/17 07:39:47 UTC

[GitHub] [spark] cloud-fan commented on pull request #31553: [WIP][SPARK-34425][SQL] Clarify error message

cloud-fan commented on pull request #31553:
URL: https://github.com/apache/spark/pull/31553#issuecomment-780368362


   As I mentioned in https://github.com/apache/spark/pull/31541#issuecomment-780319947 , people can extend the session catalog with a custom v2 catalog, but it must share the same table/view/function namespace as the session catalog.
   
   Given that, the original check does make sense. When we look up a table/view/function from the extended session catalog and eventually forward the request to the underlying Hive catalog, we should make sure the identifier has exactly 2 parts.
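   For illustration, here is a minimal standalone sketch of that check (simplified stand-in types, not Spark's actual catalog API): the extended session catalog rejects any identifier that is not exactly `database.table` before forwarding the lookup to the underlying Hive catalog.

```java
import java.util.List;

public class TwoPartCheck {
    // Reject identifiers that are not exactly database.table before the
    // request is forwarded to the underlying Hive catalog.
    static void assertTwoPartIdentifier(List<String> nameParts) {
        if (nameParts.size() != 2) {
            throw new IllegalArgumentException(
                "Hive catalog expects a 2-part identifier, got: "
                    + String.join(".", nameParts));
        }
    }

    public static void main(String[] args) {
        assertTwoPartIdentifier(List.of("db", "tbl")); // passes
        try {
            assertTwoPartIdentifier(List.of("ns1", "ns2", "tbl"));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
            // prints: Hive catalog expects a 2-part identifier, got: ns1.ns2.tbl
        }
    }
}
```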
   
   Looking at https://github.com/apache/spark/pull/31427 , it seems like your motivation is to extend the session catalog with additional table/view/function namespaces. Then the original check no longer makes sense, since the session catalog can support arbitrary levels of namespaces. I don't have a problem with that, as long as it still follows the config doc: `if a table can be loaded by the $SESSION_CATALOG_NAME, this catalog must also return the table metadata`.
   
   Following this direction, my remaining concerns are:
   1. we should have a test case for the extended session catalog that supports more than 2 name parts in identifiers.
   2. we should unify the error messages. For views/functions that have more than 2 name parts, we should throw a view/function-not-found exception as well, rather than `Unsupported function name: xxx`
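   To make point 2 concrete, a hypothetical sketch of the unified wording: tables, views, and functions with too many name parts would all surface the same "not found" shape, instead of functions alone getting `Unsupported function name: xxx`. The helper name and message text below are illustrative only, not Spark's actual code.

```java
import java.util.List;

public class NotFoundMessages {
    // One message shape for every object kind that fails to resolve.
    static String notFoundMessage(String kind, List<String> nameParts) {
        return kind + " not found: " + String.join(".", nameParts);
    }

    public static void main(String[] args) {
        // A 3-part function identifier gets the same wording as tables/views.
        System.out.println(notFoundMessage("Function", List.of("db", "ns", "fn")));
        // prints: Function not found: db.ns.fn
        System.out.println(notFoundMessage("View", List.of("db", "ns", "v")));
        // prints: View not found: db.ns.v
    }
}
```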


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org