Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/12/05 00:36:56 UTC

[GitHub] [spark] imback82 commented on a change in pull request #26684: [SPARK-30001][SQL] ResolveRelations should handle both V1 and V2 tables.

imback82 commented on a change in pull request #26684: [SPARK-30001][SQL] ResolveRelations should handle both V1 and V2 tables.
URL: https://github.com/apache/spark/pull/26684#discussion_r354058723
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
 ##########
 @@ -2836,35 +2869,17 @@ class Analyzer(
   }
 
   /**
-   * Performs the lookup of DataSourceV2 Tables. The order of resolution is:
-   *   1. Check if this relation is a temporary table.
-   *   2. Check if it has a catalog identifier. Here we try to load the table.
-   *      If we find the table, return the v2 relation and catalog.
-   *   3. Try resolving the relation using the V2SessionCatalog if that is defined.
-   *      If the V2SessionCatalog returns a V1 table definition,
-   *      return `None` so that we can fallback to the V1 code paths.
-   *      If the V2SessionCatalog returns a V2 table, return the v2 relation and V2SessionCatalog.
+   * Performs the lookup of DataSourceV2 Tables from v2 catalog.
    */
-  private def lookupV2RelationAndCatalog(
-      identifier: Seq[String]): Option[(DataSourceV2Relation, CatalogPlugin, Identifier)] =
+  private def lookupV2Relation(identifier: Seq[String]): Option[DataSourceV2Relation] =
 
 Review comment:
   Good point. Moved.
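
For context, the scaladoc removed in the diff above describes a three-step resolution order for DataSourceV2 tables: temporary views first, then an explicitly named catalog, then the V2SessionCatalog with a fallback to the V1 code paths. The sketch below illustrates that order in isolation. It is not Spark's actual Analyzer code: CatalogPlugin, V2Relation, tempViews, namedCatalogs, and sessionCatalog here are simplified stand-ins introduced only for illustration.

    // Minimal, self-contained Scala sketch of the three-step lookup order
    // described in the removed scaladoc. None of these types are Spark's real
    // classes; they are hypothetical stand-ins for illustration only.
    object LookupOrderSketch {
      sealed trait Table
      case object V1Table extends Table   // stand-in for a v1 table definition
      case object V2Table extends Table   // stand-in for a v2 table definition

      // Stand-in for Spark's CatalogPlugin: just "load a table by identifier".
      trait CatalogPlugin {
        def loadTable(ident: Seq[String]): Option[Table]
      }

      // Stand-in for DataSourceV2Relation plus the catalog/identifier it came from.
      final case class V2Relation(table: Table, catalog: CatalogPlugin, ident: Seq[String])

      def lookupV2RelationAndCatalog(
          identifier: Seq[String],
          tempViews: Set[Seq[String]],
          namedCatalogs: Map[String, CatalogPlugin],
          sessionCatalog: Option[CatalogPlugin]): Option[V2Relation] = {
        if (tempViews.contains(identifier)) {
          // 1. A temporary view shadows any catalog table, so no v2 relation.
          None
        } else if (identifier.nonEmpty && namedCatalogs.contains(identifier.head)) {
          // 2. Identifier is prefixed with a known catalog name: try to load the
          //    table from that catalog and return the relation with its catalog.
          val catalog = namedCatalogs(identifier.head)
          val rest = identifier.tail
          catalog.loadTable(rest).map(t => V2Relation(t, catalog, rest))
        } else {
          // 3. Otherwise try the session catalog. A v1 table definition yields
          //    None so the caller can fall back to the V1 code paths.
          sessionCatalog.flatMap { catalog =>
            catalog.loadTable(identifier) match {
              case Some(V1Table) => None
              case Some(other)   => Some(V2Relation(other, catalog, identifier))
              case None          => None
            }
          }
        }
      }

      def main(args: Array[String]): Unit = {
        val testCatalog = new CatalogPlugin {
          def loadTable(ident: Seq[String]): Option[Table] =
            if (ident == Seq("db", "t")) Some(V2Table) else None
        }
        // Resolves through step 2 and prints the resulting v2 relation.
        println(lookupV2RelationAndCatalog(
          identifier = Seq("testcat", "db", "t"),
          tempViews = Set.empty,
          namedCatalogs = Map("testcat" -> testCatalog),
          sessionCatalog = None))
      }
    }

Per the updated scaladoc in the diff, the renamed lookupV2Relation keeps only the lookup against a v2 catalog; the reply above suggests the remaining handling was moved elsewhere in the PR.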

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org