Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/11/24 14:12:38 UTC

[GitHub] [spark] peter-toth commented on pull request #38640: [WIP][SPARK-41124][SQL][TEST] Add DSv2 PlanStabilitySuites

peter-toth commented on PR #38640:
URL: https://github.com/apache/spark/pull/38640#issuecomment-1326507386

   @cloud-fan, thanks for the details!
   
   In this PR I'm focusing only on the file source v2 implementation. I've found that v1 sources are resolved in `FindDataSourceTable`, but why do we need a new rule for v2? I thought that v2 tables should be resolved in `ResolveRelations`.
   
   Basically, what I was trying to do in the first commit of this PR, https://github.com/apache/spark/pull/38640/commits/efe4311a150c4dbee0c09dec9e409bca76cfdc25 (which should probably be extracted to a separate ticket/PR if it works out), is to:
   - enable v2 file sources in `ResolveSessionCatalog.isV2Provider()` so that v1 commands are not constructed for them
   - modify `V2SessionCatalog.loadTable()` to return `FileTable`s (instead of `V1Table`s) for v2 file sources, so that `ResolveRelations` can resolve v2 file tables
   - modify the `FileTable` implementations to optionally contain a `CatalogTable` and implement `V2TableWithV1Fallback`, so they can provide a v1 fallback where needed (streaming use cases) and build their file index and schema from the catalog table (see the sketch after this list)
   - split `SQLQuerySuite` into v1 and v2 versions to add various v2 tests
   - adjust some of the existing v2 tests
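   
   To make the `V2TableWithV1Fallback` part a bit more concrete, here is a rough, untested sketch of what I have in mind. Only `CatalogTable`, `V2TableWithV1Fallback` and `V2SessionCatalog.loadTable()` are existing Spark pieces; the trait and member names (`FileTableWithV1Fallback`, `catalogTableOpt`) are placeholders:
   
   ```scala
   // This would live next to FileTable, e.g. in
   // org.apache.spark.sql.execution.datasources.v2, since V2TableWithV1Fallback
   // is private[sql].
   package org.apache.spark.sql.execution.datasources.v2
   
   import org.apache.spark.sql.catalyst.catalog.CatalogTable
   import org.apache.spark.sql.connector.catalog.V2TableWithV1Fallback
   
   // Hypothetical mix-in for the FileTable implementations: they would carry the
   // CatalogTable they were loaded from (if any) and expose it as the v1 fallback
   // for code paths (e.g. streaming) that still need a v1 table.
   trait FileTableWithV1Fallback extends V2TableWithV1Fallback {
   
     // Filled in by V2SessionCatalog.loadTable() when the table comes from the
     // session catalog; None for path-based tables.
     def catalogTableOpt: Option[CatalogTable]
   
     override def v1Table: CatalogTable = catalogTableOpt.getOrElse {
       throw new IllegalStateException("No CatalogTable available for the v1 fallback")
     }
   }
   ```
   
   `V2SessionCatalog.loadTable()` would then construct the concrete `FileTable` (with the catalog table attached) for v2 file source providers instead of wrapping the metadata into a `V1Table`.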
   
   Do you think this is a viable solution? Or did I miss something?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

