Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/12/08 18:44:27 UTC

[GitHub] [spark] RussellSpitzer commented on a diff in pull request #38823: [SPARK-41290][SQL] Support GENERATED ALWAYS AS expressions for columns in create/replace table statements

RussellSpitzer commented on code in PR #38823:
URL: https://github.com/apache/spark/pull/38823#discussion_r1043702758


##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableProvider.java:
##########
@@ -93,4 +93,18 @@ default Transform[] inferPartitioning(CaseInsensitiveStringMap options) {
   default boolean supportsExternalMetadata() {
     return false;
   }
+
+  /**
+   * Returns true if the source supports defining generated columns upon table creation in SQL.
+   * When false, any CREATE/REPLACE TABLE statement that defines a generated column in the table
+   * schema will fail with an exception during analysis.
+   *
+   * A generated column is defined with syntax: {@code colName colType GENERATED ALWAYS AS (expr)}
+   * The generation expression is stored in the column metadata with key "generationExpression".
+   *
+   * Override this method to allow defining generated columns in create/replace table statements.
+   */
+  default boolean supportsGeneratedColumnsOnCreation() {
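
A minimal illustration of the column-metadata shape the Javadoc above describes, assuming the metadata key "generationExpression" as stated; the table, column names, and expression below are made up for the example:

```java
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.MetadataBuilder;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class GeneratedColumnSketch {
  public static void main(String[] args) {
    // Hypothetical DDL of the form the Javadoc describes:
    //   CREATE TABLE t (id BIGINT, id_plus_one BIGINT GENERATED ALWAYS AS (id + 1)) USING someSource
    // Per the Javadoc, the generation expression is carried in the column
    // metadata under the key "generationExpression".
    Metadata generated = new MetadataBuilder()
        .putString("generationExpression", "id + 1")
        .build();

    StructType schema = new StructType(new StructField[] {
        DataTypes.createStructField("id", DataTypes.LongType, false),
        DataTypes.createStructField("id_plus_one", DataTypes.LongType, false, generated)
    });

    // Prints the schema JSON, showing the metadata attached to id_plus_one.
    System.out.println(schema.json());
  }
}
```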

Review Comment:
   Sounds like it would need to be more like a CatalogCapability? Supports Creating Tables with Generated Columns? 
   
   Although in general I'm not sure how most other engines would be able to support this without supporting Spark Expressions. Do we have a limit on how much of the Spark API is included here?
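
For concreteness, a rough sketch of the capability-style alternative floated above; the enum, interface, and method names here are illustrative placeholders, not the actual Spark connector API:

```java
import java.util.Collections;
import java.util.Set;

// Illustrative only: a capability enum along the lines suggested in the comment.
enum CatalogCapability {
  /** The catalog can persist CREATE/REPLACE TABLE statements that define generated columns. */
  SUPPORTS_CREATE_TABLE_WITH_GENERATED_COLUMNS
}

// Illustrative only: catalogs opt in by overriding capabilities(); the default advertises nothing.
interface CapabilityAwareCatalog {
  default Set<CatalogCapability> capabilities() {
    return Collections.emptySet();
  }
}

class GeneratedColumnAnalysisCheck {
  // Rough shape of the analysis-time guard the Javadoc describes: reject the
  // statement unless the target catalog advertises the capability.
  static void check(CapabilityAwareCatalog catalog, boolean schemaHasGeneratedColumns) {
    if (schemaHasGeneratedColumns
        && !catalog.capabilities()
            .contains(CatalogCapability.SUPPORTS_CREATE_TABLE_WITH_GENERATED_COLUMNS)) {
      throw new UnsupportedOperationException(
          "This catalog does not support creating tables with generated columns");
    }
  }
}
```

Note that a capability only tells the engine whether to accept the DDL; whether another engine can later evaluate the stored expression is the separate concern raised in the comment above.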



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

