Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/19 11:33:51 UTC

[GitHub] [spark] pan3793 commented on pull request #26167: [SPARK-28893][SQL] Support MERGE INTO in the parser and add the corresponding logical plan

pan3793 commented on pull request #26167:
URL: https://github.com/apache/spark/pull/26167#issuecomment-782019334


   Can we extend the grammar to support a quick INSERT / UPDATE SET via `struct_column.*`?
   ```
   <matched_action>  =
     DELETE  |
     UPDATE SET *  |
     UPDATE SET struct_column.*  |
     UPDATE SET column1 = value1 [, column2 = value2 ...]
   
   <not_matched_action>  =
     INSERT *  |
     INSERT struct_column.* |
     INSERT (column1 [, column2 ...]) VALUES (value1 [, value2 ...])
   ```
   
   My use case: in a CDC scenario I need to keep some fields that are not required by the target table, in order to decide how to handle each record.
   
   ```
   deltaDf
     // op: "i" (insert), "u" (update) or "d" (delete)
     // id: record primary key
     // record: struct whose fields match the targetTable schema
     .select("op", "id", "record")
     .createOrReplaceTempView("delta")

   spark.sql(
     s"""MERGE INTO $targetTable target
        |USING delta
        |   ON target.id = delta.id
        | WHEN MATCHED AND delta.op = 'd' THEN DELETE
        | WHEN MATCHED                    THEN UPDATE SET record.*
        | WHEN NOT MATCHED                THEN INSERT record.*
        |""".stripMargin).collect()
   ```
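   
   For comparison, here is a sketch of what the same MERGE would look like under the grammar proposed in this PR without the `record.*` shorthand, with every target column spelled out by hand (the column names `c1`, `c2`, `c3` are hypothetical stand-ins for the real target schema):
   ```
   // A sketch of the explicit-column form, assuming hypothetical target
   // columns c1, c2, c3 that mirror the fields of the `record` struct.
   spark.sql(
     s"""MERGE INTO $targetTable target
        |USING delta
        |   ON target.id = delta.id
        | WHEN MATCHED AND delta.op = 'd' THEN DELETE
        | WHEN MATCHED THEN UPDATE SET
        |   c1 = delta.record.c1,
        |   c2 = delta.record.c2,
        |   c3 = delta.record.c3
        | WHEN NOT MATCHED THEN INSERT (id, c1, c2, c3)
        |   VALUES (delta.id, delta.record.c1, delta.record.c2, delta.record.c3)
        |""".stripMargin).collect()
   ```
   With a wide target table this explicit form becomes tedious to write and maintain, which is what the `struct_column.*` shorthand is meant to avoid.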
   
   

