Posted to issues@flink.apache.org by "Flink CDC Issue Import (Jira)" <ji...@apache.org> on 2024/03/20 09:20:00 UTC

[jira] [Created] (FLINK-34769) [mysql-cdc] Add upsert changelog mode to avoid UPDATE_BEFORE records push down

Flink CDC Issue Import created FLINK-34769:
----------------------------------------------

             Summary: [mysql-cdc] Add upsert changelog mode to avoid UPDATE_BEFORE records push down
                 Key: FLINK-34769
                 URL: https://issues.apache.org/jira/browse/FLINK-34769
             Project: Flink
          Issue Type: Improvement
          Components: Flink CDC
            Reporter: Flink CDC Issue Import


**Is your feature request related to a problem? Please describe.**
I am trying to use Flink SQL to write MySQL CDC data into Redis as a dimension table for other business use. When an `UPDATE` DML statement is executed, the CDC data is converted into two records, an `UPDATE_BEFORE` (handled as a delete, `-D`) and an `UPDATE_AFTER` (handled as an insert, `+I`), before being written to Redis. Because the key is deleted first and only re-inserted afterwards, other data streams that join against the dimension table can temporarily see a missing row (NULL), which is unacceptable.
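For context, a minimal sketch of such a CDC source follows. The table, columns, and connection settings (`products`, `id`, `name`, `price`, host, credentials) are made up for illustration; only documented mysql-cdc connector options are used. With the default changelog mode, an `UPDATE` on the captured table is emitted as an `UPDATE_BEFORE` record followed by an `UPDATE_AFTER` record, and an upsert-style sink such as Redis maps that pair to a delete followed by an insert.

```sql
-- Hypothetical mysql-cdc source used as a Redis-backed dimension table.
-- With the default changelog mode, one UPDATE on `inventory.products`
-- produces two changelog records: UPDATE_BEFORE then UPDATE_AFTER.
CREATE TABLE products (
  id    INT,
  name  STRING,
  price DECIMAL(10, 2),
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector'     = 'mysql-cdc',
  'hostname'      = 'localhost',
  'port'          = '3306',
  'username'      = 'flinkuser',
  'password'      = '******',
  'database-name' = 'inventory',
  'table-name'    = 'products'
);
```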


**Describe the solution you'd like**
I think we can add support for [upsert changelog mode|https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/concepts/dynamic_tables/#table-to-stream-conversion] by adding a changelogMode option together with a mandatory primary key configuration. Basically, with `changelogMode=upsert` the source would not emit `UPDATE_BEFORE` rows, and a primary key would be required on the table, as sketched below.
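A rough sketch of how the proposed option could look in DDL. The option key `'changelog-mode' = 'upsert'` simply mirrors the `changelogMode=upsert` wording of this request and is an assumption, not an existing mysql-cdc option; the table definition reuses the hypothetical `products` example above.

```sql
-- Sketch only: 'changelog-mode' is the proposed option, not an existing one.
-- In upsert mode the source would emit only INSERT (+I), UPDATE_AFTER (+U),
-- and DELETE (-D) records, so the sink never sees UPDATE_BEFORE.
CREATE TABLE products_upsert (
  id    INT,
  name  STRING,
  price DECIMAL(10, 2),
  PRIMARY KEY (id) NOT ENFORCED  -- mandatory when upsert mode is enabled
) WITH (
  'connector'      = 'mysql-cdc',
  'hostname'       = 'localhost',
  'port'           = '3306',
  'username'       = 'flinkuser',
  'password'       = '******',
  'database-name'  = 'inventory',
  'table-name'     = 'products',
  'changelog-mode' = 'upsert'    -- proposed; default would remain 'all'
);
```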

**Describe alternatives you've considered**
n/a

---------------- Imported from GitHub ----------------
Url: https://github.com/apache/flink-cdc/issues/1898
Created by: [yeezychao|https://github.com/yeezychao]
Labels: enhancement
Created at: Tue Feb 07 11:16:04 CST 2023
State: open



