Posted to users@seatunnel.apache.org by arjun s <ar...@gmail.com> on 2023/06/05 13:07:18 UTC

Batch Insert on Load Data

Hi team,

1. When executing the LOAD DATA command on the source MySQL with a CSV
file containing 1 million records, I noticed that after the command
completes, SeaTunnel performs 1 million individual insert operations to
synchronize the data to the destination MySQL. Is there an option in
SeaTunnel to perform batch inserts instead of individual inserts for
better efficiency?
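For reference, the SeaTunnel JDBC sink exposes batching options that may help here. The snippet below is a minimal sketch, not taken from your attached config; the connection details and the exact values of batch_size are assumptions you would need to adapt:

```
# Hypothetical SeaTunnel sink block (HOCON) showing JDBC batching.
# batch_size controls how many rows are buffered before one batched
# flush to the destination; tune it to your workload.
sink {
  Jdbc {
    url = "jdbc:mysql://destination-host:3306/mydb"   # assumed URL
    driver = "com.mysql.cj.jdbc.Driver"
    user = "user"                                     # assumed credentials
    password = "password"
    query = "INSERT INTO my_table (id, name) VALUES (?, ?)"
    batch_size = 1000   # flush in batches of 1000 rows instead of row-by-row
  }
}
```

Whether batching actually takes effect can also depend on the engine and on checkpointing settings, so it is worth verifying the destination's general log after changing this.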

2. In the scenario where the SeaTunnel application goes down during the
bulk load or while performing the insert operations, I encountered a
synchronization mismatch between the source and sink MySQL databases
after restarting the SeaTunnel application. The following error is
reported:

Caused by: io.debezium.DebeziumException: A replica with the same
server_uuid/server_id as this replica has connected to the source; the
first event 'binlog.000003' at 324102180, the last event read from
'./binlog.000003' at 126, the last byte read from './binlog.000003' at
324102180. Error code: 1236; SQLSTATE: HY000.
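This MySQL error (1236) typically means two replication clients connected to the source with the same server id. In SeaTunnel's MySQL-CDC source there is a server-id option for exactly this; the sketch below is an assumed example, not your actual config, and the id values are placeholders:

```
# Hypothetical SeaTunnel MySQL-CDC source block (HOCON).
# server-id must be unique among all clients replicating from this
# MySQL instance (including other SeaTunnel jobs and real replicas).
source {
  MySQL-CDC {
    hostname = "source-host"          # assumed host
    port = 3306
    username = "cdc_user"             # assumed credentials
    password = "password"
    database-names = ["mydb"]
    table-names = ["mydb.my_table"]
    server-id = "5400-5404"           # assumed unique id range for parallel readers
  }
}
```

If the error appears only after a restart, it may be the old connection lingering on the server with the same id; giving the job a distinct server-id (or range) per deployment avoids the collision.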


The configuration file is attached below.