Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/07/07 05:48:55 UTC

[GitHub] [flink] lpn666 commented on pull request #20192: [sql-jdbc]

lpn666 commented on PR #20192:
URL: https://github.com/apache/flink/pull/20192#issuecomment-1177110023

   > ## What is the purpose of the change
   > When I use the sql-jdbc connector to transfer a big table from MySQL to another database, the Flink program loads the entire table into memory. The source table is too big (16 GB), and the TaskManager crashed. What can I do, or could a new option be added to limit the read speed (or batch the data)? (See the sketch after this quote.)
   > 
   > ## Brief change log
   > ## Verifying this change
   > ## Does this pull request potentially affect one of the following parts:
   > ## Documentation
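   
   For reference, the JDBC table connector already exposes partitioned-scan and fetch-size options that can keep the source from materializing the whole table at once. The snippet below is only a minimal sketch of those options; the connection URL, database, table, and column names are hypothetical placeholders, and it is not a change proposed by this PR.
   
   ```java
   import org.apache.flink.table.api.EnvironmentSettings;
   import org.apache.flink.table.api.TableEnvironment;
   
   public class JdbcPartitionedScanExample {
       public static void main(String[] args) {
           TableEnvironment tEnv =
                   TableEnvironment.create(EnvironmentSettings.newInstance().inBatchMode().build());
   
           // Hypothetical source table: 'scan.partition.*' splits the read into
           // parallel range queries on the partition column, and 'scan.fetch-size'
           // hints the JDBC driver to fetch rows in batches instead of buffering
           // the full result set per query.
           tEnv.executeSql(
                   "CREATE TABLE big_source (\n" +
                   "  id BIGINT,\n" +
                   "  payload STRING\n" +
                   ") WITH (\n" +
                   "  'connector' = 'jdbc',\n" +
                   "  'url' = 'jdbc:mysql://localhost:3306/sourcedb',\n" +
                   "  'table-name' = 'big_table',\n" +
                   "  'scan.partition.column' = 'id',\n" +
                   "  'scan.partition.num' = '64',\n" +
                   "  'scan.partition.lower-bound' = '1',\n" +
                   "  'scan.partition.upper-bound' = '100000000',\n" +
                   "  'scan.fetch-size' = '1000'\n" +
                   ")");
   
           // Sink omitted; an INSERT INTO reading from big_source would then run
           // as 64 bounded range scans rather than one full-table query.
       }
   }
   ```
   
   As far as I know, the MySQL driver treats the fetch size only as a hint and may still buffer each range query unless cursor fetching is enabled (for example via useCursorFetch=true in the JDBC URL), so partitioning the scan is usually the more reliable lever for a 16 GB table.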
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org