Posted to commits@camel.apache.org by GitBox <gi...@apache.org> on 2020/07/21 13:01:29 UTC

[GitHub] [camel-kafka-connector] oscerd edited a comment on issue #324: The committed block count cannot exceed the maximum limit of 50,000 blocks.

oscerd edited a comment on issue #324:
URL: https://github.com/apache/camel-kafka-connector/issues/324#issuecomment-661846215


   Try adding
   
   ```
   "camel.bean.aggregate": "#class:org.apache.camel.kafkaconnector.utils.SampleAggregator"
   "camel.beans.aggregation.size": "1000"
   "camel.beans.aggregation.timeout": "5000L"
   ```
   Here the size is the batch size to accumulate before sending to Azure, and the timeout is how long to wait in case there are not enough records to complete the aggregation.
   
   https://github.com/apache/camel-kafka-connector/blob/master/core/src/test/java/org/apache/camel/kafkaconnector/utils/SampleAggregator.java
   
   This will concatenate the records with a space in between.
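   
   For reference, a minimal sketch of what such an aggregation strategy can look like, assuming Camel 3's org.apache.camel.AggregationStrategy interface (the class name below is illustrative, and the linked SampleAggregator may differ in its details):
   
   ```
   import org.apache.camel.AggregationStrategy;
   import org.apache.camel.Exchange;
   
   public class SpaceJoiningAggregator implements AggregationStrategy {
   
       @Override
       public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
           // First record of the batch: nothing to merge yet.
           if (oldExchange == null) {
               return newExchange;
           }
           // Join the accumulated body and the new record body with a space.
           String merged = oldExchange.getIn().getBody(String.class)
                   + " " + newExchange.getIn().getBody(String.class);
           oldExchange.getIn().setBody(merged);
           return oldExchange;
       }
   }
   ```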
   
   You'll need to build the master branch and use the generated connector, since this change has not been released yet.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org