Posted to user@ignite.apache.org by "Abhishek Gupta (BLOOMBERG/ 731 LEX)" <ag...@bloomberg.net> on 2019/09/27 18:29:59 UTC

Throttling/ Queue length for DataStreamers

Hello,
     I'm using datastreamers to ingest large amounts of data in batches. So the load on the grid is pretty spiky Some time I'm seeing pretty heavy GCing and that causes the ingestion to slow down on the grid, but the client continues to pump data which makes the GC pauses worse because I suspect that the queues on the grid keep bloating with requests and it really gets into a death spiral sometimes. It seems like having some throttling will help with these scenarios. Two questions - 


1. Is there a way to see the length of the message queue building up for datastreamers?
2. Is there a way to throttle this, i.e. set a max queue size or otherwise slow down the data-streaming clients?
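[For question 2, IgniteDataStreamer itself exposes two tuning knobs that act as client-side backpressure: perNodeBufferSize() controls how many entries are batched per request, and perNodeParallelOperations() caps how many batches may be in flight to each node at once; once the cap is hit, addData() blocks. A minimal sketch, assuming a cache named "myCache" and the specific values shown, which are illustrative, not recommendations:]

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class StreamerThrottleExample {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start();
             IgniteDataStreamer<Long, String> stmr = ignite.dataStreamer("myCache")) {
            // Smaller per-node buffers mean smaller batches per request.
            stmr.perNodeBufferSize(512);

            // Cap in-flight batches per node; addData() blocks once this
            // many batches are outstanding, so a GC-bound server node
            // naturally slows the client down instead of queueing forever.
            stmr.perNodeParallelOperations(4);

            for (long i = 0; i < 1_000_000; i++)
                stmr.addData(i, "value-" + i);
        } // close() flushes remaining data before returning.
    }
}
```

[Lowering perNodeParallelOperations is the closest thing to a max queue size on the client side; it bounds how much unacknowledged work the streamer will keep outstanding per node.]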

Thanks,
Abhishek


Re: Throttling/ Queue length for DataStreamers

Posted by Ilya Kasnacheev <il...@gmail.com>.
Hello!

You can invoke flush() periodically; it will wait for all buffered data to be
processed by the server nodes before returning.
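[As a concrete illustration of this suggestion, a periodic-flush loop might look like the sketch below; the cache name "myCache" and the 100,000-record flush interval are hypothetical and should be tuned for the actual grid:]

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class StreamerFlushExample {
    static final long FLUSH_INTERVAL = 100_000; // records between flushes

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start();
             IgniteDataStreamer<Long, String> stmr = ignite.dataStreamer("myCache")) {
            for (long i = 0; i < 1_000_000; i++) {
                stmr.addData(i, "value-" + i);

                // Block until the servers have drained everything buffered
                // so far. This keeps the client from outrunning a grid that
                // is busy with GC, at the cost of periodic stalls.
                if ((i + 1) % FLUSH_INTERVAL == 0)
                    stmr.flush();
            }
        } // close() performs a final flush.
    }
}
```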

Regards,
-- 
Ilya Kasnacheev


Fri, 27 Sep 2019 at 21:30, Abhishek Gupta (BLOOMBERG/ 731 LEX) <
agupta726@bloomberg.net>:

> Hello,
> I'm using datastreamers to ingest large amounts of data in batches, so the
> load on the grid is pretty spiky. At times I'm seeing pretty heavy GC
> activity, which causes ingestion on the grid to slow down, but the client
> continues to pump data, which makes the GC pauses worse; I suspect the
> queues on the grid keep bloating with requests and it sometimes gets into
> a death spiral. It seems like some throttling would help in these
> scenarios. Two questions -
>
>
> 1. Is there a way to see the length of the message queue building up for
> datastreamers?
> 2. Is there a way to throttle this, i.e. set a max queue size or otherwise
> slow down the data-streaming clients?
>
> Thanks,
> Abhishek
>
>