Posted to user@spark.apache.org by anshu shukla <an...@gmail.com> on 2015/07/25 11:59:08 UTC

ReceiverStream SPARK not able to cope with 20,000 events/sec.

My eventGen emits 20,000 events/sec, and I am using store(s1) in the
receive() method to push data to the receiverStream.

This logic works fine up to 4,000 events/sec, but at higher rates no
batches are emitted.

CODE: TOPOLOGY -

JavaDStream<String> sourcestream = ssc.receiverStream(
        new TetcCustomEventReceiver(datafilename, spoutlog,
                argumentClass.getScalingFactor(), datasetType));

CODE: TetcCustomEventReceiver -

public void receive(List<String> event) {
    StringBuffer tuple = new StringBuffer();
    msgId++;
    for (String s : event) {
        tuple.append(s).append(",");
    }
    String s1 = MsgIdAddandRemove.addMessageId(tuple.toString(), msgId);
    store(s1);
}
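A side note on the receive() body above: StringBuffer synchronizes every append, which is wasted work when only the receiver thread writes to it. The sketch below isolates the tuple-building step from Spark so it can be tested and benchmarked on its own, using StringBuilder instead. TupleBuilder and buildTuple are illustrative names, and the "fields,msgId" output format is an assumption, since the MsgIdAddandRemove.addMessageId implementation is not shown in the thread.

```java
import java.util.Arrays;
import java.util.List;

// Spark-free sketch of the tuple-building step inside receive().
public class TupleBuilder {

    // Joins the event fields with commas and appends a message id,
    // mirroring what receive() does with StringBuffer + addMessageId.
    public static String buildTuple(List<String> event, long msgId) {
        // StringBuilder skips StringBuffer's per-call synchronization,
        // which matters at tens of thousands of calls per second.
        StringBuilder tuple = new StringBuilder();
        for (String s : event) {
            tuple.append(s).append(",");
        }
        // Assumed id format; the real addMessageId may differ.
        return tuple.append(msgId).toString();
    }

    public static void main(String[] args) {
        // prints "a,b,c,42"
        System.out.println(buildTuple(Arrays.asList("a", "b", "c"), 42L));
    }
}
```

Isolating this step also makes it easy to micro-benchmark the string handling independently of the Spark receiver machinery.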




-- 
Thanks & Regards,
Anshu Shukla

Re: ReceiverStream SPARK not able to cope with 20,000 events/sec.

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
You need to find the bottleneck here. It could be your network (if the
data is huge), or your producer code may not actually be pushing at
20k/s. If you can produce at 20k/s, then make sure you can also receive
at that rate (try it without Spark).
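One way to follow this advice is a minimal standalone probe that measures the producer side with no Spark involved. In the sketch below, generateEvent is a hypothetical stand-in for the real eventGen (which is not shown in the thread), and RateProbe/measureRate are illustrative names.

```java
// Standalone probe (no Spark): how many events/sec can the producer
// side actually emit on this machine?
public class RateProbe {

    // Stand-in for the real event generator; replace with eventGen logic.
    public static String generateEvent(long i) {
        return "event-" + i;  // placeholder payload
    }

    // Emits n events in a tight loop and returns the achieved rate
    // in events per second.
    public static double measureRate(long n) {
        long start = System.nanoTime();
        long bytes = 0;
        for (long i = 0; i < n; i++) {
            // Consume the result so the JIT cannot eliminate the work.
            bytes += generateEvent(i).length();
        }
        double seconds = (System.nanoTime() - start) / 1e9;
        if (bytes < 0) throw new IllegalStateException();  // keep bytes live
        return n / seconds;
    }

    public static void main(String[] args) {
        System.out.printf("~%.0f events/sec%n", measureRate(1_000_000));
    }
}
```

If this probe shows well above 20k/s, the generator is not the bottleneck and the problem is more likely on the receive/store side or the network.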

Thanks
Best Regards

On Sat, Jul 25, 2015 at 3:29 PM, anshu shukla <an...@gmail.com>
wrote:

> My eventGen emits 20,000 events/sec, and I am using store(s1) in the receive() method to push data to the receiverStream.
>
> This logic works fine up to 4,000 events/sec, but at higher rates no batches are emitted.
>
> CODE: TOPOLOGY -
>
> JavaDStream<String> sourcestream = ssc.receiverStream(
>         new TetcCustomEventReceiver(datafilename, spoutlog,
>                 argumentClass.getScalingFactor(), datasetType));
>
> CODE: TetcCustomEventReceiver -
>
> public void receive(List<String> event) {
>     StringBuffer tuple = new StringBuffer();
>     msgId++;
>     for (String s : event) {
>         tuple.append(s).append(",");
>     }
>     String s1 = MsgIdAddandRemove.addMessageId(tuple.toString(), msgId);
>     store(s1);
> }
>
>
>
>
> --
> Thanks & Regards,
> Anshu Shukla
>
