Posted to user@spark.apache.org by Hemalatha A <he...@googlemail.com> on 2016/07/27 05:12:33 UTC

Fail a batch in Spark Streaming forcefully based on business rules

Hello,

I have a use case wherein I have to fail certain batches in my streaming
application, based on my application-specific business rules.
Ex: If in a batch of 2 seconds I don't receive 100 messages, I should fail
the batch and move on.

How to achieve this behavior?
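
To illustrate, a rough sketch of the kind of per-batch check I have in mind
(Scala, DStream API; the socket source, the "messages" name, and the
threshold of 100 are just placeholders for my actual setup):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object FailSmallBatches {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("fail-small-batches").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(2))          // 2-second batches

    val messages = ssc.socketTextStream("localhost", 9999)    // placeholder source

    messages.foreachRDD { rdd =>
      val count = rdd.count()
      if (count < 100) {
        // This is the part I am unsure about: throwing here fails the batch's
        // job, but by default it also stops the whole streaming application
        // instead of just moving on to the next batch.
        throw new RuntimeException(s"Batch rejected: only $count messages")
      }
      // normal per-batch processing would go here
    }

    ssc.start()
    ssc.awaitTermination()
  }
}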

-- 


Regards
Hemalatha

Re: Fail a batch in Spark Streaming forcefully based on business rules

Posted by Lars Albertsson <la...@mapflat.com>.
I don't know your context, so I don't have a solution for you. If you
provide more information, the list might be able to suggest a
solution.

IIUYC, however, it sounds like you could benefit from decoupling
operational failure from business-level failure. E.g. if there is a
failure according to your business rules, keep the job running, but
emit business-level failure records. If records need to be
reprocessed, emit them to another stream topic and reprocess them
from there.
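
For illustration only, a rough sketch of what I mean, assuming a
DStream[String] named "messages"; the 100-record threshold, the BatchFailure
type, and the emitFailureRecord/processBatch placeholders stand in for
whatever your application actually uses:

import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.dstream.DStream

case class BatchFailure(batchTimeMs: Long, reason: String)

object BusinessFailureHandling {
  def handle(messages: DStream[String]): Unit = {
    messages.foreachRDD { (rdd, time) =>
      val count = rdd.count()
      if (count < 100) {
        // Business-rule violation: record it as data and keep the job alive,
        // rather than failing the Spark job itself.
        emitFailureRecord(BatchFailure(time.milliseconds, s"only $count records"))
      } else {
        processBatch(rdd)
      }
    }
  }

  // Placeholder sink: in practice this could write to a separate topic or
  // store, from which failed batches can be inspected or reprocessed.
  def emitFailureRecord(failure: BatchFailure): Unit =
    println(s"BUSINESS-LEVEL FAILURE: $failure")

  // Placeholder for the normal per-batch processing.
  def processBatch(rdd: RDD[String]): Unit =
    rdd.foreach(_ => ())
}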

It is risky to inject system-level failures under normal operations.
An operational failure is normally an anomaly that should be
addressed; if you induce failures, system failures become part of
normal operations, and real failures risk passing unnoticed.

Regards,



Lars Albertsson
Data engineering consultant
www.mapflat.com
https://twitter.com/lalleal
+46 70 7687109
Calendar: https://goo.gl/6FBtlS


On Thu, Jul 28, 2016 at 12:11 PM, Hemalatha A
<he...@googlemail.com> wrote:
>
> Another use case why I need to do this: if Exception A is caught, I should
> just print it and ignore it, but if Exception B occurs, I have to end the
> batch, fail it, and stop processing the batch.
> Is it possible to achieve this? Any hints on this, please.
>
>
> On Wed, Jul 27, 2016 at 10:42 AM, Hemalatha A
> <he...@googlemail.com> wrote:
>>
>> Hello,
>>
>> I have a use case wherein I have to fail certain batches in my streaming
>> application, based on my application-specific business rules.
>> Ex: If in a batch of 2 seconds I don't receive 100 messages, I should fail
>> the batch and move on.
>>
>> How to achieve this behavior?
>>
>> --
>>
>>
>> Regards
>> Hemalatha
>
>
>
>
> --
>
>
> Regards
> Hemalatha

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Fail a batch in Spark Streaming forcefully based on business rules

Posted by Hemalatha A <he...@googlemail.com>.
Another use case why I need to do this: if Exception A is caught, I should
just print it and ignore it, but if Exception B occurs, I have to end the
batch, fail it, and stop processing the batch.
Is it possible to achieve this? Any hints on this, please.
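
Something along these lines is what I am after (rough sketch; ExceptionA,
ExceptionB, and process are stand-ins for my real exception types and
processing logic):

import org.apache.spark.streaming.dstream.DStream

class ExceptionA(msg: String) extends Exception(msg)
class ExceptionB(msg: String) extends Exception(msg)

// Serializable so the per-record closure can be shipped to the executors.
object ExceptionRouting extends Serializable {
  def handle(messages: DStream[String]): Unit = {
    messages.foreachRDD { rdd =>
      rdd.foreach { record =>
        try {
          process(record)
        } catch {
          case e: ExceptionA =>
            // tolerated: print it and keep processing the rest of the batch
            println(s"Ignoring ExceptionA for '$record': ${e.getMessage}")
          case e: ExceptionB =>
            // fatal: rethrow so the task, and after retries the batch job,
            // fails
            throw e
        }
      }
    }
  }

  // Placeholder for the real per-record processing.
  def process(record: String): Unit = ()
}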


On Wed, Jul 27, 2016 at 10:42 AM, Hemalatha A <
hemalatha.amrutha@googlemail.com> wrote:

> Hello,
>
> I have a use case wherein I have to fail certain batches in my
> streaming application, based on my application-specific business rules.
> Ex: If in a batch of 2 seconds I don't receive 100 messages, I should fail
> the batch and move on.
>
> How to achieve this behavior?
>
> --
>
>
> Regards
> Hemalatha
>



-- 


Regards
Hemalatha