Posted to user@spark.apache.org by Igor Makhlin <ig...@gmail.com> on 2018/03/31 10:59:24 UTC
In spark streaming application how to distinguish between normal and
abnormal termination of application?
Hi All,
I'm looking for a way to distinguish between normal and abnormal
termination of a Spark streaming application (with checkpointing enabled).
Adding an application listener doesn't really help, because the
onApplicationEnd event carries no information about the cause of the
termination:
ssc_.sc.addSparkListener(new SparkListener {
  override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd): Unit = {
    // no termination cause is available here
  }
})
I need to manage internal metadata for the streaming application: if the
application has terminated and that termination is not recoverable, I have
to delete the metadata (the state of the stream for this particular
application).
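One way to make this distinction is to rely on awaitTermination() rethrowing the exception that stopped the stream, rather than on the listener. A minimal sketch of the pattern (plain Scala; `awaitBody` below is a stand-in for the blocking `ssc.awaitTermination()` call, and the metadata-deletion step is left to the caller):

```scala
object TerminationCheck {
  // Runs the blocking body (stand-in for ssc.awaitTermination()) and
  // reports whether the shutdown was graceful. On a graceful stop the
  // call returns normally; if the stream died with an error, the
  // exception propagates out and is caught here.
  def endedNormally(awaitBody: () => Unit): Boolean =
    try {
      awaitBody() // blocks until the streaming context stops
      true        // returned normally: graceful shutdown, keep metadata
    } catch {
      case _: Exception =>
        false     // abnormal termination: safe to delete stream state
    }
}
```

In the real driver this would look something like `if (!TerminationCheck.endedNormally(() => ssc.awaitTermination())) deleteStreamState()`, where `deleteStreamState` is a hypothetical cleanup helper for the per-application metadata.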
--
Sincerely,
Igor Makhlin
Re: In spark streaming application how to distinguish between normal
and abnormal termination of application?
Posted by Igor Makhlin <ig...@gmail.com>.
looks like nobody knows the answer to this question ;)
On Sat, Mar 31, 2018 at 1:59 PM, Igor Makhlin <ig...@gmail.com>
wrote:
> [quoted original message]
--
Sincerely,
Igor Makhlin