Posted to user@spark.apache.org by Elkhan Dadashov <el...@gmail.com> on 2016/10/28 17:23:55 UTC

Can I get a callback notification on Spark job completion?

Hi,

I know that we can use SparkAppHandle (introduced in SparkLauncher version
>= 1.6) and let the delegator map task stay alive until the Spark job
finishes. But I wonder if this can be done via a callback notification
instead of polling.

Can I get a callback notification on Spark job completion?

Similar to Hadoop, where you can get a callback on MapReduce job completion - a
notification instead of polling.

At job completion, an HTTP request will be sent to the
“job.end.notification.url” value. Both the JOB_ID and JOB_STATUS can be
retrieved from the notification URL.

...
Configuration conf = this.getConf();
// Set the callback parameters
conf.set("job.end.notification.url",
    "https://hadoopi.wordpress.com/api/hadoop/notification/$jobId?status=$jobStatus");
...
// Submit your job in background
job.submit();

At job completion, an HTTP request will be sent to the
“job.end.notification.url” value:

https://<callback-url>/api/hadoop/notification/job_1379509275868_0002?status=SUCCEEDED

Reference:
https://hadoopi.wordpress.com/2013/09/18/hadoop-get-a-callback-on-mapreduce-job-completion/


Thanks.

Re: Can I get a callback notification on Spark job completion?

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Fri, Oct 28, 2016 at 11:14 AM, Elkhan Dadashov <el...@gmail.com> wrote:
> But if the map task finishes before the Spark job finishes, that means the
> SparkLauncher will go away. If the SparkLauncher handle goes away, then I
> lose the ability to track the app's state, right?
>
> I'm investigating if there is a way to know about Spark job completion (without
> the Spark Job History Server) in an asynchronous manner.

Correct. As I said in my other reply to you, if you can't use Spark's
API for whatever reason, you have to talk directly to the cluster
managers, and at that point it's out of Spark's hands to help you.
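
For illustration, a minimal sketch of checking completion directly against the
cluster manager, assuming YARN; the application id below is a placeholder (in
practice it would come from SparkAppHandle.getAppId() or the spark-submit output):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.hadoop.yarn.api.records.ApplicationReport;
import org.apache.hadoop.yarn.api.records.FinalApplicationStatus;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.util.ConverterUtils;

// Ask the YARN ResourceManager for the application's report
YarnClient yarnClient = YarnClient.createYarnClient();
yarnClient.init(new Configuration());
yarnClient.start();

// Placeholder application id
ApplicationId appId = ConverterUtils.toApplicationId("application_1379509275868_0002");
ApplicationReport report = yarnClient.getApplicationReport(appId);

// UNDEFINED while the application is still running,
// SUCCEEDED/FAILED/KILLED once it has completed
FinalApplicationStatus status = report.getFinalApplicationStatus();
System.out.println("Final status: " + status);

yarnClient.stop();

Note that this is still pull-based: you would have to run (or schedule) this check
yourself rather than receive a push notification.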

-- 
Marcelo



Re: Can I get a callback notification on Spark job completion?

Posted by Elkhan Dadashov <el...@gmail.com>.
Hi Marcelo,

Thanks for the reply.

But that means the SparkAppHandle needs to stay alive until the Spark job completes.

In my case I launch the Spark job from the delegator map task in the cluster. That
means the map task container needs to stay alive and wait until the Spark job
completes.

But if the map task finishes before the Spark job finishes, that means the
SparkLauncher will go away. If the SparkLauncher handle goes away, then I
lose the ability to track the app's state, right?

I'm investigating if there is a way to know about Spark job completion (without
the Spark Job History Server) in an asynchronous manner.

Like getting a callback on MapReduce job completion - a notification
instead of polling.

*[Another option]*
According to the Spark docs
<http://spark.apache.org/docs/latest/monitoring.html#rest-api>, Spark
metrics can be configured with different sinks.

I wonder whether it is possible to determine the job completion status from
these metrics.

Do Spark metrics provide job state information for each job id too?
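
For what it's worth, the monitoring page linked above also documents a REST API
with a per-job status endpoint (/api/v1/applications/[app-id]/jobs). Below is a
minimal sketch of querying it; host, port and application id are placeholders,
and this is still pull-based rather than a callback:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// The driver UI serves this on port 4040 while the application runs;
// the history server serves it afterwards. Host and app id are placeholders.
URL url = new URL("http://driver-host:4040/api/v1/applications/app-placeholder-id/jobs");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("GET");

try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
    String line;
    while ((line = in.readLine()) != null) {
        // Each job entry in the returned JSON carries a "status" field
        // (RUNNING, SUCCEEDED, FAILED, UNKNOWN)
        System.out.println(line);
    }
}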

On Fri, Oct 28, 2016 at 11:05 AM Marcelo Vanzin <va...@cloudera.com> wrote:

If you look at the "startApplication" method it takes listeners as
parameters.

On Fri, Oct 28, 2016 at 10:23 AM, Elkhan Dadashov <el...@gmail.com>
wrote:
> Hi,
>
> I know that we can use SparkAppHandle (introduced in SparkLauncher version
> >= 1.6) and let the delegator map task stay alive until the Spark job
> finishes. But I wonder if this can be done via a callback notification
> instead of polling.
>
> Can I get a callback notification on Spark job completion?
>
> Similar to Hadoop, where you can get a callback on MapReduce job completion - a
> notification instead of polling.
>
> At job completion, an HTTP request will be sent to the
> “job.end.notification.url” value. Both the JOB_ID and JOB_STATUS can be
> retrieved from the notification URL.
>
> ...
> Configuration conf = this.getConf();
> // Set the callback parameters
> conf.set("job.end.notification.url",
>     "https://hadoopi.wordpress.com/api/hadoop/notification/$jobId?status=$jobStatus");
> ...
> // Submit your job in background
> job.submit();
>
> At job completion, an HTTP request will be sent to the
> “job.end.notification.url” value:
>
> https://<callback-url>/api/hadoop/notification/job_1379509275868_0002?status=SUCCEEDED
>
> Reference:
> https://hadoopi.wordpress.com/2013/09/18/hadoop-get-a-callback-on-mapreduce-job-completion/
>
> Thanks.



--
Marcelo

Re: Can I get a callback notification on Spark job completion?

Posted by Marcelo Vanzin <va...@cloudera.com>.
If you look at the "startApplication" method it takes listeners as parameters.
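
For illustration, a minimal sketch of wiring up such a listener; the app resource,
main class and master below are placeholders:

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

SparkAppHandle handle = new SparkLauncher()
    .setAppResource("/path/to/your-spark-app.jar")   // placeholder
    .setMainClass("com.example.YourSparkApp")        // placeholder
    .setMaster("yarn")
    .startApplication(new SparkAppHandle.Listener() {
      @Override
      public void stateChanged(SparkAppHandle h) {
        // Invoked on every state transition, so no polling loop is needed
        if (h.getState().isFinal()) {
          System.out.println("App " + h.getAppId() + " ended with state " + h.getState());
        }
      }

      @Override
      public void infoChanged(SparkAppHandle h) {
        // Invoked when other information (e.g. the application id) becomes available
      }
    });

// The launcher JVM still has to stay alive for these callbacks to fire, since the
// handle communicates with the launched application over a local connection.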

On Fri, Oct 28, 2016 at 10:23 AM, Elkhan Dadashov <el...@gmail.com> wrote:
> Hi,
>
> I know that we can use SparkAppHandle (introduced in SparkLauncher version
> >= 1.6) and let the delegator map task stay alive until the Spark job
> finishes. But I wonder if this can be done via a callback notification
> instead of polling.
>
> Can I get a callback notification on Spark job completion?
>
> Similar to Hadoop, where you can get a callback on MapReduce job completion - a
> notification instead of polling.
>
> At job completion, an HTTP request will be sent to the
> “job.end.notification.url” value. Both the JOB_ID and JOB_STATUS can be
> retrieved from the notification URL.
>
> ...
> Configuration conf = this.getConf();
> // Set the callback parameters
> conf.set("job.end.notification.url",
> "https://hadoopi.wordpress.com/api/hadoop/notification/$jobId?status=$jobStatus");
> ...
> // Submit your job in background
> job.submit();
>
> At job completion, an HTTP request will be sent to the
> “job.end.notification.url” value:
>
> https://<callback-url>/api/hadoop/notification/job_1379509275868_0002?status=SUCCEEDED
>
> Reference:
> https://hadoopi.wordpress.com/2013/09/18/hadoop-get-a-callback-on-mapreduce-job-completion/
>
> Thanks.



-- 
Marcelo
