Posted to user@spark.apache.org by prabeesh k <pr...@gmail.com> on 2013/10/11 06:04:22 UTC

Execution time of spark job

Is there any way to get a job's execution time from within the program?
I can get it from the log:
 INFO spark.SparkContext: Job finished: collect at Kmeans.scala:109, took 0.242050892 s
But I want to use the execution time in my code. Please help me.
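For a single action I can time the call myself, for example (a rough sketch; points stands in for the real RDD):

val t0 = System.nanoTime()
val result = points.collect()                    // the action whose time the log reports
val elapsedSec = (System.nanoTime() - t0) / 1e9  // seconds, like the "took ... s" log line
println(s"collect took $elapsedSec s")

But that only covers one call site, so I am looking for a more general hook.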

Re: Execution time of spark job

Posted by prabeesh k <pr...@gmail.com>.
Thanks for your reply.


On Fri, Oct 11, 2013 at 10:03 AM, Matei Zaharia <ma...@gmail.com> wrote:

> Take a look at the org.apache.spark.scheduler.SparkListener class. You can
> register your own SparkListener with the SparkContext to listen for
> job-start and job-end events.
>
> Matei
>
> On Oct 10, 2013, at 9:04 PM, prabeesh k <pr...@gmail.com> wrote:
>
> > Is there any way to get a job's execution time from within the program?
> > I can get it from the log:
> >  INFO spark.SparkContext: Job finished: collect at Kmeans.scala:109, took 0.242050892 s
> > But I want to use the execution time in my code. Please help me.
> >
>
>

Re: Execution time of spark job

Posted by Matei Zaharia <ma...@gmail.com>.
Take a look at the org.apache.spark.scheduler.SparkListener class. You can register your own SparkListener with the SparkContext to listen for job-start and job-end events.
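A minimal sketch of such a listener (JobTimingListener is an illustrative name, and the event fields shown follow the Spark 1.x-style listener API, which may differ slightly in older releases):

import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}
import scala.collection.concurrent.TrieMap

// Records a wall-clock start time per job and reports the duration when the job ends.
class JobTimingListener extends SparkListener {
  private val startTimes = TrieMap.empty[Int, Long]

  override def onJobStart(jobStart: SparkListenerJobStart): Unit = {
    startTimes(jobStart.jobId) = System.currentTimeMillis()
  }

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
    startTimes.remove(jobEnd.jobId).foreach { start =>
      println(s"Job ${jobEnd.jobId} took ${System.currentTimeMillis() - start} ms")
    }
  }
}

// Register it on the SparkContext before running any jobs:
// sc.addSparkListener(new JobTimingListener())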

Matei

On Oct 10, 2013, at 9:04 PM, prabeesh k <pr...@gmail.com> wrote:

> Is there any way to get a job's execution time from within the program?
> I can get it from the log:
>  INFO spark.SparkContext: Job finished: collect at Kmeans.scala:109, took 0.242050892 s
> But I want to use the execution time in my code. Please help me.
>