Posted to user@spark.apache.org by mark <ma...@googlemail.com> on 2015/08/10 14:12:11 UTC

How to programmatically create, submit and report on Spark jobs?

Hi All

I need to be able to create, submit and report on Spark jobs
programmatically in response to events arriving on a Kafka bus. I also need
end-users to be able to create data queries that launch Spark jobs 'behind
the scenes'.

I would expect to use the same API for both, and be able to provide a
user-friendly view (i.e. *not* the Spark web UI) of all jobs (user and
system) that are currently running, have completed, have failed, etc.

Are there any tools / add-ons for this? Or is there a suggested approach?

Thanks

Re: How to programmatically create, submit and report on Spark jobs?

Posted by Ted Yu <yu...@gmail.com>.
For monitoring, please take a look at
http://spark.apache.org/docs/latest/monitoring.html
especially the REST API section.
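
A minimal sketch of calling that API from Scala, assuming a driver
running locally on the default web UI port (4040); the endpoint path is
the one described in the monitoring docs:

    import scala.io.Source

    object ListApplications {
      def main(args: Array[String]): Unit = {
        // The driver serves a JSON REST API on its web UI port;
        // /api/v1/applications lists the applications it knows about.
        val url = "http://localhost:4040/api/v1/applications"
        val json = Source.fromURL(url).mkString
        println(json)
      }
    }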

Cheers

On Mon, Aug 10, 2015 at 8:33 AM, Ted Yu <yu...@gmail.com> wrote:

> I found SPARK-3733, which was marked as a duplicate of SPARK-4924; the
> fix went into 1.4.0.
>
> FYI
>

Re: How to programmatically create, submit and report on Spark jobs?

Posted by Ted Yu <yu...@gmail.com>.
I found SPARK-3733, which was marked as a duplicate of SPARK-4924; the
fix went into 1.4.0.

FYI
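
SPARK-4924 factored application submission out into the spark-launcher
module, so from 1.4 onward something like the sketch below should work;
the jar path, main class, and master here are placeholders, not anything
specific to this thread:

    import org.apache.spark.launcher.SparkLauncher

    object SubmitJob {
      def main(args: Array[String]): Unit = {
        // launch() forks a spark-submit process and returns a plain
        // java.lang.Process; wait on it (or poll it) to track completion.
        val process = new SparkLauncher()
          .setAppResource("/path/to/your-app.jar") // placeholder jar
          .setMainClass("com.example.YourJob")     // placeholder main class
          .setMaster("local[*]")
          .launch()
        val exitCode = process.waitFor()
        println(s"spark-submit exited with code $exitCode")
      }
    }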
