Posted to user@spark.apache.org by "☼ R Nair (रविशंकर नायर)" <ra...@gmail.com> on 2018/03/07 03:56:38 UTC

Spark StreamingContext Question

Hi all,

I understand from the documentation that only one streaming context can be
active in a JVM at a time.

So in an enterprise cluster, how can we manage multiple users running many
different streaming applications, where one may be ingesting data from
Flume, another from Twitter, etc.? Is this not possible now?

Best,
Passion

Re: Spark StreamingContext Question

Posted by "☼ R Nair (रविशंकर नायर)" <ra...@gmail.com>.
Got it, thanks....

On Wed, Mar 7, 2018 at 4:32 AM, Gerard Maas <ge...@gmail.com> wrote:

> Hi,
>
> You can run as many jobs in your cluster as you want, provided you have
> enough capacity.
> The one-streaming-context constraint is per job, because each submitted
> job runs in its own driver JVM.
>
> You can submit several jobs for Flume and others for Twitter, Kafka, etc.
>
> If you are getting started with streaming in Spark, I'd recommend you
> look into Structured Streaming first.
> In Structured Streaming, you can have many streaming queries running under
> the same Spark session.
> Yet that does not mean you need to put them all in the same job. You can
> (and should) still submit different jobs for separate application concerns.
>
> kind regards, Gerard.
>
>
>
> On Wed, Mar 7, 2018 at 4:56 AM, ☼ R Nair (रविशंकर नायर) <
> ravishankar.nair@gmail.com> wrote:
>
>> Hi all,
>>
>> I understand from the documentation that only one streaming context can
>> be active in a JVM at a time.
>>
>> So in an enterprise cluster, how can we manage multiple users running
>> many different streaming applications, where one may be ingesting data
>> from Flume, another from Twitter, etc.? Is this not possible now?
>>
>> Best,
>> Passion
>>
>
>

Re: Spark StreamingContext Question

Posted by Gerard Maas <ge...@gmail.com>.
Hi,

You can run as many jobs in your cluster as you want, provided you have
enough capacity.
The one-streaming-context constraint is per job, because each submitted
job runs in its own driver JVM.

You can submit several jobs for Flume and others for Twitter, Kafka, etc.
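
For illustration, here is a minimal sketch of what one such self-contained
job could look like (the FlumeIngestJob name and the socket source are
placeholders; a real Flume job would use the spark-streaming-flume
receiver instead):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object FlumeIngestJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("flume-ingest")
    // One StreamingContext per driver JVM; each submitted job gets its own.
    val ssc = new StreamingContext(conf, Seconds(10))

    // Placeholder source; swap in FlumeUtils.createStream(ssc, host, port)
    // from spark-streaming-flume for an actual Flume feed.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Each instance of this program, submitted with spark-submit, runs in its
own driver JVM, so the per-JVM limit never comes into play across jobs.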

If you are getting started with streaming in Spark, I'd recommend you look
into Structured Streaming first.
In Structured Streaming, you can have many streaming queries running under
the same Spark session.
Yet that does not mean you need to put them all in the same job. You can
(and should) still submit different jobs for separate application concerns.
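
As a minimal sketch (assuming Spark 2.x, and using the built-in "rate"
source and console sink just to keep it self-contained), two independent
queries under one SparkSession would look like this:

import org.apache.spark.sql.SparkSession

object MultiQueryApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("multi-query").getOrCreate()

    // Two independent streaming sources under the same session.
    val streamA = spark.readStream.format("rate").load()
    val streamB = spark.readStream.format("rate").load()

    // Each start() launches a separate streaming query.
    val qA = streamA.writeStream.format("console").start()
    val qB = streamB.writeStream.format("console").start()

    // Block until any of the session's queries terminates.
    spark.streams.awaitAnyTermination()
  }
}

Each query has its own sink and lifecycle; spark.streams lets you manage
all the queries registered on the session.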

kind regards, Gerard.



On Wed, Mar 7, 2018 at 4:56 AM, ☼ R Nair (रविशंकर नायर) <
ravishankar.nair@gmail.com> wrote:

> Hi all,
>
> I understand from the documentation that only one streaming context can
> be active in a JVM at a time.
>
> So in an enterprise cluster, how can we manage multiple users running
> many different streaming applications, where one may be ingesting data
> from Flume, another from Twitter, etc.? Is this not possible now?
>
> Best,
> Passion
>

Re: Spark StreamingContext Question

Posted by sagar grover <sa...@gmail.com>.
Hi,
You can have multiple streams under the same streaming context and process
them independently.
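
For example, a minimal sketch (socket sources stand in here for real
receivers such as Flume or Twitter; the hostnames are placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MultiStreamJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("multi-stream")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Several input DStreams can live under one StreamingContext.
    val streamA = ssc.socketTextStream("hostA", 9999) // e.g. a Flume feed
    val streamB = ssc.socketTextStream("hostB", 9999) // e.g. a Twitter feed

    streamA.count().print()
    streamB.count().print()

    ssc.start() // starts all registered input streams together
    ssc.awaitTermination()
  }
}

Keep in mind that each receiver occupies an executor core, so the cluster
needs at least one core per input stream plus cores for processing.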

With regards,
Sagar Grover
Phone - 7022175584

On Wed, Mar 7, 2018 at 9:26 AM, ☼ R Nair (रविशंकर नायर) <
ravishankar.nair@gmail.com> wrote:

> Hi all,
>
> I understand from the documentation that only one streaming context can
> be active in a JVM at a time.
>
> So in an enterprise cluster, how can we manage multiple users running
> many different streaming applications, where one may be ingesting data
> from Flume, another from Twitter, etc.? Is this not possible now?
>
> Best,
> Passion
>