Posted to user@spark.apache.org by Subacini B <su...@gmail.com> on 2014/09/25 01:20:18 UTC

Processing multiple requests in cluster

Hi all,

How can I run multiple requests concurrently on the same cluster?

I have a program using a Spark streaming context that reads streaming data
and writes it to HBase. It works fine; the problem is that when multiple
requests are submitted to the cluster, only the first request is processed,
because the entire cluster is used for that request. The rest of the
requests sit in waiting mode.

I have set spark.cores.max to 2 or less so that the cluster can process
another request, but if there is only one request the cluster is not
utilized properly.
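
For reference, the per-application core cap mentioned above can be set at
submit time on a standalone cluster; the class name, master URL, and jar
below are illustrative placeholders, not from the original post:

```shell
# Cap this application at 2 cores across the whole cluster
# (standalone mode honors spark.cores.max; names/paths are hypothetical)
spark-submit \
  --class com.example.StreamingToHBase \
  --master spark://master:7077 \
  --conf spark.cores.max=2 \
  streaming-to-hbase.jar
```

With the cap in place, a second submitted application can claim the
remaining cores instead of waiting for the first one to finish.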

Is there any way that a Spark cluster can process streaming requests
concurrently while still utilizing the cluster effectively, something
like SharkServer?

Thanks
Subacini

Re: Processing multiple requests in cluster

Posted by Mayur Rustagi <ma...@gmail.com>.
There are two problems you may be facing:
1. Your application is taking all of the cluster's resources.
2. Within your application, task submission is not being scheduled properly.

For 1, you can either configure your app to take fewer resources, or use a
Mesos/YARN-type scheduler to dynamically juggle resources between
applications.
For 2, you can use the fair scheduler so that jobs within the application
are scheduled more fairly.
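
As a sketch of the second option, the fair scheduler is switched on through
Spark configuration; the file path, pool name, and weights below are
illustrative, not from this thread:

```
# spark-defaults.conf (or pass each line via --conf)
spark.scheduler.mode            FAIR
spark.scheduler.allocation.file /path/to/fairscheduler.xml
```

```xml
<!-- fairscheduler.xml: pool name and weights are hypothetical examples -->
<allocations>
  <pool name="streaming">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>
    <minShare>1</minShare>
  </pool>
</allocations>
```

Jobs are then assigned to a pool from application code with
sc.setLocalProperty("spark.scheduler.pool", "streaming"), so concurrent
jobs inside one application share executors instead of queuing FIFO.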

Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



Re: Processing multiple requests in cluster

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
You can try Spark on Mesos or YARN, since they have a lot more support for
scheduling.
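
For illustration, submitting the same kind of application to a YARN cluster
looks roughly like this; the class name, jar, and resource figures are
placeholders, and exact flags vary by Spark version:

```shell
# Submit to YARN, which can arbitrate resources between applications
# (names and resource values below are hypothetical)
spark-submit \
  --class com.example.StreamingToHBase \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 2 \
  --executor-cores 1 \
  --executor-memory 2g \
  streaming-to-hbase.jar
```

Because YARN's resource manager owns the cluster, several such applications
can run side by side, each within its requested executor budget.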

Thanks
Best Regards
