Posted to dev@spark.apache.org by anshu shukla <an...@gmail.com> on 2015/09/01 19:55:01 UTC
Resource allocation in Spark Streaming
I am not very clear about how resource allocation (CPU/core/thread-level
allocation) relates to the parallelism set via the number of cores in Spark
standalone mode.
Are there any guidelines for this?
--
Thanks & Regards,
Anshu Shukla
Re: Resource allocation in Spark Streaming
Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Well, in Spark you can get the information you need from the driver UI
running on port 4040: click on the active job, then on a stage, and inside
the stage you will find the tasks and the address of the machine on which
each task is being executed. You can also check the CPU load on that
machine (with top or htop). In Spark, a single core is allocated to a
single task.
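To make that concrete, here is a sketch of how the core-allocation knobs fit together when submitting to a standalone master. The flag names are standard spark-submit options, but the host name, jar name, and numbers below are made-up placeholders:

```shell
# Cap the application at 8 cores total across the cluster
# (--total-executor-cores, i.e. spark.cores.max), with 2 cores
# per executor (--executor-cores, i.e. spark.executor.cores):
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 8 \
  --executor-cores 2 \
  --conf spark.task.cpus=1 \
  my_streaming_app.jar

# With spark.task.cpus=1 (the default), each task occupies exactly
# one core, so this app can run at most 8 tasks concurrently,
# spread as 2 concurrent tasks on each of 4 executors.
```

So the per-machine CPU usage you see in top should roughly track the number of tasks the UI shows running on that machine's executors.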
Thanks
Best Regards
On Thu, Sep 3, 2015 at 4:49 AM, anshu shukla <an...@gmail.com> wrote:
> I tried to find out but am *unable to get a clear picture of the resource
> allocation* at the thread/core level in Spark.
>
> Actually my problem is that I am comparing CPU usage between Spark and
> Storm, but in the case of Storm I know which bolt is running on which
> machine and over how many cores. How can I do that in Spark?
> I am looking for an explanation of the high CPU whiskers in the attached
> plot. Each box plot shows the CPU usage of one topology.
>
> Thanks in advance !
>
> On Tue, Sep 1, 2015 at 11:25 PM, anshu shukla <an...@gmail.com>
> wrote:
>
>> I am not very clear about how resource allocation (CPU/core/thread-level
>> allocation) relates to the parallelism set via the number of cores in
>> Spark standalone mode.
>>
>> Are there any guidelines for this?
>>
>> --
>> Thanks & Regards,
>> Anshu Shukla
>>
>
>
>
> --
> Thanks & Regards,
> Anshu Shukla
>