Posted to user@spark.apache.org by شجاع الرحمن بیگ <sh...@gmail.com> on 2016/05/11 15:38:07 UTC

Setting Spark Worker Memory

Hi All,

I need to set the same memory and cores for each worker on the same machine, and
for this purpose I have set the following properties in conf/spark-env.sh:

export SPARK_EXECUTOR_INSTANCE=3
export SPARK_WORKER_CORES=3
export SPARK_WORKER_MEMORY=8g

but only one worker is getting the desired memory and cores; the other two are
not. Here is the master's log:

...
16/05/11 17:04:40 INFO Master: I have been elected leader! New state: ALIVE
16/05/11 17:04:43 INFO Master: Registering worker 11.14.224.24:53923 with 3
cores, 8.0 GB RAM
16/05/11 17:04:49 INFO Master: Registering worker 11.14.224.24:55072 with 2
cores, 1020.7 GB RAM
16/05/11 17:05:07 INFO Master: Registering worker 11.14.224.24:49702 with 2
cores, 1020.7 GB RAM
...


Could you please let me know the solution?

Thanks
Shuja

-- 
Regards
Shuja-ur-Rehman Baig
<http://pk.linkedin.com/in/shujamughal>
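A minimal spark-env.sh sketch of what the original poster appears to intend, assuming the goal is three identical workers per machine in standalone mode. Note that the standalone launch scripts read SPARK_WORKER_INSTANCES, not SPARK_EXECUTOR_INSTANCE (which is not a variable the standalone scripts use), so that line is likely the root cause:

```shell
# conf/spark-env.sh -- a sketch, assuming three identical workers per host
# in standalone mode. SPARK_WORKER_INSTANCES controls how many worker
# daemons the standalone scripts launch on each host; SPARK_WORKER_CORES
# and SPARK_WORKER_MEMORY then apply to every one of them.
export SPARK_WORKER_INSTANCES=3
export SPARK_WORKER_CORES=3
export SPARK_WORKER_MEMORY=8g
```

With this in place, each of the three registered workers should report 3 cores and 8.0 GB RAM, rather than the defaults (all cores, total memory minus 1 GB) seen for two of the workers in the master log above.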

Re: Setting Spark Worker Memory

Posted by Mich Talebzadeh <mi...@gmail.com>.
Run jps as below:

 jps
19724 SparkSubmit
10612 Worker

and run ps awx|grep <PID> for each PID that corresponds to these two
processes, for example:
ps awx|grep 30208
30208 pts/2    Sl+    1:05 /usr/java/latest/bin/java -cp
/home/hduser/jars/jconn4.jar:/home/hduser/jars/ojdbc6.jar:/usr/lib/spark-1.6.1-bin-hadoop2.6/conf/:/usr/lib/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar:/usr/lib/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar:/usr/lib/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar:/usr/lib/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar:/home/hduser/hadoop-2.6.0/etc/hadoop/
-Xms4g -Xmx4g org.apache.spark.deploy.SparkSubmit --master
spark://50.140.197.217:7077 --conf
spark.driver.memory=4g --class CEP_streaming --num-executors 1
--executor-memory 4G --executor-cores 2 --packages
com.databricks:spark-csv_2.11:1.3.0 --jars
/home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar
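The two steps above can be combined into one pass. A minimal sketch, assuming jps (from the JDK) and ps are on the PATH, that prints the full command line of every Spark process instead of grepping each PID by hand:

```shell
# For every Spark-related JVM that jps reports (SparkSubmit, Worker,
# Master), print its PID and full command line via ps.
# Assumes the JDK's jps tool and procps ps are available on this host.
for pid in $(jps 2>/dev/null | awk '/SparkSubmit|Worker|Master/ {print $1}'); do
  ps -o pid,args -p "$pid"
done
```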

Also, send the output of the OS command free, which shows available memory:

free
             total       used       free     shared    buffers     cached
Mem:      24546308   24398672     147636          0     347464   17130900
-/+ buffers/cache:    6920308   17626000
Swap:      2031608     226288    1805320

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com




Re: Setting Spark Worker Memory

Posted by شجاع الرحمن بیگ <sh...@gmail.com>.
Yes, I am running this in standalone mode.



-- 
Regards
Shuja-ur-Rehman Baig
<http://pk.linkedin.com/in/shujamughal>

Re: Setting Spark Worker Memory

Posted by Mich Talebzadeh <mi...@gmail.com>.
Are you running this in standalone mode, i.e. on one physical host, where the
executor lives inside the driver?



Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com




Re: Setting Spark Worker Memory

Posted by شجاع الرحمن بیگ <sh...@gmail.com>.
Yes.




-- 
Regards
Shuja-ur-Rehman Baig
<http://pk.linkedin.com/in/shujamughal>

Re: Setting Spark Worker Memory

Posted by Deepak Sharma <de...@gmail.com>.
Since you are registering workers from the same node, do you have enough
cores and RAM (in this case >= 9 cores and >= 24 GB) on this
node (11.14.224.24)?
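One quick way to check, sketched under the assumption of a Linux host with coreutils and procps installed:

```shell
# Verify the node can back three workers of 3 cores / 8 GB each
# (at least 9 cores and 24 GB in total, plus headroom for the OS).
nproc                                # number of available CPU cores
free -g | awk '/^Mem:/ {print $2}'   # total RAM in GB
```

If either figure is below the totals requested in spark-env.sh, the master will register whatever the workers can actually offer.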

Thanks
Deepak




-- 
Thanks
Deepak
www.bigdatabig.com
www.keosha.net