Posted to user@spark.apache.org by Mulugeta Mammo <mu...@gmail.com> on 2015/07/02 21:05:54 UTC

Setting JVM heap start and max sizes, -Xms and -Xmx, for executors

Hi,

I'm running Spark 1.4.0 and I want to specify the start and max sizes (-Xms
and -Xmx) of the JVM heap for my executors. I tried:

executor.cores.memory="-Xms1g -Xms8g"

but it doesn't work. How do I specify them?

Appreciate your help.

Thanks,

Re: Setting JVM heap start and max sizes, -Xms and -Xmx, for executors

Posted by Mulugeta Mammo <mu...@gmail.com>.
Yeah, I think it's a limitation too. I looked at the source code: in
SparkConf.scala and ExecutorRunnable.scala, both -Xms and -Xmx are set to
the same value, which is spark.executor.memory.
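For reference, the behavior described above can be sketched as follows; this is a rough Python paraphrase for illustration, not Spark's actual Scala code, and the function name is made up:

```python
def build_executor_jvm_opts(executor_memory_mb):
    """Illustrative paraphrase of how Spark derives the executor JVM heap
    flags: -Xms and -Xmx are both taken from the single
    spark.executor.memory value, so they cannot be set independently."""
    mem = "{}m".format(executor_memory_mb)
    return ["-Xms" + mem, "-Xmx" + mem]

# With spark.executor.memory=8g (8192 MB), both flags end up at 8 GB:
print(build_executor_jvm_opts(8192))  # → ['-Xms8192m', '-Xmx8192m']
```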

Thanks

On Thu, Jul 2, 2015 at 1:18 PM, Todd Nist <ts...@gmail.com> wrote:


Re: Setting JVM heap start and max sizes, -Xms and -Xmx, for executors

Posted by Todd Nist <ts...@gmail.com>.
Yes, that does appear to be the case.  The documentation is very clear
that the heap settings cannot be used with
spark.executor.extraJavaOptions:

spark.executor.extraJavaOptions (default: none) - A string of extra JVM
options to pass to executors, for instance GC settings or other logging.
*Note that it is illegal to set Spark properties or heap size settings
with this option.* Spark properties should be set using a SparkConf object
or the spark-defaults.conf file used with the spark-submit script. *Heap
size settings can be set with spark.executor.memory*.

So it appears to be a limitation at this time.
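Given that, the supported way to size the executor heap is spark.executor.memory, for example in spark-defaults.conf (the 8g value here is just an illustration):

```
# spark-defaults.conf
# Spark passes this single value as both -Xms and -Xmx to each executor JVM
spark.executor.memory   8g
```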

-Todd



On Thu, Jul 2, 2015 at 4:13 PM, Mulugeta Mammo <mu...@gmail.com>
wrote:


Re: Setting JVM heap start and max sizes, -Xms and -Xmx, for executors

Posted by Mulugeta Mammo <mu...@gmail.com>.
Thanks, but my use case requires that I specify different start and max
heap sizes. It looks like Spark sets the start and max sizes to the same
value.

On Thu, Jul 2, 2015 at 1:08 PM, Todd Nist <ts...@gmail.com> wrote:


Re: Setting JVM heap start and max sizes, -Xms and -Xmx, for executors

Posted by Todd Nist <ts...@gmail.com>.
You should use:

spark.executor.memory

From the docs <https://spark.apache.org/docs/latest/configuration.html>:

spark.executor.memory (default: 512m) - Amount of memory to use per
executor process, in the same format as JVM memory strings (e.g. 512m, 2g).
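The same setting is also available as a spark-submit flag; a typical invocation might look like this (the class name, jar name, and value are placeholders):

```
# --executor-memory sets spark.executor.memory for this application
spark-submit --executor-memory 8g --class com.example.MyApp my-app.jar
```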

-Todd



On Thu, Jul 2, 2015 at 3:36 PM, Mulugeta Mammo <mu...@gmail.com>
wrote:


Re: Setting JVM heap start and max sizes, -Xms and -Xmx, for executors

Posted by Mulugeta Mammo <mu...@gmail.com>.
Tried that one, and it throws an error: extraJavaOptions is not allowed to
alter memory settings; use spark.executor.memory instead.

On Thu, Jul 2, 2015 at 12:21 PM, Benjamin Fradet <be...@gmail.com>
wrote:


Re: Setting JVM heap start and max sizes, -Xms and -Xmx, for executors

Posted by Benjamin Fradet <be...@gmail.com>.
Hi,

You can set those parameters through

spark.executor.extraJavaOptions

which is documented in the configuration guide:
spark.apache.org/docs/latest/configuration.html
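That property is the place for non-heap JVM flags; a sketch of typical usage in spark-defaults.conf (the specific GC and logging flags shown are just examples):

```
# GC and logging flags are accepted here; heap flags like -Xms/-Xmx are not
spark.executor.extraJavaOptions   -XX:+UseConcMarkSweepGC -verbose:gc
```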
On 2 Jul 2015 9:06 pm, "Mulugeta Mammo" <mu...@gmail.com> wrote:
