Posted to user@spark.apache.org by Xi Shen <da...@gmail.com> on 2015/03/16 08:22:45 UTC

How to set Spark executor memory?

Hi,

I have set spark.executor.memory to 2048m, and in the UI "Environment"
page, I can see this value has been set correctly. But in the "Executors"
page, I see there is only one executor and its memory is 265.4MB. That is
a very strange value. Why not 256MB, or simply what I set?

What am I missing here?


Thanks,
David

Re: How to set Spark executor memory?

Posted by Sean Owen <so...@cloudera.com>.
If you are running from your IDE, then I don't know what you are
running or in what mode. The discussion here concerns using standard
mechanisms like spark-submit to configure executor memory. Please try
these first instead of trying to directly invoke Spark, which will
require more understanding of how the props are set.

On Sat, Mar 21, 2015 at 5:30 AM, Xi Shen <da...@gmail.com> wrote:
> Hi Sean,
>
> It's getting stranger now. If I run from the IDE, my executor memory is always
> set to 6.7G, no matter what value I set in code. I have checked my environment
> variables, and there is no value of 6.7 or 12.5.
>
> Any idea?
>
> Thanks,
> David
>
>
> On Tue, 17 Mar 2015 00:35 null <ji...@wipro.com> wrote:
>>
>> Hi Xi Shen,
>>
>> You could set spark.executor.memory in the code itself:
>> new SparkConf().set("spark.executor.memory", "2g")
>>
>> Or you can try --conf spark.executor.memory=2g while submitting the jar.
>>
>>
>>
>> Regards
>>
>> Jishnu Prathap
>>
>>
>>
>> From: Akhil Das [mailto:akhil@sigmoidanalytics.com]
>> Sent: Monday, March 16, 2015 2:06 PM
>> To: Xi Shen
>> Cc: user@spark.apache.org
>> Subject: Re: How to set Spark executor memory?
>>
>>
>>
>> By default spark.executor.memory is set to 512m. I'm assuming you are
>> submitting the job using spark-submit, and it is not able to override the
>> value since you are running in local mode. Can you try it without
>> spark-submit, as a standalone project?
>>
>>
>> Thanks
>>
>> Best Regards
>>
>>
>>
>> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com> wrote:
>>
>> I set it in code, not by configuration. I submit my jar file in local
>> mode. I am working in my development environment.
>>
>>
>>
>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com> wrote:
>>
>> How are you setting it? And how are you submitting the job?
>>
>>
>> Thanks
>>
>> Best Regards
>>
>>
>>
>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com> wrote:
>>
>> Hi,
>>
>>
>>
>> I have set spark.executor.memory to 2048m, and in the UI "Environment"
>> page, I can see this value has been set correctly. But in the "Executors"
>> page, I see there is only one executor and its memory is 265.4MB. That is
>> a very strange value. Why not 256MB, or simply what I set?
>>
>>
>>
>> What am I missing here?
>>
>>
>>
>>
>>
>> Thanks,
>>
>> David
>>
>>
>>
>>
>>
>>
>>



Re: How to set Spark executor memory?

Posted by Xi Shen <da...@gmail.com>.
Hi Sean,

It's getting stranger now. If I run from the IDE, my executor memory is
always set to 6.7G, no matter what value I set in code. I have checked my
environment variables, and there is no value of 6.7 or 12.5.

Any idea?

Thanks,
David

On Tue, 17 Mar 2015 00:35 null <ji...@wipro.com> wrote:

> Hi Xi Shen,
>
> You could set spark.executor.memory in the code itself: new SparkConf().set("spark.executor.memory", "2g")
>
> Or you can try --conf spark.executor.memory=2g while submitting the jar.
>
>
>
> Regards
>
> Jishnu Prathap
>
>
>
> From: Akhil Das [mailto:akhil@sigmoidanalytics.com]
> Sent: Monday, March 16, 2015 2:06 PM
> To: Xi Shen
> Cc: user@spark.apache.org
> Subject: Re: How to set Spark executor memory?
>
>
>
> By default spark.executor.memory is set to 512m. I'm assuming you are
> submitting the job using spark-submit, and it is not able to override the
> value since you are running in local mode. Can you try it without
> spark-submit, as a standalone project?
>
>
> Thanks
>
> Best Regards
>
>
>
> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com> wrote:
>
> I set it in code, not by configuration. I submit my jar file in local
> mode. I am working in my development environment.
>
>
>
> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com> wrote:
>
> How are you setting it? And how are you submitting the job?
>
>
> Thanks
>
> Best Regards
>
>
>
> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com> wrote:
>
> Hi,
>
>
>
> I have set spark.executor.memory to 2048m, and in the UI "Environment"
> page, I can see this value has been set correctly. But in the "Executors"
> page, I see there is only one executor and its memory is 265.4MB. That is
> a very strange value. Why not 256MB, or simply what I set?
>
>
>
> What am I missing here?
>
>
>
>
>
> Thanks,
>
> David
>
>
>
>
>
>

RE: How to set Spark executor memory?

Posted by ji...@wipro.com.
Hi Xi Shen,

You could set spark.executor.memory in the code itself: new SparkConf().set("spark.executor.memory", "2g")
Or you can try --conf spark.executor.memory=2g while submitting the jar.
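
A minimal sketch of both approaches (Spark 1.x Scala; MyApp and my-app.jar
are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // In code: set it before the SparkContext is created.
    val conf = new SparkConf()
      .setAppName("MyApp")
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)

Or on the spark-submit command line (the two forms are equivalent):

    spark-submit --executor-memory 2g --class MyApp my-app.jar
    spark-submit --conf spark.executor.memory=2g --class MyApp my-app.jar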

Regards
Jishnu Prathap

From: Akhil Das [mailto:akhil@sigmoidanalytics.com]
Sent: Monday, March 16, 2015 2:06 PM
To: Xi Shen
Cc: user@spark.apache.org
Subject: Re: How to set Spark executor memory?

By default spark.executor.memory is set to 512m. I'm assuming you are submitting the job using spark-submit, and it is not able to override the value since you are running in local mode. Can you try it without spark-submit, as a standalone project?

Thanks
Best Regards

On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com>> wrote:

I set it in code, not by configuration. I submit my jar file in local mode. I am working in my development environment.

On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com>> wrote:
How are you setting it? And how are you submitting the job?

Thanks
Best Regards

On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com>> wrote:
Hi,

I have set spark.executor.memory to 2048m, and in the UI "Environment" page, I can see this value has been set correctly. But in the "Executors" page, I see there is only one executor and its memory is 265.4MB. That is a very strange value. Why not 256MB, or simply what I set?

What am I missing here?


Thanks,
David




Re: How to set Spark executor memory?

Posted by Sean Owen <so...@cloudera.com>.
There are a number of small misunderstandings here.

In the first instance, the executor memory is not actually being set
to 2g, and the default of 512m is being used. If you are writing code
to launch an app, then you are duplicating what spark-submit does,
without using spark-submit. And if you do use spark-submit, configuration
set in code happens "too late" to take effect.

The memory you see in the UI is not total executor memory; it is the
memory available for caching. The default formula is actually 0.6 *
0.9 * total, not 0.6 * total.
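(A rough check with the 512m default: 512 MB * 0.6 * 0.9 = 276.5 MB, which
is close to the 265.4MB shown in the UI; the small remaining gap is most
likely because the JVM reports slightly less usable heap than the
configured -Xmx.)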

This is not a function of your machine's total memory, but of the
configured executor memory.

If this value is 6.7GB, it implies that you somehow configured the
executors to use about 12.4GB of memory. Double-check for typos and maybe
confirm what figure you are quoting here.
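(Working backwards with the same formula: 6.7 GB / (0.6 * 0.9) ≈ 12.4 GB of
configured executor memory.)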

In the last instance -- you are looking at driver memory, not executor
memory. The 1g you are trying to configure affects executors.
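
A minimal sketch of the practical fix, assuming local mode (where the lone
"executor" lives inside the driver JVM); MyApp and my-app.jar are
placeholders:

    spark-submit --master "local[*]" --driver-memory 2g --class MyApp my-app.jar

When launching from an IDE instead, the equivalent is a JVM option such as
-Xmx2g on the run configuration, since the driver JVM is already running by
the time your SparkConf code executes.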

On Mon, Mar 16, 2015 at 9:21 AM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
> Strange, even I'm seeing this while running in local mode.
>
>
>
> I set it as .set("spark.executor.memory", "1g")
>
> Thanks
> Best Regards
>
> On Mon, Mar 16, 2015 at 2:43 PM, Xi Shen <da...@gmail.com> wrote:
>>
>> I set "spark.executor.memory" to "2048m". If the executor storage memory
>> is 0.6 of executor memory, it should be 2g * 0.6 = 1.2g.
>>
>> My machine has 56GB memory, and 0.6 of that should be 33.6G...I hate math
>> xD
>>
>>
>> On Mon, Mar 16, 2015 at 7:59 PM Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>>
>>> How much memory do you have on your machine? I think the default value is
>>> 0.6 of spark.executor.memory, as you can see from here.
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Mon, Mar 16, 2015 at 2:26 PM, Xi Shen <da...@gmail.com> wrote:
>>>>
>>>> Hi Akhil,
>>>>
>>>> Yes, you are right. If I run the program from the IDE as a normal Java
>>>> program, the executor's memory is increased...but not to 2048m; it is set
>>>> to 6.7GB... It looks like there's some formula that calculates this value.
>>>>
>>>>
>>>> Thanks,
>>>> David
>>>>
>>>>
>>>> On Mon, Mar 16, 2015 at 7:36 PM Akhil Das <ak...@sigmoidanalytics.com>
>>>> wrote:
>>>>>
>>>>> By default spark.executor.memory is set to 512m. I'm assuming you are
>>>>> submitting the job using spark-submit, and it is not able to override the
>>>>> value since you are running in local mode. Can you try it without
>>>>> spark-submit, as a standalone project?
>>>>>
>>>>> Thanks
>>>>> Best Regards
>>>>>
>>>>> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com> wrote:
>>>>>>
>>>>>> I set it in code, not by configuration. I submit my jar file in local
>>>>>> mode. I am working in my development environment.
>>>>>>
>>>>>>
>>>>>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com>
>>>>>> wrote:
>>>>>>>
>>>>>>> How are you setting it? And how are you submitting the job?
>>>>>>>
>>>>>>> Thanks
>>>>>>> Best Regards
>>>>>>>
>>>>>>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com>
>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> I have set spark.executor.memory to 2048m, and in the UI
>>>>>>>> "Environment" page, I can see this value has been set correctly. But in
>>>>>>>> the "Executors" page, I see there is only one executor and its memory is
>>>>>>>> 265.4MB. That is a very strange value. Why not 256MB, or simply what I set?
>>>>>>>>
>>>>>>>> What am I missing here?
>>>>>>>>
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> David
>>>>>>>>
>>>>>>>
>>>>>
>>>
>



Re: How to set Spark executor memory?

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Strange, even I'm seeing this while running in local mode.

[image: Inline image 1]

I set it as .set("spark.executor.memory", "1g")

Thanks
Best Regards

On Mon, Mar 16, 2015 at 2:43 PM, Xi Shen <da...@gmail.com> wrote:

> I set "spark.executor.memory" to "2048m". If the executor storage memory
> is 0.6 of executor memory, it should be 2g * 0.6 = 1.2g.
>
> My machine has 56GB memory, and 0.6 of that should be 33.6G...I hate math
> xD
>
>
> On Mon, Mar 16, 2015 at 7:59 PM Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> How much memory do you have on your machine? I think the default value is
>> 0.6 of spark.executor.memory, as you can see from here
>> <http://spark.apache.org/docs/1.2.1/configuration.html#execution-behavior>.
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Mar 16, 2015 at 2:26 PM, Xi Shen <da...@gmail.com> wrote:
>>
>>> Hi Akhil,
>>>
>>> Yes, you are right. If I run the program from the IDE as a normal Java
>>> program, the executor's memory is increased...but not to 2048m; it is set
>>> to 6.7GB... It looks like there's some formula that calculates this value.
>>>
>>>
>>> Thanks,
>>> David
>>>
>>>
>>> On Mon, Mar 16, 2015 at 7:36 PM Akhil Das <ak...@sigmoidanalytics.com>
>>> wrote:
>>>
>>>> By default spark.executor.memory is set to 512m. I'm assuming you are
>>>> submitting the job using spark-submit, and it is not able to override the
>>>> value since you are running in local mode. Can you try it without
>>>> spark-submit, as a standalone project?
>>>>
>>>> Thanks
>>>> Best Regards
>>>>
>>>> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com> wrote:
>>>>
>>>>> I set it in code, not by configuration. I submit my jar file in local
>>>>> mode. I am working in my development environment.
>>>>>
>>>>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com>
>>>>> wrote:
>>>>>
>>>>>> How are you setting it? And how are you submitting the job?
>>>>>>
>>>>>> Thanks
>>>>>> Best Regards
>>>>>>
>>>>>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I have set spark.executor.memory to 2048m, and in the UI
>>>>>>> "Environment" page, I can see this value has been set correctly. But in the
>>>>>>> "Executors" page, I see there is only one executor and its memory is 265.4MB.
>>>>>>> That is a very strange value. Why not 256MB, or simply what I set?
>>>>>>>
>>>>>>> What am I missing here?
>>>>>>>
>>>>>>>
>>>>>>> Thanks,
>>>>>>> David
>>>>>>>
>>>>>>>
>>>>>>
>>>>
>>

Re: How to set Spark executor memory?

Posted by Xi Shen <da...@gmail.com>.
I set "spark.executor.memory" to "2048m". If the executor storage memory is
0.6 of executor memory, it should be 2g * 0.6 = 1.2g.

My machine has 56GB memory, and 0.6 of that should be 33.6G...I hate math xD
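
(For reference, plugging these numbers into the full 0.6 * 0.9 formula quoted
elsewhere in the thread: 2048 MB * 0.6 * 0.9 ≈ 1106 MB would be expected if
the setting took effect, whereas the observed 265.4MB matches the 512m
default, since 512 MB * 0.6 * 0.9 ≈ 276 MB.)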


On Mon, Mar 16, 2015 at 7:59 PM Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> How much memory do you have on your machine? I think the default value is
> 0.6 of spark.executor.memory, as you can see from here
> <http://spark.apache.org/docs/1.2.1/configuration.html#execution-behavior>.
>
> Thanks
> Best Regards
>
> On Mon, Mar 16, 2015 at 2:26 PM, Xi Shen <da...@gmail.com> wrote:
>
>> Hi Akhil,
>>
>> Yes, you are right. If I run the program from the IDE as a normal Java
>> program, the executor's memory is increased...but not to 2048m; it is set
>> to 6.7GB... It looks like there's some formula that calculates this value.
>>
>>
>> Thanks,
>> David
>>
>>
>> On Mon, Mar 16, 2015 at 7:36 PM Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>
>>> By default spark.executor.memory is set to 512m. I'm assuming you are
>>> submitting the job using spark-submit, and it is not able to override the
>>> value since you are running in local mode. Can you try it without
>>> spark-submit, as a standalone project?
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com> wrote:
>>>
>>>> I set it in code, not by configuration. I submit my jar file in local
>>>> mode. I am working in my development environment.
>>>>
>>>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com> wrote:
>>>>
>>>>> How are you setting it? And how are you submitting the job?
>>>>>
>>>>> Thanks
>>>>> Best Regards
>>>>>
>>>>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> I have set spark.executor.memory to 2048m, and in the UI
>>>>>> "Environment" page, I can see this value has been set correctly. But in the
>>>>>> "Executors" page, I see there is only one executor and its memory is 265.4MB.
>>>>>> That is a very strange value. Why not 256MB, or simply what I set?
>>>>>>
>>>>>> What am I missing here?
>>>>>>
>>>>>>
>>>>>> Thanks,
>>>>>> David
>>>>>>
>>>>>>
>>>>>
>>>
>

Re: How to set Spark executor memory?

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
How much memory do you have on your machine? I think the default value is
0.6 of spark.executor.memory, as you can see from here
<http://spark.apache.org/docs/1.2.1/configuration.html#execution-behavior>.

Thanks
Best Regards

On Mon, Mar 16, 2015 at 2:26 PM, Xi Shen <da...@gmail.com> wrote:

> Hi Akhil,
>
> Yes, you are right. If I run the program from the IDE as a normal Java
> program, the executor's memory is increased...but not to 2048m; it is set
> to 6.7GB... It looks like there's some formula that calculates this value.
>
>
> Thanks,
> David
>
>
> On Mon, Mar 16, 2015 at 7:36 PM Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> By default spark.executor.memory is set to 512m. I'm assuming you are
>> submitting the job using spark-submit, and it is not able to override the
>> value since you are running in local mode. Can you try it without
>> spark-submit, as a standalone project?
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com> wrote:
>>
>>> I set it in code, not by configuration. I submit my jar file in local
>>> mode. I am working in my development environment.
>>>
>>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com> wrote:
>>>
>>>> How are you setting it? And how are you submitting the job?
>>>>
>>>> Thanks
>>>> Best Regards
>>>>
>>>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I have set spark.executor.memory to 2048m, and in the UI "Environment"
>>>>> page, I can see this value has been set correctly. But in the "Executors"
>>>>> page, I see there is only one executor and its memory is 265.4MB. That is
>>>>> a very strange value. Why not 256MB, or simply what I set?
>>>>>
>>>>> What am I missing here?
>>>>>
>>>>>
>>>>> Thanks,
>>>>> David
>>>>>
>>>>>
>>>>
>>

Re: How to set Spark executor memory?

Posted by Xi Shen <da...@gmail.com>.
Hi Akhil,

Yes, you are right. If I run the program from the IDE as a normal Java
program, the executor's memory is increased...but not to 2048m; it is set
to 6.7GB... It looks like there's some formula that calculates this value.


Thanks,
David


On Mon, Mar 16, 2015 at 7:36 PM Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> By default spark.executor.memory is set to 512m. I'm assuming you are
> submitting the job using spark-submit, and it is not able to override the
> value since you are running in local mode. Can you try it without
> spark-submit, as a standalone project?
>
> Thanks
> Best Regards
>
> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com> wrote:
>
>> I set it in code, not by configuration. I submit my jar file in local
>> mode. I am working in my development environment.
>>
>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com> wrote:
>>
>>> How are you setting it? And how are you submitting the job?
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I have set spark.executor.memory to 2048m, and in the UI "Environment"
>>>> page, I can see this value has been set correctly. But in the "Executors"
>>>> page, I see there is only one executor and its memory is 265.4MB. That is
>>>> a very strange value. Why not 256MB, or simply what I set?
>>>>
>>>> What am I missing here?
>>>>
>>>>
>>>> Thanks,
>>>> David
>>>>
>>>>
>>>
>

Re: How to set Spark executor memory?

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
By default spark.executor.memory is set to 512m. I'm assuming you are
submitting the job using spark-submit, and it is not able to override the
value since you are running in local mode. Can you try it without
spark-submit, as a standalone project?
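
A minimal sketch of the "standalone project" idea (hypothetical names; run
it as a plain JVM main() with Spark on the classpath, no spark-submit):

    import org.apache.spark.{SparkConf, SparkContext}

    object MemoryCheck {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("MemoryCheck")
          .setMaster("local[*]")              // in-process, no cluster
          .set("spark.executor.memory", "2g")
        val sc = new SparkContext(conf)
        // Print what the conf actually holds; the Executors UI may still differ.
        println(sc.getConf.get("spark.executor.memory"))
        sc.stop()
      }
    }

Note Sean Owen's caveat elsewhere in the thread: in local mode the executor
shares the driver JVM, so this setting may still not change the memory you
see in the UI.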

Thanks
Best Regards

On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <da...@gmail.com> wrote:

> I set it in code, not by configuration. I submit my jar file in local
> mode. I am working in my development environment.
>
> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com> wrote:
>
>> How are you setting it? And how are you submitting the job?
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I have set spark.executor.memory to 2048m, and in the UI "Environment"
>>> page, I can see this value has been set correctly. But in the "Executors"
>>> page, I see there is only one executor and its memory is 265.4MB. That is
>>> a very strange value. Why not 256MB, or simply what I set?
>>>
>>> What am I missing here?
>>>
>>>
>>> Thanks,
>>> David
>>>
>>>
>>

Re: How to set Spark executor memory?

Posted by Xi Shen <da...@gmail.com>.
I set it in code, not by configuration. I submit my jar file in local mode.
I am working in my development environment.

On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com> wrote:

> How are you setting it? And how are you submitting the job?
>
> Thanks
> Best Regards
>
> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com> wrote:
>
>> Hi,
>>
>> I have set spark.executor.memory to 2048m, and in the UI "Environment"
>> page, I can see this value has been set correctly. But in the "Executors"
>> page, I see there is only one executor and its memory is 265.4MB. That is
>> a very strange value. Why not 256MB, or simply what I set?
>>
>> What am I missing here?
>>
>>
>> Thanks,
>> David
>>
>>
>

Re: How to set Spark executor memory?

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
How are you setting it? And how are you submitting the job?

Thanks
Best Regards

On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <da...@gmail.com> wrote:

> Hi,
>
> I have set spark.executor.memory to 2048m, and in the UI "Environment"
> page, I can see this value has been set correctly. But in the "Executors"
> page, I see there is only one executor and its memory is 265.4MB. That is
> a very strange value. Why not 256MB, or simply what I set?
>
> What am I missing here?
>
>
> Thanks,
> David
>
>