Posted to user@spark.apache.org by msumbul <mi...@gmail.com> on 2020/03/20 14:40:49 UTC

Exact meaning of spark.memory.storageFraction in spark 2.3.x

Hello,

I'm asking myself about the exact meaning of the
spark.memory.storageFraction setting.
The documentation mentions:

"Amount of storage memory immune to eviction, expressed as a fraction of the
size of the region set aside by spark.memory.fraction. The higher this is,
the less working memory may be available to execution and tasks may spill to
disk more often"

Does that mean that if there is no caching, that part of the memory will not
be used at all?
In the Spark UI, in the "Executors" tab, I can see that the "Storage Memory"
is always zero. Does that mean that that part of the memory is never used at
all (so that I can reduce it), or just never used for storage specifically?
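
For concreteness, here is roughly how those two fractions compose under the
unified memory manager (a worked Scala sketch; the 4 GiB heap is an assumed
example, the other numbers are the 2.3.x defaults):

    // Worked sketch (illustrative numbers): sizing the unified memory region.
    val heap            = 4L * 1024 * 1024 * 1024   // assumed executor heap: 4 GiB
    val reserved        = 300L * 1024 * 1024        // fixed reserved memory: 300 MiB
    val memoryFraction  = 0.6                       // spark.memory.fraction (default)
    val storageFraction = 0.5                       // spark.memory.storageFraction (default)

    // The unified pool is shared by execution and storage; storageFraction
    // only marks the share of it that cached blocks hold immune to eviction.
    val unified = ((heap - reserved) * memoryFraction).toLong
    val storage = (unified * storageFraction).toLong
    println(f"unified: ${unified / 1e9}%.2f GB, storage floor: ${storage / 1e9}%.2f GB")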

Thanks in advance for your help,
Michel





Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x

Posted by Michel Sumbul <mi...@gmail.com>.
Hi Iacovos,

Thanks for the reply, it's super clear.
Do you know if there is a way to know the max memory usage?
In the Spark 2.3.x UI, the "peak memory usage" metric is always at zero.
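
If it helps, the monitoring REST API served by the driver exposes per-executor
memory figures outside the UI. A minimal Scala sketch (the host, port, and
application id below are placeholders, not real values):

    // Minimal sketch: read per-executor memory figures from the driver's
    // monitoring REST API (default UI port 4040).
    import scala.io.Source

    val appId = "app-20200320144000-0000" // hypothetical application id
    val url   = s"http://localhost:4040/api/v1/applications/$appId/executors"
    // Each executor entry carries "memoryUsed" and "maxMemory" (storage memory
    // currently used vs. total available for storage); parse the JSON as needed.
    println(Source.fromURL(url).mkString)

Note that this reports current usage rather than a true peak, so you would
have to sample it over the life of the job.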

Thanks,
Michel


On Fri, 20 Mar 2020 at 14:56, Jack Kolokasis <ko...@ics.forth.gr> wrote:

> This is just a counter that shows you the size of cached RDDs. If it is
> zero, it means that no caching has occurred. Also, even if storage memory
> is used for computing, the counter will show zero.
>
> Iacovos

Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x

Posted by Jack Kolokasis <ko...@ics.forth.gr>.
This is just a counter that shows you the size of cached RDDs. If it is
zero, it means that no caching has occurred. Also, even if storage memory is
used for computing, the counter will show zero.
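
For instance, the counter only moves once something is actually cached and
materialized. A quick illustration for spark-shell, where a SparkSession
named "spark" is predefined:

    // Illustration: make the "Storage Memory" column non-zero.
    val df = spark.range(10000000L)
    df.cache()   // mark the dataset for storage memory
    df.count()   // an action materializes the cache; the Executors tab
                 // now shows non-zero storage memory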

Iacovos

On 20/3/20 4:51 p.m., Michel Sumbul wrote:
> Hi,
>
> Thanks for the very quick reply!
> If I see the metric "storage memory" always at 0, does that mean
> that the memory is neither used for caching nor for computing?
>
> Thanks,
> Michel

Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x

Posted by Michel Sumbul <mi...@gmail.com>.
Hi,

Thanks for the very quick reply!
If I see the metric "storage memory" always at 0, does that mean that the
memory is neither used for caching nor for computing?

Thanks,
Michel


On Fri, 20 Mar 2020 at 14:45, Jack Kolokasis <ko...@ics.forth.gr> wrote:

> Hello Michel,
>
> Spark separates executor memory using an adaptive boundary between
> storage and execution memory. If there is no caching and execution
> memory needs more space, then it will use a portion of the storage memory.
>
> If your program does not use caching, then you can reduce storage memory.
>
> Iacovos

Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x

Posted by Jack Kolokasis <ko...@ics.forth.gr>.
Hello Michel,

Spark separates executor memory using an adaptive boundary between
storage and execution memory. If there is no caching and execution
memory needs more space, then it will use a portion of the storage memory.

If your program does not use caching, then you can reduce storage memory.
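
For example, a hypothetical configuration sketch (the app name and the 0.1
value are only illustrative):

    // Sketch: shrink the eviction-immune storage share for a job that
    // never calls cache()/persist().
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("no-cache-job")                       // hypothetical app name
      .config("spark.memory.storageFraction", "0.1") // lower the storage floor
      .getOrCreate()

Since the boundary is adaptive, execution can already borrow the unused
storage share; lowering the fraction mainly shrinks the region that cached
blocks would hold immune to eviction.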

Iacovos

On 20/3/20 4:40 p.m., msumbul wrote:
> Hello,
>
> I'm asking myself about the exact meaning of the
> spark.memory.storageFraction setting.
> The documentation mentions:
>
> "Amount of storage memory immune to eviction, expressed as a fraction of the
> size of the region set aside by spark.memory.fraction. The higher this is,
> the less working memory may be available to execution and tasks may spill to
> disk more often"
>
> Does that mean that if there is no caching, that part of the memory will
> not be used at all?
> In the Spark UI, in the "Executors" tab, I can see that the "Storage
> Memory" is always zero. Does that mean that that part of the memory is
> never used at all (so that I can reduce it), or just never used for
> storage specifically?
>
> Thanks in advance for your help,
> Michel
