Posted to user@spark.apache.org by Paul Wais <pw...@yelp.com> on 2014/11/03 05:40:35 UTC

Do Spark executors restrict native heap vs JVM heap?

Thanks Sean! My novice understanding is that the 'native heap' is the
address space not allocated to the JVM heap, but I wanted to check to see
if I'm missing something.  It turned out my issue was actual memory
pressure on the executor machine: there was space for the JVM heap but
not much more.
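
In case it helps anyone else hitting this, here is a rough sketch of the
kind of check that makes the problem visible (illustrative only, not our
actual code; the object name and the native call are made up). It logs
JVM heap headroom next to OS-level free memory from inside a task:

import java.lang.management.ManagementFactory

object MemDiag {
  def report(): String = {
    val rt = Runtime.getRuntime
    // HotSpot-specific bean that exposes machine-level memory counters.
    val osBean = ManagementFactory.getOperatingSystemMXBean
      .asInstanceOf[com.sun.management.OperatingSystemMXBean]
    val mb = 1024L * 1024L
    val jvmFree = rt.freeMemory() / mb                    // free space in the current JVM heap
    val jvmMax  = rt.maxMemory() / mb                     // the -Xmx ceiling
    val osFree  = osBean.getFreePhysicalMemorySize / mb   // free physical RAM on the machine
    s"jvmFree=${jvmFree}MB jvmMax=${jvmMax}MB osFreePhys=${osFree}MB"
  }
}

// e.g. rdd.map { x => println(MemDiag.report()); callNativeCode(x) }
// (callNativeCode is a stand-in for whatever JNI entry point you use)

When jvmFree is large but osFree is nearly zero, the JVM heap isn't the
problem; the machine is.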

On Thu, Oct 30, 2014 at 12:49 PM, Sean Owen <sowen@cloudera.com> wrote:
> No, but the JVM also does not allocate memory for native code on the
> heap. I don't think the heap has any bearing on whether your native
> code can allocate more memory, except that of course the heap is also
> taking memory.
>
> On Oct 30, 2014 6:43 PM, "Paul Wais" <pwais@yelp.com> wrote:
>>
>> Dear Spark List,
>>
>> I have a Spark app that runs native code inside map functions.  I've
>> noticed that the native code sometimes sets errno to ENOMEM indicating
>> a lack of available memory.  However, I've verified that the /JVM/ has
>> plenty of heap space available-- Runtime.getRuntime().freeMemory()
>> shows gigabytes free and the native code needs only megabytes.  Does
>> Spark limit the /native/ heap size somehow?  I'm poking through the
>> executor code now but don't see anything obvious.
>>
>> Best Regards,
>> -Paul Wais
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>

Re: Do Spark executors restrict native heap vs JVM heap?

Posted by Sean Owen <so...@cloudera.com>.
Yes, that's correct to my understanding and the probable explanation of
your issue. There are no additional limits or differences from how the JVM
works here.
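
The practical upshot is just to size the executor heap so the machine
keeps headroom for your native allocations. A rough sketch, assuming
Spark on YARN (the second property is YARN-specific and the numbers are
only illustrative):

import org.apache.spark.SparkConf

// Keep the executor JVM heap (-Xmx) well below the machine/container
// memory so native code called from tasks can allocate outside the heap.
val conf = new SparkConf()
  .setAppName("native-code-app")                      // illustrative name
  .set("spark.executor.memory", "4g")                 // executor JVM heap
  .set("spark.yarn.executor.memoryOverhead", "2048")  // extra MB per executor for native/off-heap use

On a standalone cluster the same idea applies: just don't let
spark.executor.memory claim essentially all of the machine's RAM.
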
On Nov 3, 2014 4:40 AM, "Paul Wais" <pw...@yelp.com> wrote:

> Thanks Sean! My novice understanding is that the 'native heap' is the
> address space not allocated to the JVM heap, but I wanted to check to see
> if I'm missing something.  It turned out my issue was actual memory
> pressure on the executor machine: there was space for the JVM heap but
> not much more.
>
> On Thu, Oct 30, 2014 at 12:49 PM, Sean Owen <so...@cloudera.com> wrote:
> > No, but the JVM also does not allocate memory for native code on the
> > heap. I don't think the heap has any bearing on whether your native
> > code can allocate more memory, except that of course the heap is also
> > taking memory.
> >
> > On Oct 30, 2014 6:43 PM, "Paul Wais" <pw...@yelp.com> wrote:
> >>
> >> Dear Spark List,
> >>
> >> I have a Spark app that runs native code inside map functions.  I've
> >> noticed that the native code sometimes sets errno to ENOMEM indicating
> >> a lack of available memory.  However, I've verified that the /JVM/ has
> >> plenty of heap space available-- Runtime.getRuntime().freeMemory()
> >> shows gigabytes free and the native code needs only megabytes.  Does
> >> Spark limit the /native/ heap size somehow?  I'm poking through the
> >> executor code now but don't see anything obvious.
> >>
> >> Best Regards,
> >> -Paul Wais
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> >> For additional commands, e-mail: user-help@spark.apache.org
> >>
> >
>
>