Posted to user@ambari.apache.org by Darpan R <da...@gmail.com> on 2015/12/01 16:19:30 UTC

Ambari Server going Out Of Memory?

Hi all,

Ambari was working fine in production for the last 30 days, and today I see
that it is down. While checking the logs and trying to restart the Ambari
Server, I see the following error.

Java HotSpot(TM) 64-Bit Server VM warning: INFO:
os::commit_memory(0x00000000a0000000, 402653184, 0) failed; error='Cannot
allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 402653184 bytes for
committing reserved memory.
# An error report file with more information is saved as:
# //hs_err_pid31856.log

On opening hs_err_pid31856.log, I see that it says out of memory.
I am attaching the log file herewith for your information.

Thanks,
DP
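For readers hitting the same error: the 402653184 bytes in the mmap failure is the size of the single chunk the JVM failed to commit, not the total heap. A minimal sanity check, assuming a Linux host, is to convert that number and compare it against what the machine can still hand out:

```shell
# The JVM failed to commit 402653184 bytes; that is 384 MB, not the full heap
echo $((402653184 / 1024 / 1024))   # prints 384

# Compare against what the host can still provide
free -m                       # free/available physical memory and swap, in MB
grep -i commit /proc/meminfo  # CommitLimit vs Committed_AS
```

If `free` shows almost nothing available, the problem is on the host, not in Ambari itself.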

Re: Ambari Server going Out Of Memory?

Posted by Darpan Patel <da...@gmail.com>.
Hi Myroslav,

You were correct. Out of 14 GB of physical memory, only around 150 MB was
available. I manually killed a few processes, but when I run the command to
check the top memory consumers, I see many hung processes related to Hive.
I am not sure how so much memory got eaten up. Any guidelines on killing
these processes?
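One way to answer that question: list processes by memory use, narrow down to the Hive-related ones, and terminate them gently before resorting to SIGKILL. A sketch, assuming a Linux host with GNU procps; the `hive` pattern is an assumption and should be adjusted to the actual process names:

```shell
# Top memory consumers by resident set size (RSS, in KB)
ps -eo pid,rss,etime,comm --sort=-rss | head -n 15

# Hive-related processes only (the 'hive' pattern is an assumption)
pgrep -af hive || true

# For a hung process, send a graceful SIGTERM first; SIGKILL only as a last resort:
#   kill <pid>
#   kill -9 <pid>
```

SIGTERM gives the process a chance to flush state and release locks; SIGKILL does not, so it should be reserved for processes that ignore SIGTERM.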


On 1 December 2015 at 15:27, Myroslav Papyrkovskyy <
mpapyrkovskyy@hortonworks.com> wrote:

> Hello. This is a JVM error, not an Ambari one.
> It basically says that there is not enough memory on the host to handle a
> heap of the allowed size (2 GB by default).
> In your case the JVM tries to allocate ~400 MB and is unable to do so.
> Some other application may have consumed all available memory.
>
> Regards,
> Myroslav
>
> > On 1 Dec 2015, at 17:19, Darpan R <da...@gmail.com> wrote:
> >
> > Hi all,
> >
> > Ambari was working fine in production for the last 30 days, and today I see
> that it is down. While checking the logs and trying to restart the Ambari
> Server, I see the following error.
> >
> > Java HotSpot(TM) 64-Bit Server VM warning: INFO:
> os::commit_memory(0x00000000a0000000, 402653184, 0) failed; error='Cannot
> allocate memory' (errno=12)
> > #
> > # There is insufficient memory for the Java Runtime Environment to
> continue.
> > # Native memory allocation (mmap) failed to map 402653184 bytes for
> committing reserved memory.
> > # An error report file with more information is saved as:
> > # //hs_err_pid31856.log
> >
> > On opening hs_err_pid31856.log, I see that it says out of memory. I am
> attaching the log file herewith for your information.
> >
> > Thanks,
> > DP
> >
> >
> > <AmbariRestart.log>
>
>

Re: Ambari Server going Out Of Memory?

Posted by Myroslav Papyrkovskyy <mp...@hortonworks.com>.
Hello. This is a JVM error, not an Ambari one.
It basically says that there is not enough memory on the host to handle a heap of the allowed size (2 GB by default).
In your case the JVM tries to allocate ~400 MB and is unable to do so.
Some other application may have consumed all available memory.

Regards,
Myroslav
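To confirm this diagnosis on the host, compare the memory actually available with the heap the Ambari server is configured to use. A minimal sketch; the ambari-env.sh path is the usual location for a default package install and should be verified on your system:

```shell
# Available physical memory and swap, in MB
free -m

# Ambari server heap flags live in AMBARI_JVM_ARGS (path assumes a default install)
grep AMBARI_JVM_ARGS /var/lib/ambari-server/ambari-env.sh 2>/dev/null \
  || echo "ambari-env.sh not found at the default path"
```

If the configured -Xmx exceeds what `free` reports as available, the JVM can fail exactly as shown above even though Ambari itself is healthy.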

> On 1 Dec 2015, at 17:19, Darpan R <da...@gmail.com> wrote:
> 
> Hi all,
> 
> Ambari was working fine in production for the last 30 days, and today I see that it is down. While checking the logs and trying to restart the Ambari Server, I see the following error.
> 
> Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000a0000000, 402653184, 0) failed; error='Cannot allocate memory' (errno=12)
> #
> # There is insufficient memory for the Java Runtime Environment to continue.
> # Native memory allocation (mmap) failed to map 402653184 bytes for committing reserved memory.
> # An error report file with more information is saved as:
> # //hs_err_pid31856.log
> 
> On opening hs_err_pid31856.log, I see that it says out of memory. I am attaching the log file herewith for your information.
> 
> Thanks,
> DP
> 
> 
> <AmbariRestart.log>