Posted to common-issues@hadoop.apache.org by "Akira AJISAKA (JIRA)" <ji...@apache.org> on 2015/12/01 03:14:11 UTC

[jira] [Commented] (HADOOP-11364) [Java 8] Over usage of virtual memory

    [ https://issues.apache.org/jira/browse/HADOOP-11364?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15032936#comment-15032936 ] 

Akira AJISAKA commented on HADOOP-11364:
----------------------------------------

This issue is specific to Java 8. If you are using Java 7 and seeing this message, the memory allocated to the task is simply too small. I think you should increase the memory allocated to the container to fix this issue.
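As a sketch of the suggested fix for a MapReduce job, the container allocation can be raised via the standard MRv2 properties in mapred-site.xml (the values below are illustrative examples, not recommendations; reduce tasks have analogous `mapreduce.reduce.*` keys):

```xml
<!-- mapred-site.xml: example values only -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>2048</value> <!-- physical memory requested for each map container -->
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx1638m</value> <!-- JVM heap; keep it comfortably below the container size -->
</property>
```

The heap is kept below the container size because the JVM also needs non-heap memory (Metaspace, thread stacks, code cache) inside the same container limit.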

> [Java 8] Over usage of virtual memory
> -------------------------------------
>
>                 Key: HADOOP-11364
>                 URL: https://issues.apache.org/jira/browse/HADOOP-11364
>             Project: Hadoop Common
>          Issue Type: Bug
>            Reporter: Mohammad Kamrul Islam
>            Assignee: Mohammad Kamrul Islam
>
> In our Hadoop 2 + Java 8 effort, we found that a few jobs are being killed by Hadoop due to excessive virtual memory allocation, even though their physical memory usage is low.
> The most common error message is "Container [pid=??,containerID=container_??] is running beyond virtual memory limits. Current usage: 365.1 MB of 1 GB physical memory used; 3.2 GB of 2.1 GB virtual memory used. Killing container."
> We see this problem in MR jobs as well as in the Spark driver/executor.
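The "2.1 GB" ceiling in the quoted error is the default virtual-to-physical ratio applied by the NodeManager (`yarn.nodemanager.vmem-pmem-ratio`, default 2.1, times the 1 GB container). Because Java 8 reserves a large virtual address space up front (Metaspace, code cache, thread stacks) without actually committing it, common workarounds are to raise that ratio or to disable the virtual memory check. A sketch in yarn-site.xml, with illustrative values:

```xml
<!-- yarn-site.xml: illustrative workarounds, not official recommendations -->
<property>
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>4.0</value> <!-- default is 2.1; raise to tolerate Java 8's larger virtual footprint -->
</property>
<!-- Or disable the virtual memory check entirely: -->
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
```

Disabling the check trades safety for convenience: containers are then limited only by physical memory usage.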



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)