Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/07/25 11:43:01 UTC
[jira] [Resolved] (SPARK-21528) spark failed with native memory exhausted, need immediate attention
[ https://issues.apache.org/jira/browse/SPARK-21528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-21528.
-------------------------------
Resolution: Invalid
Please direct questions like this to StackOverflow
> spark failed with native memory exhausted , need immediate attention
> --------------------------------------------------------------------
>
> Key: SPARK-21528
> URL: https://issues.apache.org/jira/browse/SPARK-21528
> Project: Spark
> Issue Type: Bug
> Components: Deploy
> Affects Versions: 1.6.1
> Environment: Red Hat Enterprise Linux Server release 6.7
> Linux hostname 2.6.32-696.1.1.el6.x86_64 #1 SMP Tue Mar 21 12:19:18 EDT 2017 x86_64 x86_64 x86_64 GNU/Linux
> Reporter: Mansr
> Attachments: hs_err_pid1004.log
>
>
> #
> # There is insufficient memory for the Java Runtime Environment to continue.
> # Native memory allocation (mmap) failed to map 7158628352 bytes for committing reserved memory.
> # Possible reasons:
> # The system is out of physical RAM or swap space
> # In 32 bit mode, the process size limit was hit
> # Possible solutions:
> # Reduce memory load on the system
> # Increase physical memory or swap space
> # Check if swap backing store is full
> # Use 64 bit Java on a 64 bit OS
> # Decrease Java heap size (-Xmx/-Xms)
> # Decrease number of Java threads
> # Decrease Java thread stack sizes (-Xss)
> # Set larger code cache with -XX:ReservedCodeCacheSize=
> # This output file may be truncated or incomplete.
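For readers hitting the same error: the byte count in the hs_err log above can be converted to a human-readable size to see roughly which JVM setting is responsible. A minimal sketch (the byte count is taken from the error text above; the interpretation is a reasonable guess, not confirmed by the log attachment):

```python
# The hs_err log reports a failed mmap of 7158628352 bytes.
# Converting to MiB/GiB shows the JVM tried to commit a single
# ~6.7 GiB region -- a size typical of the heap (-Xmx) rather
# than thread stacks or the code cache.
failed_bytes = 7158628352

mib = failed_bytes / 2**20  # mebibytes
gib = failed_bytes / 2**30  # gibibytes

print(f"{mib:.0f} MiB (~{gib:.2f} GiB)")
```

In Spark, the heap-size suggestions from the log ("Decrease Java heap size (-Xmx/-Xms)") usually translate to lowering `--driver-memory` / `--executor-memory` (i.e. `spark.driver.memory` / `spark.executor.memory`) so the requested heap fits within the machine's free RAM plus swap.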
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org