Posted to issues@spark.apache.org by "Luis Angel Hernández Acosta (JIRA)" <ji...@apache.org> on 2016/08/02 14:25:20 UTC

[jira] [Commented] (SPARK-16667) Spark driver/executor doesn't release unused memory

    [ https://issues.apache.org/jira/browse/SPARK-16667?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15404070#comment-15404070 ] 

Luis Angel Hernández Acosta commented on SPARK-16667:
-----------------------------------------------------

The problem is that the memory is never freed, and eventually my cluster collapsed.
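
For context, here is a minimal sketch of the explicit cleanup I already try between calculations (the edge-list path, the loop bounds, and the PageRank step are hypothetical placeholders; it assumes the Spark 1.6 / GraphX APIs):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.GraphLoader

object CleanupSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("graphx-cleanup"))
    for (i <- 1 to 10) {
      // Hypothetical input; each pass stands in for one of my calculations.
      val graph = GraphLoader.edgeListFile(sc, "hdfs:///data/edges.txt").cache()
      val ranks = graph.pageRank(0.001).vertices
      ranks.count() // force the computation

      // Drop everything cached by this pass so the blocks can be released.
      ranks.unpersist(blocking = true)
      graph.unpersistVertices(blocking = true)
      graph.edges.unpersist(blocking = true)
      System.gc() // nudge the driver JVM; heap still grows regardless
    }
    sc.stop()
  }
}

Even with the blocking unpersist calls and the explicit GC, the driver heap keeps the extra 50-100 MB after every pass.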

> Spark driver/executor doesn't release unused memory
> ---------------------------------------------------
>
>                 Key: SPARK-16667
>                 URL: https://issues.apache.org/jira/browse/SPARK-16667
>             Project: Spark
>          Issue Type: Bug
>          Components: GraphX, Spark Core
>    Affects Versions: 1.6.0
>         Environment: Ubuntu Wily, 64-bit
> Java 1.8
> 3 slave (4 GB) and 1 master (2 GB) virtual machines in VMware on a 4th-generation Core i7 with 16 GB of RAM
>            Reporter: Luis Angel Hernández Acosta
>
> I'm running a Spark app in a standalone cluster. My app creates a SparkContext and performs many calculations with GraphX over time. For each calculation, the app creates a new Java thread and waits for its ending signal. Between any two calculations, memory grows by 50-100 MB. I use a thread to be sure that every object created for a calculation is destroyed after the calculation ends, but memory keeps growing. I tried stopping the SparkContext: all the executor memory allocated by the app is freed, but my driver's memory keeps growing by the same 50-100 MB.
> My graph calculation includes HDFS serialization of RDDs and loading the graph from HDFS.
> Spark env:
> export SPARK_MASTER_IP=master
> export SPARK_WORKER_CORES=4
> export SPARK_WORKER_MEMORY=2919m
> export SPARK_WORKER_INSTANCES=1
> export SPARK_DAEMON_MEMORY=256m
> export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=10"
> Those are my only configuration settings.
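
For reference, a minimal sketch of the per-calculation thread pattern the description refers to (the class name, paths, and app name are hypothetical; it assumes the Spark 1.6 / GraphX APIs):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.GraphLoader

object ThreadedCalcSketch {
  // One calculation per Java thread, joined from the driver loop, so that
  // everything created for the calculation goes out of scope when it ends.
  class CalculationThread(sc: SparkContext, edgePath: String, runId: Int) extends Thread {
    override def run(): Unit = {
      val graph = GraphLoader.edgeListFile(sc, edgePath).cache()
      // HDFS serialization of the calculation's RDD (path is hypothetical).
      graph.vertices.saveAsObjectFile(s"hdfs:///tmp/vertices-$runId")
      graph.unpersistVertices(blocking = true)
      graph.edges.unpersist(blocking = true)
    }
  }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("threaded-calc"))
    for (i <- 1 to 5) {
      val t = new CalculationThread(sc, "hdfs:///data/edges.txt", i)
      t.start()
      t.join() // wait for the calculation's ending signal
      // Driver heap still grows ~50-100 MB per pass at this point.
    }
    sc.stop() // frees all executor memory, but not what the driver accumulated
  }
}

Stopping the SparkContext at the end releases the executors' memory immediately, but the driver process never gives back what it accumulated across passes.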



