Posted to user@spark.apache.org by vonnagy <iv...@vadio.com> on 2016/10/12 20:32:47 UTC

Memory leak warnings in Spark 2.0.1

I am getting excessive memory leak warnings when running multiple mapping and
aggregation operations on Datasets. Is there anything I should be looking for
to resolve this, or is this a known issue?

WARN  [Executor task launch worker-0]
org.apache.spark.memory.TaskMemoryManager - leak 16.3 MB memory from
org.apache.spark.unsafe.map.BytesToBytesMap@33fb6a15
WARN  [Executor task launch worker-0]
org.apache.spark.memory.TaskMemoryManager - leak a page:
org.apache.spark.unsafe.memory.MemoryBlock@29e74a69 in task 88341
WARN  [Executor task launch worker-0]
org.apache.spark.memory.TaskMemoryManager - leak a page:
org.apache.spark.unsafe.memory.MemoryBlock@22316bec in task 88341
WARN  [Executor task launch worker-0] org.apache.spark.executor.Executor -
Managed memory leak detected; size = 17039360 bytes, TID = 88341
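
For context, a minimal sketch of the kind of Dataset job that was running
when the warnings appeared (the Event class, field names, and sample data
here are illustrative, not the actual pipeline):

import org.apache.spark.sql.{Dataset, SparkSession}

// Illustrative only: the real job is larger, but it chains map and
// groupByKey/mapGroups steps over a Dataset along these lines.
case class Event(user: String, bytes: Long)

val spark = SparkSession.builder().appName("LeakWarningRepro").getOrCreate()
import spark.implicits._

val events: Dataset[Event] =
  Seq(Event("a", 10L), Event("b", 20L), Event("a", 30L)).toDS()

val totals = events
  .map(e => (e.user, e.bytes * 2))                       // mapping step
  .groupByKey(_._1)                                      // aggregation step
  .mapGroups((user, rows) => (user, rows.map(_._2).sum))

totals.show()  // the TaskMemoryManager warnings show up in the executor logs

(For what it's worth, the 17039360 bytes in the Executor line is 16.25 MB,
i.e. the same ~16.3 MB that TaskMemoryManager reports for the
BytesToBytesMap.)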

Thanks,

Ivan


