Posted to dev@spark.apache.org by vonnagy <iv...@vadio.com> on 2016/10/12 20:33:09 UTC

Memory leak warnings in Spark 2.0.1

I am getting excessive memory leak warnings when running multiple mappings and
aggregations on Datasets. Is there anything I should be looking for to resolve
this, or is this a known issue?

WARN  [Executor task launch worker-0]
org.apache.spark.memory.TaskMemoryManager - leak 16.3 MB memory from
org.apache.spark.unsafe.map.BytesToBytesMap@33fb6a15
WARN  [Executor task launch worker-0]
org.apache.spark.memory.TaskMemoryManager - leak a page:
org.apache.spark.unsafe.memory.MemoryBlock@29e74a69 in task 88341
WARN  [Executor task launch worker-0]
org.apache.spark.memory.TaskMemoryManager - leak a page:
org.apache.spark.unsafe.memory.MemoryBlock@22316bec in task 88341
WARN  [Executor task launch worker-0] org.apache.spark.executor.Executor -
Managed memory leak detected; size = 17039360 bytes, TID = 88341

Thanks,

Ivan
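[Editor's note: a minimal, hypothetical sketch (not Ivan's actual job) of the kind of Dataset pipeline commonly reported to produce these warnings. When an action such as take() stops consuming an aggregation's output before the task has drained its iterator, the pages backing the BytesToBytesMap are force-freed at task end, and TaskMemoryManager reports that cleanup as a "leak". Names and numbers below are illustrative only.]

```scala
import org.apache.spark.sql.SparkSession

object LeakWarningSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("leak-warning-sketch")
      .master("local[2]")
      .getOrCreate()
    import spark.implicits._

    // Map + aggregate, then take() only a few rows. take() may stop
    // consuming the aggregation iterator early; the unreleased pages are
    // then reclaimed at task end and logged as a managed memory leak.
    val ds  = spark.range(0, 1000000L).map(i => (i % 1000, i))
    val agg = ds.groupByKey(_._1).count()
    agg.take(5).foreach(println)

    spark.stop()
  }
}
```

If the warnings appear only alongside take()/limit()-style early termination and executor memory stays stable across jobs, they are typically benign noise rather than a true leak.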



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Memory-leak-warnings-in-Spark-2-0-1-tp19424.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Memory leak warnings in Spark 2.0.1

Posted by Nicholas Chammas <ni...@gmail.com>.
👍  Thanks for the reference and PR.

On Wed, Nov 23, 2016 at 2:59 AM Reynold Xin <rx...@databricks.com> wrote:

> See https://issues.apache.org/jira/browse/SPARK-18557

Re: Memory leak warnings in Spark 2.0.1

Posted by Reynold Xin <rx...@databricks.com>.
See https://issues.apache.org/jira/browse/SPARK-18557
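[Editor's note: SPARK-18557 tracks toning down these messages. Until a release with that change is available, one workaround — assuming the stock log4j setup Spark ships with — is to raise the threshold for the logger that emits most of the noise in conf/log4j.properties:]

```properties
# Workaround, not a fix: silence the per-task "leak" warnings from the
# task memory manager. Note org.apache.spark.executor.Executor also logs
# a "Managed memory leak detected" line; raising that logger to ERROR
# would hide other useful warnings, so it is left at WARN here.
log4j.logger.org.apache.spark.memory.TaskMemoryManager=ERROR
```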

On Mon, Nov 21, 2016 at 1:16 PM, Nicholas Chammas <nicholas.chammas@gmail.com> wrote:

> I'm also curious about this. Is there something we can do to help
> troubleshoot these leaks and file useful bug reports?

Re: Memory leak warnings in Spark 2.0.1

Posted by Nicholas Chammas <ni...@gmail.com>.
I'm also curious about this. Is there something we can do to help
troubleshoot these leaks and file useful bug reports?

On Wed, Oct 12, 2016 at 4:33 PM vonnagy <iv...@vadio.com> wrote:

> I am getting excessive memory leak warnings when running multiple mappings
> and aggregations on Datasets. Is there anything I should be looking for to
> resolve this, or is this a known issue?