Posted to issues@spark.apache.org by "Deenbandhu Agarwal (JIRA)" <ji...@apache.org> on 2017/02/17 06:00:46 UTC

[jira] [Created] (SPARK-19644) Memory leak in Spark Streaming

Deenbandhu Agarwal created SPARK-19644:
------------------------------------------

             Summary: Memory leak in Spark Streaming
                 Key: SPARK-19644
                 URL: https://issues.apache.org/jira/browse/SPARK-19644
             Project: Spark
          Issue Type: Bug
          Components: Structured Streaming
    Affects Versions: 2.0.1
         Environment: 3 AWS EC2 c3.xLarge
Number of cores - 3
Number of executors - 3
Memory per executor - 2GB
            Reporter: Deenbandhu Agarwal
            Priority: Critical


I am using Spark Streaming in production for some aggregations, fetching data from Cassandra and saving the results back to Cassandra.

I see a gradual increase in old-generation heap usage, from 1161216 bytes to 1397760 bytes, over a period of six hours.

After 50 hours of processing, the number of instances of the class scala.collection.immutable.$colon$colon had increased to 12,811,793, which is a huge number.

I think this is a clear case of a memory leak.
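For context on the symptom: scala.collection.immutable.$colon$colon is the cons-cell class backing Scala's immutable List, so a steadily growing count of these instances usually means lists are being retained somewhere across batches. A minimal, framework-free sketch of that retention pattern (this is an illustrative assumption, not the reporter's actual job code; the class name LeakSketch and the retainedBatches field are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class LeakSketch {
    // Hypothetical driver-side cache: per-batch results are appended on every
    // batch and never cleared. In Scala code the analogous structure would be
    // an immutable List, whose cons cells show up in a heap histogram as
    // scala.collection.immutable.$colon$colon instances.
    static final List<int[]> retainedBatches = new ArrayList<>();

    static void processBatch(int batchId) {
        int[] result = new int[1024];   // stand-in for per-batch aggregation output
        retainedBatches.add(result);    // retained forever -> promoted to old gen
    }

    public static void main(String[] args) {
        for (int batch = 0; batch < 10_000; batch++) {
            processBatch(batch);
        }
        // The retained collection grows linearly with the number of batches
        // processed, mirroring the steady old-generation growth reported above.
        System.out.println(retainedBatches.size());  // prints 10000
    }
}
```

If something like this is happening inside the streaming job (or inside a library it uses), the old-generation growth would continue until the executor runs out of heap, which matches the six-hour trend described.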



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
