Posted to user@spark.apache.org by Walid LEZZAR <wa...@gmail.com> on 2016/02/21 01:37:01 UTC

Constantly increasing Spark streaming heap memory

Hi,

I'm running a Spark Streaming job that pulls data from Kafka (using the
direct approach, without a receiver) and pushes it into Elasticsearch.
The job runs fine, but I was surprised when I opened jconsole to monitor
it: the heap memory climbs steadily until the GC triggers, drops, and
then starts climbing again, over and over.
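For reference, the job is essentially shaped like this (a minimal sketch
rather than my actual code; the broker address, topic name, and index
are placeholders):

    // Spark 1.x-era sketch: Kafka direct stream -> Elasticsearch via the
    // elasticsearch-hadoop connector. All endpoints below are placeholders.
    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils
    import org.elasticsearch.spark.rdd.EsSpark

    object KafkaToEs {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("kafka-to-es")
          .set("es.nodes", "localhost:9200") // Elasticsearch endpoint (placeholder)
        val ssc = new StreamingContext(conf, Seconds(10))

        val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
        // Direct approach: no receiver; Spark tracks the Kafka offsets itself.
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, Set("events"))

        stream.foreachRDD { rdd =>
          // Record values are assumed to already be JSON documents.
          EsSpark.saveJsonToEs(rdd.map(_._2), "events/doc")
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }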

I tried using a profiler to understand what is happening in the heap.
All I found was a byte[] allocation that keeps growing, but no more
detail than that.

Is there an explanation for this? Is this behaviour inherent to Spark
Streaming jobs?

Thanks for your help.

Re: Constantly increasing Spark streaming heap memory

Posted by Robin East <ro...@xense.co.uk>.
Hi

What you describe looks like normal behaviour for almost any Java/Scala application: objects are allocated on the heap until a threshold is reached, then the garbage collector reclaims the memory of objects that are no longer referenced, and the cycle repeats. That is the sawtooth pattern you see in jconsole. Is there a concrete problem you are experiencing?
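You can reproduce exactly the same heap graph with a trivial allocation loop, nothing Spark-specific. A toy sketch:

    // Toy allocation loop: short-lived byte arrays fill the young generation
    // until a minor GC clears them, producing the sawtooth heap graph
    // jconsole shows for virtually any busy JVM.
    object Sawtooth {
      def main(args: Array[String]): Unit = {
        while (true) {
          val buf = new Array[Byte](64 * 1024) // transient garbage
          buf(0) = 1                           // write to it so the allocation is used
          Thread.sleep(1)
        }
      }
    }

Watching the process with jstat -gcutil <pid> 1000 (or jconsole) shows eden filling up and emptying at each minor GC. The growing byte[] your profiler reports is most likely transient buffers of this kind (serialized records, network buffers) waiting to be collected. As long as old-generation usage stays flat after full GCs, there is no leak.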

> On 21 Feb 2016, at 00:37, Walid LEZZAR <wa...@gmail.com> wrote:
> 
> Hi,
> 
> I'm running a Spark Streaming job that pulls data from Kafka (using the direct approach, without a receiver) and pushes it into Elasticsearch. The job runs fine, but I was surprised when I opened jconsole to monitor it: the heap memory climbs steadily until the GC triggers, drops, and then starts climbing again, over and over.
> 
> I tried using a profiler to understand what is happening in the heap. All I found was a byte[] allocation that keeps growing, but no more detail than that.
> 
> Is there an explanation for this? Is this behaviour inherent to Spark Streaming jobs?
> 
> Thanks for your help.


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org