Posted to dev@zeppelin.apache.org by Sachin Janani <sj...@snappydata.io> on 2016/07/25 13:48:31 UTC

Garbage collection in Spark interpreter

Hi All,
I was looking into the Spark interpreter code to understand how it works
and found that it uses the Spark REPL for executing Scala code. As per my
understanding, the Spark REPL will not GC unless we restart the
interpreter. So my questions are:
1) If we are using the Spark interpreter, will the variables that we create
ever be destroyed without restarting the interpreter?
2) Has anyone faced any issue related to GC in the interpreter process before?


Thanks and Regards,
Sachin J

Re: Garbage collection in Spark interpreter

Posted by Sachin Janani <sj...@snappydata.io>.
Thanks, Moon. So this is an issue with the Scala interpreter
itself.

Regards,
-Sachin
On Jul 30, 2016 04:48, "moon soo Lee" <mo...@apache.org> wrote:

> Hi,
>
> Here's related issue.
> https://issues.scala-lang.org/browse/SI-4331
>
> Restarting the interpreter is the only way to release memory.
> It looks like some memory is retained for the variables and generated
> classes, even if you manually release the reference to the object,
> like
>
> var myObject = new MyObject
> // ... use myObject ...
> myObject = null
>
> Thanks,
> moon
>
>
> On Mon, Jul 25, 2016 at 10:48 PM Sachin Janani <sj...@snappydata.io>
> wrote:
>
> > Hi All,
> > I was looking into the Spark interpreter code to understand how it works
> > and found that it uses the Spark REPL for executing Scala code. As per my
> > understanding, the Spark REPL will not GC unless we restart the
> > interpreter. So my questions are:
> > 1) If we are using the Spark interpreter, will the variables that we
> > create ever be destroyed without restarting the interpreter?
> > 2) Has anyone faced any issue related to GC in the interpreter process
> > before?
> >
> >
> > Thanks and Regards,
> > Sachin J
> >
>

Re: Garbage collection in Spark interpreter

Posted by moon soo Lee <mo...@apache.org>.
Hi,

Here's related issue.
https://issues.scala-lang.org/browse/SI-4331

Restarting the interpreter is the only way to release memory.
It looks like some memory is retained for the variables and generated
classes, even if you manually release the reference to the object,
like

var myObject = new MyObject
// ... use myObject ...
myObject = null
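
For comparison, here is a minimal, self-contained sketch of the same nulling pattern run as a plain JVM program (the MyObject class and its size here are hypothetical, chosen only for illustration). Outside the REPL, dropping the last strong reference does let the garbage collector reclaim the object, which a WeakReference can observe; inside the Scala REPL, the wrapper object generated for each input line keeps an extra reference to it (see SI-4331 above), so the same pattern does not release the memory.

```scala
import java.lang.ref.WeakReference

object ReplGcSketch {
  // Hypothetical stand-in for a large user object (~4 MB).
  class MyObject { val payload = new Array[Byte](4 * 1024 * 1024) }

  // Returns true once the object has been collected after the only
  // strong reference to it is dropped.
  def collectableAfterNulling(): Boolean = {
    var myObject = new MyObject
    val ref = new WeakReference[MyObject](myObject)
    myObject = null // drop the only strong reference
    var tries = 0
    while (ref.get != null && tries < 20) {
      // Request a collection; a weak reference is cleared once the
      // referent is unreachable. Retry a few times since System.gc()
      // is only a hint to the JVM.
      System.gc()
      Thread.sleep(50)
      tries += 1
    }
    ref.get == null
  }

  def main(args: Array[String]): Unit =
    // In a plain program this should print true. In the Scala REPL the
    // generated line wrapper still references the object, so memory is
    // not freed this way.
    println(collectableAfterNulling())
}
```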

Thanks,
moon


On Mon, Jul 25, 2016 at 10:48 PM Sachin Janani <sj...@snappydata.io>
wrote:

> Hi All,
> I was looking into the Spark interpreter code to understand how it works
> and found that it uses the Spark REPL for executing Scala code. As per my
> understanding, the Spark REPL will not GC unless we restart the
> interpreter. So my questions are:
> 1) If we are using the Spark interpreter, will the variables that we create
> ever be destroyed without restarting the interpreter?
> 2) Has anyone faced any issue related to GC in the interpreter process
> before?
>
>
> Thanks and Regards,
> Sachin J
>