Posted to users@zeppelin.apache.org by "Partridge, Lucas (GE Aviation)" <Lu...@ge.com> on 2018/02/22 10:21:49 UTC

Jar dependencies are not reloaded when Spark interpreter is restarted?

I'm using Zeppelin 0.7.3 against a local standalone Spark 'cluster'. I've added a Scala jar dependency to my Spark interpreter through Zeppelin's UI. I assumed that if I changed my Scala code and rebuilt the jar (using sbt outside of Zeppelin), all I'd have to do is restart the interpreter for the new code to be picked up in a regular Scala paragraph. However, restarting the interpreter appears to have no effect: the new code is not detected. Is that expected behaviour or a bug?
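To illustrate, a paragraph along these lines (MyTransformer and its package are made-up stand-ins for a class in my jar) still runs the old implementation after a restart:

    %spark
    // MyTransformer is a hypothetical class packaged in the jar added as
    // an interpreter dependency. After rebuilding the jar with sbt and
    // restarting the interpreter, this paragraph still sees the old code.
    import com.example.MyTransformer
    val transformed = MyTransformer.transform(sc.parallelize(1 to 10))
    transformed.collect().foreach(println)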

The workaround I'm using at the moment is to edit the Spark interpreter settings, remove the jar, re-add it, save the changes, and then restart the interpreter. Clumsy, but better than restarting Zeppelin altogether.

Also, if anyone knows of a better way to reload code without restarting the interpreter, I'm open to suggestions :). Having to re-run lots of paragraphs after a restart is pretty tedious.
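One thing I did wonder about is the dynamic dependency loader, along the lines of the sketch below (the jar path is just an illustration of a local sbt build output). As far as I can tell, though, %spark.dep only takes effect if it runs before the Spark context is initialised, so it would still need an interpreter restart first:

    %spark.dep
    // z.reset() clears previously loaded artifacts; z.load() adds the
    // rebuilt jar from a (hypothetical) local path. Caveat: this only
    // works if it runs before the Spark context has started.
    z.reset()
    z.load("/home/lucas/myproject/target/scala-2.11/myproject_2.11-0.1.jar")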

Thanks, Lucas.


Re: Jar dependencies are not reloaded when Spark interpreter is restarted?

Posted by "Partridge, Lucas (GE Aviation)" <Lu...@ge.com>.
I only change the content of the jar, not its name or version (otherwise I’d have to re-add it as a dependency anyway). Or do you mean something else by ’version’?

This dependency is a local file. Zeppelin and Spark are all running on the same machine. So I’m just specifying the file system path of the jar; it’s not even prefixed with file:///.
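For what it’s worth, if the version really is the trigger then I suppose I could bump the version in build.sbt on every rebuild. But that renames the artifact, which is exactly why I’d have to re-add the dependency each time. A hypothetical example:

    // build.sbt (illustrative values). Bumping the version renames the
    // artifact, e.g. target/scala-2.11/myproject_2.11-0.2.jar instead of
    // myproject_2.11-0.1.jar, so the interpreter's dependency path would
    // need updating to match anyway.
    name := "myproject"
    version := "0.2"   // was "0.1"
    scalaVersion := "2.11.8"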

From: Jhon Anderson Cardenas Diaz [mailto:jhonderson2007@gmail.com]
Sent: 22 February 2018 12:18
To: users@zeppelin.apache.org
Subject: EXT: Re: Jar dependencies are not reloaded when Spark interpreter is restarted?

When you say you change the dependency, do you mean only its content, or its content and version? I think the dependency should be reloaded only if its version changes.

I don’t think it would be optimal to re-download the dependencies every time the interpreter restarts.



Re: Jar dependencies are not reloaded when Spark interpreter is restarted?

Posted by Jhon Anderson Cardenas Diaz <jh...@gmail.com>.
When you say you change the dependency, do you mean only its content, or its content and version? I think the dependency should be reloaded only if its version changes.

I don't think it would be optimal to re-download the dependencies every time the interpreter restarts.
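As an illustration (the coordinates below are made up), when a dependency is given as Maven coordinates rather than a local path, it is the version segment that tells Zeppelin to treat it as a new artifact:

    com.example:myproject_2.11:0.1   <- resolved once, then reused from the local cache
    com.example:myproject_2.11:0.2   <- new version, fetched again on restart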
