Posted to dev@zeppelin.apache.org by Yohana Khoury <yo...@gigaspaces.com> on 2018/05/27 14:21:46 UTC

%dep in Zeppelin 0.8.0

Hi,

I am trying to run the following on Zeppelin 0.8.0-RC2, Spark 2.3:

%dep
z.load("/tmp/product.jar")

and then

%spark
import model.v1._

The result is:

<console>:25: error: not found: value model
       import model.v1._
              ^
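(For reference, the dep interpreter's documented pattern also includes z.reset() — which, like z.load(), must run before the Spark interpreter starts — and accepts Maven coordinates as well as local paths; the coordinate below is purely illustrative:)

```
%dep
z.reset()                          // clear previously loaded artifacts
z.load("/tmp/product.jar")         // local jar path
// Maven coordinates also work, e.g.:
// z.load("groupId:artifactId:version")
```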


It appears that Zeppelin does not load the jar file into the Spark
interpreter.
Has this option been dropped from Zeppelin 0.8.0? Is it going to be fixed?

It worked with Zeppelin 0.7.2.


Thanks!

Re: %dep in Zeppelin 0.8.0

Posted by Sanjay Dasgupta <sa...@gmail.com>.
Thanks Jeff.

No, I was not suggesting any change.

Just needed to understand some unusual behaviors.

Thanks for your help.
Sanjay

Re: %dep in Zeppelin 0.8.0

Posted by Jeff Zhang <zj...@gmail.com>.
%spark.dep is only for the Spark interpreter. We could introduce the same
mechanism for the JDBC interpreter, but I don't see much benefit compared to
setting the driver on the interpreter setting page. The JDBC interpreter
differs from the Spark interpreter in that its driver jar is not ad hoc: if a
user wants to use the JDBC interpreter to connect to different databases, he
has to create a custom setting in the interpreter configuration for each
specific database. It is also recommended to create a separate interpreter
for each database; see this doc:
https://zeppelin.apache.org/docs/0.8.0-SNAPSHOT/interpreter/jdbc.html#create-a-new-jdbc-interpreter
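As a sketch, a per-database JDBC interpreter setting (here a hypothetical MySQL one) carries the connection properties plus the driver as an artifact; the property names follow the Zeppelin JDBC interpreter docs, and all the values are illustrative:

```
default.driver      com.mysql.jdbc.Driver
default.url         jdbc:mysql://localhost:3306/
default.user        mysql_user
default.password    ********

Dependencies (artifact) section at the bottom of the setting page:
mysql:mysql-connector-java:5.1.38
```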



Re: %dep in Zeppelin 0.8.0

Posted by Sanjay Dasgupta <sa...@gmail.com>.
Hi Jeff,

Apologies for raising a somewhat unrelated issue:

Should the JDBC interpreter also be able to find a driver if the JAR file
containing the driver is loaded using the dep interpreter (instead of
defining it as an "artifact" at the bottom of the interpreter configuration
page)?

Thanks,
Sanjay

Re: %dep in Zeppelin 0.8.0

Posted by Jeff Zhang <zj...@gmail.com>.
And this seems to be a bug in the DepInterpreter. I have created a ticket:
https://issues.apache.org/jira/browse/ZEPPELIN-3506



Re: %dep in Zeppelin 0.8.0

Posted by Jeff Zhang <zj...@gmail.com>.
You can use %spark.conf to add custom jars to the Spark interpreter.
%spark.conf is more powerful in that it lets you customize not only jars
but also other Spark configuration properties.

e.g.

%spark.conf

spark.jars /tmp/product.jar
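
As a sketch, a %spark.conf paragraph can set several properties at once (it must run before the Spark interpreter starts; the property names come from the standard Spark configuration, and the Maven coordinate and memory value are illustrative):

```
%spark.conf

spark.jars /tmp/product.jar
spark.jars.packages org.apache.commons:commons-math3:3.6.1
spark.driver.memory 2g
```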


See the Generic ConfInterpreter section of this article:
https://medium.com/@zjffdu/zeppelin-0-8-0-new-features-ea53e8810235
